Using gsutil to Perform Operations on Buckets and Objects

45 minutes 1 Credit

GSP130

Overview

In this lab, you will use gsutil to create a bucket and perform operations on objects. gsutil is a Python application that lets you access Cloud Storage from the command line. The gsutil tool has commands such as mb and cp to perform operations. Each command has a set of options that are used to customize settings further.
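
If you want to see which options a command supports before using it, gsutil has built-in help. For example (optional, not needed to complete the lab), the following prints the documentation for the cp command, and gsutil help on its own lists every available command:

gsutil help cp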

What you'll learn to do

  • Create a bucket

  • Copy files from a local folder to a bucket

  • Synchronize the contents of the local folder with the contents of the bucket

  • Change access control permissions on objects

  • Delete a bucket

Setup

Before you click the Start Lab button

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.

This hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito or private browser window to run this lab. This prevents conflicts between your personal account and the Student account, which could cause extra charges to be incurred on your personal account.
  • Time to complete the lab---remember, once you start, you cannot pause a lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab to avoid extra charges to your account.

How to start your lab and sign in to the Google Cloud Console

  1. Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is the Lab Details panel with the following:

    • The Open Google Console button
    • Time remaining
    • The temporary credentials that you must use for this lab
    • Other information, if needed, to step through this lab
  2. Click Open Google Console. The lab spins up resources, and then opens another tab that shows the Sign in page.

    Tip: Arrange the tabs in separate windows, side-by-side.

    Note: If you see the Choose an account dialog, click Use Another Account.
  3. If necessary, copy the Username from the Lab Details panel and paste it into the Sign in dialog. Click Next.

  4. Copy the Password from the Lab Details panel and paste it into the Welcome dialog. Click Next.

    Important: You must use the credentials from the left panel. Do not use your Google Cloud Skills Boost credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.
  5. Click through the subsequent pages:

    • Accept the terms and conditions.
    • Do not add recovery options or two-factor authentication (because this is a temporary account).
    • Do not sign up for free trials.

After a few moments, the Cloud Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking the Navigation menu at the top-left.

Activate Cloud Shell

Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5 GB home directory and runs on Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources.

  1. In the Cloud Console, in the top right toolbar, click the Activate Cloud Shell button.

  2. Click Continue.

It takes a few moments to provision and connect to the environment. When you are connected, you are already authenticated, and the project is set to your PROJECT_ID. The output contains a line that declares the PROJECT_ID for this session:

Your Cloud Platform project in this session is set to YOUR_PROJECT_ID

gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.

  3. (Optional) You can list the active account name with this command:

gcloud auth list

(Output)

ACTIVE: *
ACCOUNT: student-01-xxxxxxxxxxxx@qwiklabs.net

To set the active account, run:
    $ gcloud config set account `ACCOUNT`

  4. (Optional) You can list the project ID with this command:

gcloud config list project

(Output)

[core] project = <project_ID>

(Example output)

[core] project = qwiklabs-gcp-44776a13dea667a6

Note: For full documentation of gcloud, see the gcloud command-line tool overview in the Cloud SDK documentation.

In your Cloud Shell session, execute the following command to download the sample data for this lab from a Git repository:

git clone https://github.com/GoogleCloudPlatform/training-data-analyst

Change to the blogs directory:

cd training-data-analyst/blogs
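
If you'd like to confirm the download, you can optionally list the directory contents; you should see subdirectories such as endpointslambda and ghcn, both of which are used later in this lab:

ls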

Working with buckets and objects

First, set some environment variables:

PROJECT_ID=`gcloud config get-value project`
BUCKET=${PROJECT_ID}-bucket

Create a bucket

Create a bucket with the Multi-Regional storage class:

gsutil mb -c multi_regional gs://${BUCKET}
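
Optionally, you can confirm that the bucket exists and inspect its metadata, including its default storage class and location, with the -b (bucket) and -L (long listing) flags:

gsutil ls -L -b gs://${BUCKET}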

Click Check my progress to verify the objective.

Create a bucket

Upload objects to your bucket

Run the following to copy the endpointslambda object to your bucket:

gsutil -m cp -r endpointslambda gs://${BUCKET}

Click Check my progress to verify the objective.

Upload objects to your bucket

If you have a large number of files to transfer, you might want to use the -m option, to perform a parallel (multi-threaded/multi-processing) copy for faster performance. The -r option allows gsutil to recurse through directories.
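
For comparison, copying a single object requires neither option. For example (optional, not needed for the lab), this would copy just one file from the sample data into the same folder in your bucket:

gsutil cp endpointslambda/README.md gs://${BUCKET}/endpointslambda/README.md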

List objects

To list objects in your bucket, execute the following command:

gsutil ls gs://${BUCKET}/*
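
If you also want each object's size and creation time, you can add the -l (long listing) flag, for example:

gsutil ls -l gs://${BUCKET}/endpointslambda/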

Sync changes with bucket

Use the following commands to rename and delete some files:

mv endpointslambda/Apache2_0License.txt endpointslambda/old.txt
rm endpointslambda/aeflex-endpoints/app.yaml

Now synchronize the local changes with the bucket:

gsutil -m rsync -d -r endpointslambda gs://${BUCKET}/endpointslambda

In this command, the -d option deletes files from the target if they're missing in the source (in this case, it deletes app.yaml from the bucket). The -r option runs the command recursively on directories.
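
Because -d deletes objects from the destination, it can be destructive if pointed at the wrong paths. If you want to preview what rsync would do without changing anything, the -n flag performs a dry run that only prints the planned operations, for example:

gsutil -m rsync -n -d -r endpointslambda gs://${BUCKET}/endpointslambda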

To verify that the bucket is now in sync with your local changes, list the files in the bucket again:

gsutil ls gs://${BUCKET}/*

Make objects public

To allow public read access to all of the files in your bucket, execute the following command:

gsutil -m acl set -R -a public-read gs://${BUCKET}
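
If you only wanted to make a single object public rather than every object in the bucket, you could instead modify that one object's ACL with acl ch, for example (not required for this lab, since the command above already covers it):

gsutil acl ch -u AllUsers:R gs://${BUCKET}/endpointslambda/old.txt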

Click Check my progress to verify the objective.

Make objects public

To confirm files are viewable by the public, open the following link in a new incognito or private browser window, replacing <your-bucket-name> with the full name of your bucket, not the environment variable:

http://storage.googleapis.com/<your-bucket-name>/endpointslambda/old.txt

This URL uses the Cloud Storage API link to view the object without authentication. For more information, see Accessing Public Data.
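
You can also verify public access directly from Cloud Shell, where the BUCKET variable is already set, by fetching the object anonymously with curl. If the ACL change worked, this prints the beginning of the file without any authentication:

curl -s http://storage.googleapis.com/${BUCKET}/endpointslambda/old.txt | head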

Copy with different storage class

Next, copy a file to your bucket and assign it the Nearline storage class instead of the bucket's default Multi-Regional storage class:

gsutil cp -s nearline ghcn/ghcn_on_bq.ipynb gs://${BUCKET}
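
The -s flag only affects objects as they are copied. If you ever need to change the storage class of an object that is already in a bucket, gsutil provides the rewrite command for that, for example (not needed for this lab):

gsutil rewrite -s nearline gs://${BUCKET}/endpointslambda/README.md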

Check storage classes

Run the following to check the storage classes and view other detailed information about the objects in your bucket:

gsutil ls -Lr gs://${BUCKET} | more

Press the spacebar to continue viewing the rest of the command's output.

The output shows that the ghcn_on_bq.ipynb object has NEARLINE storage class while the other objects have MULTI_REGIONAL storage class.

Output:

gs://qwiklabs-gcp-90345ac124778ed8-bucket/ghcn_on_bq.ipynb:
    Creation time:    Tue, 13 Aug 2019 20:19:27 GMT
    Update time:      Tue, 13 Aug 2019 20:19:27 GMT
    Storage class:    NEARLINE
    Content-Length:   980176
    Content-Type:     application/octet-stream
    ...
gs://qwiklabs-gcp-90345ac124778ed8-bucket/endpointslambda/:
gs://qwiklabs-gcp-90345ac124778ed8-bucket/endpointslambda/README.md:
    Creation time:    Tue, 13 Aug 2019 20:03:29 GMT
    Update time:      Tue, 13 Aug 2019 20:15:43 GMT
    Storage class:    MULTI_REGIONAL
    Content-Length:   452
    Content-Type:     text/markdown
    ...

You can use Ctrl + c to return to the command line.
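
If you only want to check one object's storage class rather than paging through the full listing, you can filter the detailed listing for that object, for example:

gsutil ls -L gs://${BUCKET}/ghcn_on_bq.ipynb | grep "Storage class"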

Delete your bucket

Before deleting a bucket, you must first delete all objects in the bucket. To delete all objects, execute the following command:

gsutil rm -rf gs://${BUCKET}/*

Now delete the bucket:

gsutil rb gs://${BUCKET}
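
As an aside, gsutil rm -r gs://${BUCKET} would remove the objects and the bucket itself in a single step; this lab deletes the objects and the bucket separately so that each stage can be verified.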

Click Check my progress to verify the objective.

Delete the bucket

Congratulations

You have now learned how to perform operations on Cloud Storage buckets and objects!

Finish Your Quest

This self-paced lab is part of the Qwiklabs Using the Cloud SDK Command Line Quest. A Quest is a series of related labs that form a learning path. Completing this Quest earns you a badge to recognize your achievement. You can make your badge public and link to it in your online resume or social media account. Enroll in this Quest and get immediate completion credit if you've taken this lab. See other available Qwiklabs Quests.

Take your next lab

Continue your Quest with BigQuery: Qwik Start - Command Line.

Next Steps/Learn More

For complete information about the gsutil command-line options, see the gsutil tool documentation.

End your lab

When you have completed your lab, click End Lab. Qwiklabs removes the resources you’ve used and cleans the account for you.

You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.

The number of stars indicates the following:

  • 1 star = Very dissatisfied
  • 2 stars = Dissatisfied
  • 3 stars = Neutral
  • 4 stars = Satisfied
  • 5 stars = Very satisfied

You can close the dialog box if you don't want to provide feedback.

For feedback, suggestions, or corrections, please use the Support tab.

Manual Last Updated: April 13, 2022
Lab Last Tested: April 13, 2022

Copyright 2022 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.