Rent-a-VM to Process Earthquake Data


40 minutes 1 Credit




Using Google Cloud to set up a virtual machine to process earthquake data frees you from IT minutia to focus on your scientific goals. You can ingest and process data, then present the results in various formats. In this lab, you will ingest real-time earthquake data published by the United States Geological Survey (USGS) and create maps that look like the following:

World map displaying earthquake indicators

In this lab you will spin up a virtual machine, access it remotely, and then manually create a pipeline to retrieve, process and publish the data.

What you will learn

In this lab, you will learn how to do the following:

  • Create a Compute Engine instance with specific security permissions.
  • SSH into the instance.
  • Install the software package Git (for source code version control).
  • Ingest data into the Compute Engine instance.
  • Transform data on the Compute Engine instance.
  • Store the transformed data on Cloud Storage.
  • Publish Cloud Storage data to the web.


Before you click the Start Lab button

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.

This hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito or private browser window to run this lab. This prevents any conflicts between your personal account and the Student account, which may cause extra charges incurred to your personal account.
  • Time to complete the lab. Remember, once you start, you cannot pause a lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab to avoid extra charges to your account.

How to start your lab and sign in to the Google Cloud Console

  1. Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is the Lab Details panel with the following:

    • The Open Google Console button
    • Time remaining
    • The temporary credentials that you must use for this lab
    • Other information, if needed, to step through this lab
  2. Click Open Google Console. The lab spins up resources, and then opens another tab that shows the Sign in page.

    Tip: Arrange the tabs in separate windows, side-by-side.

    Note: If you see the Choose an account dialog, click Use Another Account.
  3. If necessary, copy the Username from the Lab Details panel and paste it into the Sign in dialog. Click Next.

  4. Copy the Password from the Lab Details panel and paste it into the Welcome dialog. Click Next.

    Important: You must use the credentials from the left panel. Do not use your Google Cloud Skills Boost credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.
  5. Click through the subsequent pages:

    • Accept the terms and conditions.
    • Do not add recovery options or two-factor authentication (because this is a temporary account).
    • Do not sign up for free trials.

After a few moments, the Cloud Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking the Navigation menu at the top-left.

Task 1. Create Compute Engine instance with the necessary API access

  1. To create a Compute Engine instance, from the Navigation menu click on Compute Engine > VM instances:

  2. Click Create Instance and wait for the "Create an instance" form to load.

  3. Use default Region and Zone for creating the instance:

  4. In the Boot Disk section, click Change.

  5. Change the Version to Debian GNU/Linux 10 (buster).

  6. Leave the other settings as is and click Select.

  7. Under Identity and API access, change the access scopes for the Compute Engine default service account to Allow full access to all Cloud APIs, then click Create.

You'll see a green circle with a check when the instance is created.

Click Check my progress below to verify you're on track in this lab.

Create a Compute Engine instance with the necessary API access

Task 2. SSH into the instance

You can remotely access your Compute Engine instance using Secure Shell (SSH):

  1. Click the SSH button next to your newly created VM:

An SSH terminal session opens in a new browser window.

Note: Make sure your browser is not blocking pop-ups.

SSH keys are automatically transferred; no extra software is needed to SSH directly from the browser.

  2. To find some information about the Compute Engine instance, type the following into the command line:

cat /proc/cpuinfo

You should see a similar output:

processor       : 0
vendor_id       : GenuineIntel
cpu family      : 6
model           : 63
model name      : Intel(R) Xeon(R) CPU @ 2.30GHz
....
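Beyond /proc/cpuinfo, a few standard commands give a quick picture of the VM's resources. This is a hedged sketch; the exact numbers depend on the machine type you chose:

```shell
# Quick resource overview of the VM (values vary by machine type)
nproc              # number of vCPUs
free -h            # memory, in human-readable units
df -h /            # free space on the boot disk
```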

Task 3. Install software

  1. Still in the SSH window, enter the following commands:

sudo apt-get update
sudo apt-get -y -qq install git
sudo apt-get install python-mpltoolkits.basemap
  2. Enter Y when asked if it's acceptable to use additional disk space.

  3. Verify that git is now installed:

git --version

You should see a similar output:

git version 2.11.0
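Before moving on, you can confirm that the installed tools are actually on the PATH. A hedged sketch using `command -v`, the portable existence check:

```shell
# Check that each required command is available on the PATH
for cmd in git python; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd found"
  else
    echo "$cmd MISSING"
  fi
done
```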

Click Check my progress below to verify you're on track in this lab.

Install software

Task 4. Ingest USGS data

  1. Still in the SSH window, enter the following command to download the code from GitHub:

git clone https://github.com/GoogleCloudPlatform/training-data-analyst

Note: If you get a git authorization error, it is likely that the GitHub URL has a typo in it. Please copy and paste the above command.
  2. Navigate to the folder corresponding to this lab:

cd training-data-analyst/CPB100/lab2b
  3. Examine the ingest code using less:


The less command allows you to view the file (Press the spacebar to scroll down; the letter b to back up a page; the letter q to quit).

  4. Press q to exit less.

The program downloads a dataset of earthquakes from the past 7 days from the US Geological Survey (USGS). Notice where the file is downloaded to (local disk or Cloud Storage).

  5. Enter the following command to run the ingest code:


Click Check my progress below to verify you're on track in this lab.

Ingest USGS data
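After the ingest code runs, the downloaded CSV can be inspected with standard shell tools. As a hedged illustration (field positions are assumed from the USGS feed header, where magnitude is the fifth column; real rows may contain quoted commas in the place field), this prints quakes of magnitude 5 or greater:

```shell
# List time and magnitude of quakes with magnitude >= 5
# (field 5 = mag, per the USGS CSV header; NR > 1 skips the header row)
awk -F',' 'NR > 1 && $5 >= 5 { print $1, $5 }' earthquakes.csv
```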

Task 5. Transform the data

You will use a Python program to transform the raw data into a map of earthquake activity:

The transformation code is explained in detail in this notebook.

Feel free to read the narrative to understand what the transformation code does. The notebook itself was written in Datalab, a Google Cloud product that you will use later in this set of labs.

  1. Still in the Compute Engine instance, enter the following command to install the necessary Python packages on the Compute Engine instance:

  2. Enter the following command to run the transformation code:

  3. List the directory contents; you will notice a new image file, earthquakes.png:

ls -l
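As a quick sanity check that the transform actually produced the image, you can test the file directly (a hedged one-liner; `-s` checks that the file exists and is non-empty):

```shell
# Verify the transform output exists and is non-empty
if [ -s earthquakes.png ]; then
  echo "earthquakes.png created"
else
  echo "earthquakes.png missing or empty"
fi
```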

Click Check my progress below to verify you're on track in this lab.

Transform the data

Task 6. Create a Cloud Storage bucket

Return to the Cloud Console for this step.

  1. From the Navigation menu select Cloud Storage:

  2. Click on Create Bucket, then create your bucket with the following characteristics:

  • Choose a globally unique bucket name (but not a name you'd like to use for your own projects), then click Continue.

  • You can leave it as Multi-Regional, or improve speed and reduce costs by making it Regional (choose the same region as your Compute Engine instance).

  • For Choose how to control access to objects, uncheck the box for Enforce public access prevention on this bucket and select Fine-grained for Access control.

  3. Then, click Create.

Take note of your bucket name. You will insert its name whenever the instructions ask for <YOUR-BUCKET>.

Task 7. Store data

You will now learn how to store the original and transformed data in Cloud Storage.

  1. In the SSH window of the Compute Engine instance, run the following, changing <YOUR-BUCKET> to the bucket name you created earlier:

gsutil cp earthquakes.* gs://<YOUR-BUCKET>/earthquakes/

This command copies the files to your bucket in Cloud Storage.

  2. Return to the Cloud Console and on the Storage Browser page click on the Refresh button near the top of the page. Now click on the bucket name, then the /earthquakes folder.

You should now see the following three files in the earthquakes folder:

  • earthquakes.csv
  • earthquakes.htm
  • earthquakes.png

Click Check my progress below to verify you're on track in this lab.

Create bucket and Store data

Task 8. Publish Cloud Storage files to web

You will now publish the files in your bucket to the web.

  1. To create a publicly accessible URL for the files, click on the earthquakes.htm file, then click the three dots at the end of the row and select Edit access from the dropdown menu.

  2. In the overlay that appears, click the + Add entry button.

  3. Add a permission for all users by entering the following:

  • Select Public for the Entity.
  • Enter allUsers for the Name.
  • Select Reader for the Access.
  • Then click Save.

Edit access page

  4. Repeat the above steps for earthquakes.png.

  5. Click on the name of a file and notice the URL of the published Cloud Storage file and how it relates to your bucket name and content. It should resemble the following:
  6. If you click on the earthquakes.png image file and then on the public URL, a new tab opens with the following image loaded:

World map with earthquake indicators
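Public Cloud Storage objects are served from storage.googleapis.com under a fixed URL pattern. As a sketch with a placeholder bucket name (substitute your own):

```shell
# Public objects are served at https://storage.googleapis.com/<bucket>/<object>
# BUCKET is a placeholder; substitute the bucket name you created earlier
BUCKET=my-earthquake-bucket
echo "https://storage.googleapis.com/${BUCKET}/earthquakes/earthquakes.png"
```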

  7. Go ahead and close the SSH window.


You have completed this lab and learned how to spin up a Compute Engine instance, access it remotely, and manually create a pipeline to retrieve, process, and publish data.

Finish Your Quest

This self-paced lab is part of the Scientific Data Processing quest. A quest is a series of related labs that form a learning path. Completing this quest earns you a badge to recognize your achievement. You can make your badge or badges public and link to them in your online resume or social media account. Enroll in this quest or any quest that contains this lab and get immediate completion credit. See the Google Cloud Skills Boost catalog to see all available quests.

Take Your Next Lab

Continue your Quest with Weather Data in BigQuery, or try Distributed Image Processing in Cloud Dataproc.

Next steps / Learn more

Here are some follow-up steps:

Google Cloud training and certification

...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.

Manual Last Updated June 9, 2022

Lab Last Tested May 11, 2022

Copyright 2023 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.