
Before you begin
- Labs create a Google Cloud project and resources for a fixed time
- Labs have a time limit and no pause feature. If you end the lab, you'll have to restart from the beginning.
- On the top left of your screen, click Start lab to begin
In this lab, you will open a Dataflow project, learn how to write a simple Dataflow pipeline, use pipeline filtering, and execute the pipeline both locally and on the cloud.
For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.
Sign in to Qwiklabs using an incognito window.
Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
There is no pause feature. You can restart if needed, but you have to start at the beginning.
When ready, click Start lab.
Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.
Click Open Google Console.
Click Use another account and copy/paste credentials for this lab into the prompts.
If you use other credentials, you'll receive errors or incur charges.
Accept the terms and skip the recovery resource page.
Google Cloud Shell is a virtual machine loaded with development tools. It offers a persistent 5 GB home directory and runs on Google Cloud.
Google Cloud Shell provides command-line access to your Google Cloud resources.
In Cloud console, on the top right toolbar, click the Open Cloud Shell button.
Click Continue.
It takes a few moments to provision and connect to the environment. When you are connected, you are already authenticated, and the project is set to your PROJECT_ID. For example:
gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.
Use the Google Cloud Shell Code Editor to easily create and edit directories and files in the Cloud Shell instance.
You now have three interfaces available: the Cloud Shell command line, the Cloud Shell code editor, and the Google Cloud console.
Before you begin your work on Google Cloud, you need to ensure that your project has the correct permissions within Identity and Access Management (IAM).
In the Google Cloud console, on the Navigation menu, select IAM & Admin > IAM.
Confirm that the default compute service account {project-number}-compute@developer.gserviceaccount.com is present and has the editor role assigned. The account prefix is the project number, which you can find on Navigation menu > Cloud Overview > Dashboard (for example, 729328892908).
If the service account is not present or does not have the editor role, follow the steps below to assign the required role, replacing {project-number} with your project number.
Specific steps must be completed to successfully execute this lab:
Verify that you have a Cloud Storage bucket (one was created for you automatically when the lab environment started).
On the Google Cloud Console title bar, click Activate Cloud Shell. If prompted, click Continue. Then clone the lab code GitHub repository using the following command:
The goal of this lab is to become familiar with the structure of a Dataflow project and learn how to execute a Dataflow pipeline. You will use the powerful build tool Maven to create a new Dataflow project.
In the Cloud Shell code editor, navigate to the directory /training-data-analyst/courses/data_analysis/lab2.
Then select the path javahelp/src/main/java/com/google/cloud/training/dataanalyst/javahelp/ and view the file Grep.java.
Alternatively, you can view the file with the nano editor. Do not make any changes to the code.
Can you answer these questions about the file Grep.java?
There are three apply statements in the pipeline:
The job writes its results to the file output.txt. If the output is large enough, it will be sharded into separate parts with names like output-00000-of-00001. If necessary, you can locate the correct file by examining its timestamp. Does the output seem logical?
In the Cloud Shell code editor navigate to the directory /training-data-analyst/courses/data_analysis/lab2/javahelp/src/main/java/com/google/cloud/training/dataanalyst/javahelp
Edit the Dataflow pipeline in the file Grep.java:
Example lines before:
What is the difference between this Maven command and the one to run locally?
Because this is such a small job, running on the cloud will take significantly longer than running it locally (on the order of 2-3 minutes).
Example output when the command completes:
Example:
Wait for the job status to turn to Succeeded. At this point, your Cloud Shell will display a command-line prompt.
Examine the output in the Cloud Storage bucket. On the Navigation menu, click Cloud Storage > Buckets, and then click your bucket.
Click the javahelp directory. This job generates the file output.txt. If the file is large enough, it will be sharded into multiple parts with names like output-0000x-of-000y. You can identify the most recent file by name or by the Last modified field. Click the file to view it.
Alternatively, you could download the file in Cloud Shell and view it:
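A sketch of that download follows. BUCKET is a placeholder for your lab bucket's name (substitute it yourself); the local files below merely demonstrate how sharded outputs concatenate in shard order with a glob:

```shell
# Copy the job output from the bucket into Cloud Shell, then view it.
# BUCKET is a placeholder -- replace it with your bucket's name.
# gsutil cp gs://BUCKET/javahelp/output* .

# Sharded outputs concatenate in shard order when expanded by a glob:
printf 'first shard\n'  > output-00000-of-00002
printf 'second shard\n' > output-00001-of-00002
cat output-*-of-00002
```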
When you have completed your lab, click End Lab. Google Cloud Skills Boost removes the resources you’ve used and cleans the account for you.
You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.
The number of stars indicates the following:
You can close the dialog box if you don't want to provide feedback.
For feedback, suggestions, or corrections, please use the Support tab.
Copyright 2022 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.