

Building Batch Data Pipelines on Google Cloud


Serverless Data Analysis with Dataflow: Side Inputs (Java)

Lab · 1 hour 30 minutes · 5 Credits · Advanced
Note: This lab may incorporate AI tools to support your learning.

Overview

In this lab, you load data into BigQuery and run complex queries. You then execute a Dataflow pipeline that carries out Map and Reduce operations, uses side inputs, and streams into BigQuery.

Objective

In this lab, you learn how to use BigQuery as a data source for Dataflow, and how to use the results of one pipeline as a side input to another pipeline.

  • Read data from BigQuery into Dataflow
  • Use the output of a pipeline as a side-input to another pipeline

Setup

For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.

  1. Sign in to Qwiklabs using an incognito window.

  2. Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
    There is no pause feature. You can restart if needed, but you have to start at the beginning.

  3. When ready, click Start lab.

  4. Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.

  5. Click Open Google Console.

  6. Click Use another account and copy/paste credentials for this lab into the prompts.
    If you use other credentials, you'll receive errors or incur charges.

  7. Accept the terms and skip the recovery resource page.

Activate Google Cloud Shell

Google Cloud Shell is a virtual machine loaded with development tools. It offers a persistent 5 GB home directory and runs on Google Cloud.

Google Cloud Shell provides command-line access to your Google Cloud resources.

  1. In Cloud console, on the top right toolbar, click the Open Cloud Shell button.

  2. Click Continue.

It takes a few moments to provision and connect to the environment. When you are connected, you are already authenticated, and the project is set to your PROJECT_ID.

gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.

  • You can list the active account name with this command:
gcloud auth list

Output:

Credentialed accounts: - @.com (active)

Example output:

Credentialed accounts: - google1623327_student@qwiklabs.net
  • You can list the project ID with this command:
gcloud config list project

Output:

[core] project =

Example output:

[core] project = qwiklabs-gcp-44776a13dea667a6

Note: Full documentation of gcloud is available in the gcloud CLI overview guide.

Launch Google Cloud Shell Code Editor

Use the Google Cloud Shell Code Editor to easily create and edit directories and files in the Cloud Shell instance.

  • Once you activate the Google Cloud Shell, click Open editor to open the Cloud Shell Code Editor.

You now have three interfaces available:

  • The Cloud Shell Code Editor
  • The Console (switch back and forth between the Console and Cloud Shell by clicking the corresponding tab)
  • The Cloud Shell command line (click Open Terminal in the Console)

Check project permissions

Before you begin your work on Google Cloud, you need to ensure that your project has the correct permissions within Identity and Access Management (IAM).

  1. In the Google Cloud console, on the Navigation menu, select IAM & Admin > IAM.

  2. Confirm that the default compute Service Account {project-number}-compute@developer.gserviceaccount.com is present and has the editor role assigned. The account prefix is the project number, which you can find on Navigation menu > Cloud Overview > Dashboard.

Note: If the account is not present in IAM or does not have the editor role, follow the steps below to assign the required role.
  1. In the Google Cloud console, on the Navigation menu, click Cloud Overview > Dashboard.
  2. Copy the project number (e.g. 729328892908).
  3. On the Navigation menu, select IAM & Admin > IAM.
  4. At the top of the roles table, below View by Principals, click Grant Access.
  5. For New principals, type:
{project-number}-compute@developer.gserviceaccount.com
  6. Replace {project-number} with your project number.
  7. For Role, select Project (or Basic) > Editor.
  8. Click Save.

Task 1. Preparation

For this lab, you will need the training-data-analyst repository.

Download Code Repository

  1. Clone the repository from the Cloud Shell command line:
git clone https://github.com/GoogleCloudPlatform/training-data-analyst
  2. Click the Refresh icon in the left navigator panel.

You should see the training-data-analyst directory.

Ensure that the Dataflow API is successfully enabled

  1. In Cloud Shell, run the following commands to ensure that the Dataflow API is enabled cleanly in your project:
gcloud services disable dataflow.googleapis.com --force
gcloud services enable dataflow.googleapis.com

Create a Cloud Storage bucket

Follow these instructions to create a bucket.

  1. In the Console, on the Navigation menu, click Cloud Storage > Buckets.

  2. Click + Create.

  3. Specify the following, and leave the remaining settings as their defaults:

  Property: Value (type value or select option as specified)
  Name:
  Location type: Multi-region
  Location:
  4. Click Create.

  5. If you get the Public access will be prevented prompt, select Enforce public access prevention on this bucket and click Confirm.

  6. In Cloud Shell, enter the following to create three environment variables, named BUCKET, PROJECT, and REGION, and verify that each exists with the echo command:

BUCKET="{{{project_0.project_id|project_place_holder_text}}}"
echo $BUCKET
PROJECT="{{{project_0.project_id|project_place_holder_text}}}"
echo $PROJECT
REGION="{{{project_0.default_region|region_place_holder_text}}}"
echo $REGION

Task 2. Try out a BigQuery query

  1. Return to the BigQuery web UI. If it is not already open, open Console.

  2. On the Navigation menu, click BigQuery and then click Done.

  3. Click + Compose a New Query to run a query.

  4. Enter the following query:

SELECT content FROM `cloud-training-demos.github_repos.contents_java` LIMIT 10
  5. Click Run.

What is being returned?

The BigQuery table cloud-training-demos.github_repos.contents_java contains the content (and some metadata) of all the Java files present in GitHub in 2016.

  1. To find out how many Java files this table has, click + to compose a new query.

  2. Enter the following query:

SELECT COUNT(*) FROM `cloud-training-demos.github_repos.contents_java`
  3. Click Run.

Is this a dataset you want to process locally or on the cloud?
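Each row's content field holds an entire Java file as a single string. As a rough illustration of what the pipeline you explore next does with that field, here is a hypothetical plain-Java sketch (not the lab's Beam code) that splits a file's content into lines and counts the lines that signal a call for help:

```java
// Hypothetical sketch (plain Java, not Beam) of splitting a BigQuery
// "content" value into lines and counting FIXME/TODO markers.
public class HelpCounter {

    // Split a file's content into individual lines.
    public static String[] toLines(String content) {
        return content.split("\n");
    }

    // Count the lines that contain a FIXME or TODO marker.
    public static int countCallsForHelp(String content) {
        int calls = 0;
        for (String line : toLines(content)) {
            if (line.contains("FIXME") || line.contains("TODO")) {
                calls++;
            }
        }
        return calls;
    }

    public static void main(String[] args) {
        String file = "class A {\n// FIXME: null check\n// TODO: log\nint x;\n}";
        System.out.println(countCallsForHelp(file)); // prints 2
    }
}
```

The class and method names here are illustrative only; the actual transforms appear in JavaProjectsThatNeedHelp.java, which you open in the next task.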

Task 3. Explore the pipeline code

  1. In the Cloud Shell editor, or in Cloud Shell, navigate to the lab directory:
cd ~/training-data-analyst/courses/data_analysis/lab2
  2. View the pipeline code using the Cloud Shell editor or nano:
Note: Do not make any changes to the code.
cd ~/training-data-analyst/courses/data_analysis/lab2/javahelp
nano src/main/java/com/google/cloud/training/dataanalyst/javahelp/JavaProjectsThatNeedHelp.java
  3. Refer to the pipeline diagram in the lab as you read the code.

  4. Answer the following questions:
  • Looking at the class documentation at the very top, what is the purpose of this pipeline?
  • Where does GetJava get Java content from?
  • What does ToLines do? (Hint: look at the content field of the BigQuery result)
  • Why is the result of ToLines stored in a named PCollection instead of being directly passed to another apply()?
  • What are the two actions carried out on javaContent?
  • If a file has 3 FIXMEs and 2 TODOs in its content (on different lines), how many calls for help are associated with it?
  • If a file is in the package com.google.devtools.build, what are the packages that it is associated with?
  • Why is the numHelpNeeded variable not enough? Why do we need to do Sum.integersPerKey()? (Hint: there are multiple files in a package)
  • Why is this converted to a View?
  • Which operation uses the View as a side input?
  • Instead of simply ParDo.of(), this operation uses
  • Besides c.element() and c.output(), this operation also makes use of what method in ProcessContext?
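To help reason about the side-input questions above, here is a hedged plain-Java sketch (not Beam) of the pattern the pipeline uses: one branch sums help requests per package (as Sum.integersPerKey does), the result is materialized as a Map (as a View would be), and a second branch looks values up in that Map while processing its own elements (as c.sideInput() does). All names below are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical plain-Java sketch of the side-input pattern:
// aggregate per key once, then consult the result from another branch.
public class SideInputDemo {

    // Branch 1: sum (package, count) pairs per key,
    // analogous to Sum.integersPerKey.
    public static Map<String, Integer> sumPerKey(String[][] pairs) {
        Map<String, Integer> sums = new HashMap<>();
        for (String[] pair : pairs) {
            sums.merge(pair[0], Integer.parseInt(pair[1]), Integer::sum);
        }
        return sums;
    }

    // Branch 2: while processing a package, consult the materialized
    // Map instead of recomputing the counts, analogous to c.sideInput().
    public static int helpNeeded(String pkg, Map<String, Integer> sideInput) {
        return sideInput.getOrDefault(pkg, 0);
    }

    public static void main(String[] args) {
        String[][] helpByFile = {
            {"com.google.devtools", "3"},
            {"com.google.devtools", "2"},  // several files in one package
            {"com.example", "1"}
        };
        Map<String, Integer> view = sumPerKey(helpByFile);
        System.out.println(helpNeeded("com.google.devtools", view)); // prints 5
    }
}
```

Note how summing happens once per key across all files; this is why a single numHelpNeeded value per file is not enough, since a package usually contains multiple files.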

Task 4. Execute the pipeline

  1. Execute the pipeline by typing the following into Cloud Shell:
cd ~/training-data-analyst/courses/data_analysis/lab2/javahelp
./run_oncloud3.sh $PROJECT $BUCKET JavaProjectsThatNeedHelp $REGION
Note: Wait until the command is fully executed. It will take around 5 to 7 minutes.
  2. Return to the browser tab for Console.

  3. On the Navigation menu, click Dataflow and click on your job to monitor progress. The job may take up to 25 minutes to complete.

  4. Once the pipeline has finished executing, download and view the output by running the following commands in Cloud Shell:

BUCKET="{{{project_0.project_id|project_place_holder_text}}}"
gcloud storage cp gs://$BUCKET/javahelp/output.csv .
head output.csv

End your lab

When you have completed your lab, click End Lab. Google Cloud Skills Boost removes the resources you’ve used and cleans the account for you.

You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.

The number of stars indicates the following:

  • 1 star = Very dissatisfied
  • 2 stars = Dissatisfied
  • 3 stars = Neutral
  • 4 stars = Satisfied
  • 5 stars = Very satisfied

You can close the dialog box if you don't want to provide feedback.

For feedback, suggestions, or corrections, please use the Support tab.

Copyright 2022 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.
