
Before you begin
- Labs create a Google Cloud project and resources for a fixed time.
- Labs have a time limit and no pause feature. If you end the lab, you'll have to restart from the beginning.
- On the top left of your screen, click Start lab to begin.
In this lab, you learn how to load data into BigQuery and run complex queries, and how to execute a Dataflow pipeline that carries out Map and Reduce operations, uses side inputs, and streams into BigQuery. In particular, you learn how to use BigQuery as a data source for Dataflow and how to use the results of one pipeline as a side input to another pipeline.
For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.
Sign in to Qwiklabs using an incognito window.
Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
There is no pause feature. You can restart if needed, but you have to start at the beginning.
When ready, click Start lab.
Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.
Click Open Google Console.
Click Use another account and copy/paste credentials for this lab into the prompts.
If you use other credentials, you'll receive errors or incur charges.
Accept the terms and skip the recovery resource page.
Google Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5 GB home directory and runs on Google Cloud.
Google Cloud Shell provides command-line access to your Google Cloud resources.
In Cloud console, on the top right toolbar, click the Open Cloud Shell button.
Click Continue.
It takes a few moments to provision and connect to the environment. When you are connected, you are already authenticated, and the project is set to your PROJECT_ID.
gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.
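For example, two standard gcloud commands (general-purpose commands, not specific to this lab) show which account you are authenticated as and which project is active:

```bash
# List the active (authenticated) account
gcloud auth list

# Show the project ID that gcloud is currently configured to use
gcloud config list project
```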
Use the Google Cloud Shell Code Editor to easily create and edit directories and files in the Cloud Shell instance.
You now have three interfaces available: the Cloud Shell command line, the Cloud Shell code editor, and the Google Cloud console.
Before you begin your work on Google Cloud, you need to ensure that your project has the correct permissions within Identity and Access Management (IAM).
In the Google Cloud console, on the Navigation menu, select IAM & Admin > IAM.
Confirm that the default compute Service Account {project-number}-compute@developer.gserviceaccount.com is present and has the editor role assigned. The account prefix is the project number, which you can find on Navigation menu > Cloud Overview > Dashboard.
If the account is not present in IAM or does not have the editor role, follow the steps below to assign the required role:
- On the Navigation menu, click Cloud Overview > Dashboard and copy the project number (for example, 729328892908).
- On the Navigation menu, select IAM & Admin > IAM, then click Grant Access.
- Enter the default compute Service Account, replacing {project-number} with your project number.
- Assign the Editor role and click Save.
For this lab, you will need the training-data-analyst repository.
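If the repository is not already present in your Cloud Shell home directory, it can be cloned from its public GitHub location with the standard git command below:

```bash
# Clone the public training-data-analyst repository into your home directory
git clone https://github.com/GoogleCloudPlatform/training-data-analyst

# Confirm the directory is present
ls
```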
You should see the training-data-analyst directory.
Follow these instructions to create a bucket.
In the Console, on the Navigation menu, click Cloud Storage > Buckets.
Click + Create.
Specify the following, and leave the remaining settings as their defaults:
Property | Value (type value or select option as specified)
---|---
Name |
Location type | Multi-region
Location |
Click Create.
If you get the Public access will be prevented prompt, select Enforce public access prevention on this bucket and click Confirm.
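As an alternative sketch (not one of the lab's steps), a bucket can also be created from Cloud Shell with gsutil; the US multi-region location and the bucket name below are placeholders:

```bash
# Create a multi-region bucket (replace the location and bucket name as needed)
gsutil mb -l US gs://YOUR_BUCKET_NAME/
```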
In the Cloud Shell, enter the following to create three environment variables named BUCKET, PROJECT, and REGION, and verify that each exists with the echo command:
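A minimal sketch, assuming the bucket name matches the project ID and using us-central1 as a placeholder region; substitute your own values:

```bash
# Project ID of this lab's project
PROJECT=$(gcloud config get-value project)

# Bucket name -- assumed here to match the project ID; use your actual bucket name
BUCKET=$PROJECT

# Region -- us-central1 is a placeholder; use the region you want to work in
REGION=us-central1

echo $PROJECT
echo $BUCKET
echo $REGION
```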
Return to the BigQuery web UI. If the Console is not already open, open it first.
On the Navigation menu, click BigQuery and then click Done.
Click + Compose a New Query to run a query.
Enter the following query:
What is being returned?
The BigQuery table cloud-training-demos.github_repos.contents_java contains the content (and some metadata) of all the Java files present in GitHub in 2016.
To find out how many Java files this table has, click + Compose a new query and enter a count query over the table, as sketched below.
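A minimal standard-SQL sketch, using the table name given above (the lab's exact query text may differ slightly):

```sql
SELECT
  COUNT(*)
FROM
  `cloud-training-demos.github_repos.contents_java`
```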
Is this a dataset you want to process locally or on the cloud?
Return to the browser tab for Console.
On the Navigation menu, click Dataflow and click on your job to monitor progress. The job may take up to 25 minutes to complete.
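As an alternative to watching the console, Dataflow job states can also be listed from Cloud Shell with a standard gcloud command; the REGION variable here is the one you set earlier:

```bash
# List Dataflow jobs and their current states in the chosen region
gcloud dataflow jobs list --region=$REGION
```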
Once the pipeline has finished executing, download and view the output by running the following commands in the Cloud Shell:
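A sketch, assuming the pipeline wrote a single CSV file under a javahelp/ prefix in your bucket; adjust the path to match what your pipeline actually produced:

```bash
# Assumed output location -- change the prefix and filename if your pipeline wrote elsewhere
gsutil cp gs://$BUCKET/javahelp/output.csv .
cat output.csv
```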
When you have completed your lab, click End Lab. Google Cloud Skills Boost removes the resources you’ve used and cleans the account for you.
You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.
The number of stars indicates the following:
- 1 star = Very dissatisfied
- 2 stars = Dissatisfied
- 3 stars = Neutral
- 4 stars = Satisfied
- 5 stars = Very satisfied
You can close the dialog box if you don't want to provide feedback.
For feedback, suggestions, or corrections, please use the Support tab.
Copyright 2022 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.