Introduction to Cloud Dataproc: Hadoop and Spark on Google Cloud
GSP123
Overview
Cloud Dataproc is a managed Spark and Hadoop service that lets you take advantage of open source data tools for batch processing, querying, streaming, and machine learning. Cloud Dataproc automation helps you create clusters quickly, manage them easily, and save money by turning clusters off when you don't need them. With less time and money spent on administration, you can focus on your jobs and your data.
This lab is adapted from https://cloud.google.com/dataproc/quickstart-console.
What you'll learn
- How to create a managed Cloud Dataproc cluster (with Apache Spark pre-installed)
- How to submit a Spark job
- How to shut down your cluster
Setup and Requirements
Before you click the Start Lab button
Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.
This hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.
What you need
To complete this lab, you need:
- Access to a standard internet browser (Chrome browser recommended).
- Time to complete the lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab.
Note: If you are using a Chrome OS device, open an Incognito window to run this lab.
How to start your lab and sign in to the Google Cloud Console
1. Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is a panel populated with the temporary credentials that you must use for this lab.
2. Copy the username, and then click Open Google Console. The lab spins up resources, and then opens another tab that shows the Sign in page.
Tip: Open the tabs in separate windows, side-by-side.
3. On the Sign in page, paste the username that you copied from the left panel. Then copy and paste the password.
Important: You must use the credentials from the left panel. Do not use your Google Cloud Training credentials. If you have your own Google Cloud account, do not use it for this lab (this avoids incurring charges to your account).
4. Click through the subsequent pages:
- Accept the terms and conditions.
- Do not add recovery options or two-factor authentication (because this is a temporary account).
- Do not sign up for free trials.
After a few moments, the Cloud Console opens in this tab.
Check project permissions
Before you begin your work on Google Cloud, you need to ensure that your project has the correct permissions within Identity and Access Management (IAM).
1. In the Google Cloud console, on the Navigation menu, click IAM & Admin > IAM.
2. Confirm that the default compute Service Account {project-number}-compute@developer.gserviceaccount.com is present and has the editor role assigned. The account prefix is the project number, which you can find on Navigation menu > Home.
If the account is not present in IAM or does not have the editor role, follow the steps below to assign the required role.
1. In the Google Cloud console, on the Navigation menu, click Home.
2. Copy the project number (e.g. 729328892908).
3. On the Navigation menu, click IAM & Admin > IAM.
4. At the top of the IAM page, click Add.
5. For New principals, type {project-number}-compute@developer.gserviceaccount.com, replacing {project-number} with your project number.
6. For Role, select Project (or Basic) > Editor. Click Save.
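If you prefer to verify the binding from code rather than the console, here is a minimal sketch using the google-cloud-resource-manager Python client. This is an illustration, not a lab step; the placeholders and the package itself are assumptions you would need to adapt to your environment.

```python
# A minimal sketch (not part of the lab): check whether the default compute
# service account holds the Editor role. Assumes
# `pip install google-cloud-resource-manager` and authenticated application
# default credentials; PROJECT_ID and PROJECT_NUMBER are placeholders.
from google.cloud import resourcemanager_v3

PROJECT_ID = "your-project-id"    # placeholder: your lab project ID
PROJECT_NUMBER = "729328892908"   # placeholder: the example number above

client = resourcemanager_v3.ProjectsClient()
policy = client.get_iam_policy(request={"resource": f"projects/{PROJECT_ID}"})

member = f"serviceAccount:{PROJECT_NUMBER}-compute@developer.gserviceaccount.com"
has_editor = any(
    binding.role == "roles/editor" and member in binding.members
    for binding in policy.bindings
)
print(f"{member} has roles/editor: {has_editor}")
```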
Create a Cloud Dataproc cluster
In the console, click Navigation menu > Dataproc > Clusters (top left of the screen).
To create a new cluster, click Create cluster.
There are many parameters you can configure when creating a new cluster. Set values for the parameters listed below, and leave the default settings for the other parameters.
Parameter | Value |
---|---|
Name | qlab |
Region | us-central1 |
Zone | us-central1-c |
Click Configure nodes and set the following machine types, leaving the other node settings at their defaults:
Parameter | Value |
---|---|
Master node - Machine type | 4 vCPUs (n1-standard-4) |
Worker node - Machine type | 2 vCPUs (n1-standard-2) |
Click Create to create the new cluster. You will see the Status go from Provisioning to Running. Move on to the next step once the Status shows Running.
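If you later want to script this step instead of using the console (the console flow above is all this lab requires), the following is a minimal sketch using the google-cloud-dataproc Python client; PROJECT_ID is a placeholder you must fill in.

```python
# A minimal sketch (assumes `pip install google-cloud-dataproc` and
# authenticated credentials; PROJECT_ID is a placeholder). Creates a
# cluster equivalent to the one configured above in the console.
from google.cloud import dataproc_v1

PROJECT_ID = "your-project-id"  # placeholder
REGION = "us-central1"

cluster_client = dataproc_v1.ClusterControllerClient(
    client_options={"api_endpoint": f"{REGION}-dataproc.googleapis.com:443"}
)

cluster = {
    "project_id": PROJECT_ID,
    "cluster_name": "qlab",
    "config": {
        "gce_cluster_config": {"zone_uri": "us-central1-c"},
        "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-4"},
        "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-2"},
    },
}

# create_cluster returns a long-running operation; result() blocks until
# the cluster reaches the Running state.
operation = cluster_client.create_cluster(
    request={"project_id": PROJECT_ID, "region": REGION, "cluster": cluster}
)
result = operation.result()
print(f"Cluster created: {result.cluster_name}")
```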
Test Completed Task
Click Check my progress to verify the task you performed. If you have completed the task successfully, you will be granted an assessment score.
Submit a Spark job to your cluster
Select Jobs to switch to Dataproc's jobs view.
Click Submit job.
Set values for the parameters listed below, and leave the default settings for the other parameters.
Parameter | Value |
---|---|
Region | us-central1 |
Cluster | qlab |
Job type | Spark |
Main class or jar | org.apache.spark.examples.SparkPi |
Jar files | file:///usr/lib/spark/examples/jars/spark-examples.jar |
Arguments | 1000 (This sets the number of tasks) |
Click Submit.
Your job should appear in the Jobs list, which shows all your project's jobs with their cluster, type, and current status. The new job displays as "Running"; move on once you see "Succeeded" as the Status.
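For reference, the same submission can be scripted with the google-cloud-dataproc Python client. This is a sketch under the same assumptions as the cluster-creation example above, not a required lab step.

```python
# A minimal sketch (same assumptions as the cluster-creation sketch above).
# Submits the SparkPi example to the qlab cluster and waits for completion.
from google.cloud import dataproc_v1

PROJECT_ID = "your-project-id"  # placeholder
REGION = "us-central1"

job_client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{REGION}-dataproc.googleapis.com:443"}
)

job = {
    "placement": {"cluster_name": "qlab"},
    "spark_job": {
        "main_class": "org.apache.spark.examples.SparkPi",
        "jar_file_uris": ["file:///usr/lib/spark/examples/jars/spark-examples.jar"],
        "args": ["1000"],  # number of tasks, as in the console form above
    },
}

# submit_job_as_operation returns a long-running operation that completes
# when the job finishes (shown as "Succeeded" in the console).
operation = job_client.submit_job_as_operation(
    request={"project_id": PROJECT_ID, "region": REGION, "job": job}
)
response = operation.result()
print(f"Job finished with state: {response.status.state.name}")
```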
Test Completed Task
Click Check my progress to verify the task you performed. If you have completed the task successfully, you will be granted an assessment score.
To see your completed job's output, click the job ID in the Jobs list.
To avoid horizontal scrolling, set Line Wrap to ON.
You should see that your job has successfully calculated a rough value for pi!
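For context: SparkPi estimates pi with a Monte Carlo method, sampling random points in the unit square and counting the fraction that fall inside the quarter circle of radius 1, so that pi ≈ 4 × hits / samples. A plain, non-distributed Python sketch of the idea:

```python
# A plain-Python sketch of the Monte Carlo estimate that SparkPi distributes
# across the cluster: sample points in the unit square and count how many
# fall inside the quarter circle of radius 1.
import random

samples = 1_000_000
hits = sum(
    1 for _ in range(samples)
    if random.random() ** 2 + random.random() ** 2 <= 1.0
)
print(f"pi is roughly {4.0 * hits / samples}")
```

The more tasks (samples) the job runs, the closer the estimate gets, which is why the Arguments value above controls the precision of the result.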
Shut down your cluster
You can shut down a cluster on the Clusters page.
Select the checkbox next to the qlab cluster and click Delete.
Click CONFIRM to confirm deletion.
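For completeness, deletion can also be scripted. A minimal sketch under the same assumptions as the earlier examples:

```python
# A minimal sketch (same assumptions as the earlier sketches). Deletes the
# qlab cluster so it stops incurring charges.
from google.cloud import dataproc_v1

PROJECT_ID = "your-project-id"  # placeholder
REGION = "us-central1"

cluster_client = dataproc_v1.ClusterControllerClient(
    client_options={"api_endpoint": f"{REGION}-dataproc.googleapis.com:443"}
)
operation = cluster_client.delete_cluster(
    request={"project_id": PROJECT_ID, "region": REGION, "cluster_name": "qlab"}
)
operation.result()  # blocks until deletion completes
print("Cluster deleted.")
```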
Test your Understanding
Below are multiple-choice questions to reinforce your understanding of this lab's concepts. Answer them to the best of your ability.
Congratulations!
You learned how to create a Dataproc cluster, submit a Spark job, and shut down your cluster!
Next Steps / Learn More
Continue your Google Cloud learning with these suggestions:
- Learn more about Dataproc by exploring Dataproc Documentation.
- Take more labs, for example Provisioning and Using a Managed Hadoop/Spark Cluster with Cloud Dataproc (Command Line).
- Start a Quest! A Qwiklabs Quest is a series of related labs that form a learning path. Completing a Quest earns you a digital badge, to recognize your achievement. You can make your badge (or badges) public and link to them in your online resume or social media account. See available Qwiklabs Quests.
Google Cloud Training & Certification
...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.
Manual Last Updated February 25, 2021
Lab Last Tested February 25, 2021
Copyright 2022 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.