Predict Housing Prices with Tensorflow and AI Platform

1 hour 30 minutes 5 Credits

GSP418

Overview

In this lab, you will build an end-to-end machine learning solution with TensorFlow 1.x and AI Platform, leveraging the cloud for distributed training and online prediction.

This lab uses tf.estimator, a high-level TensorFlow API that greatly simplifies machine learning programming. Estimators encapsulate the following actions (see the sketch after this list):

  • training
  • evaluation
  • prediction
  • export for serving
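
To illustrate that lifecycle, here is a minimal sketch using a canned LinearRegressor on tiny synthetic data. This is not the code from the lab notebook: the feature name, data, model directory, and export path are placeholders, and it assumes the TensorFlow 1.x API that this lab targets.

    import numpy as np
    import tensorflow as tf  # TensorFlow 1.x API, as used in this lab

    # Tiny synthetic data, purely illustrative (the lab notebook uses a real housing dataset).
    x_train = np.array([2.0, 3.0, 4.0, 5.0], dtype=np.float32)
    y_train = np.array([200.0, 300.0, 400.0, 500.0], dtype=np.float32)

    feature_cols = [tf.feature_column.numeric_column("num_rooms")]

    train_input_fn = tf.estimator.inputs.numpy_input_fn(
        x={"num_rooms": x_train}, y=y_train, batch_size=4, num_epochs=None, shuffle=True)
    eval_input_fn = tf.estimator.inputs.numpy_input_fn(
        x={"num_rooms": x_train}, y=y_train, batch_size=4, num_epochs=1, shuffle=False)

    estimator = tf.estimator.LinearRegressor(
        feature_columns=feature_cols, model_dir="trained_model")

    estimator.train(input_fn=train_input_fn, steps=100)            # training
    metrics = estimator.evaluate(input_fn=eval_input_fn)           # evaluation
    predictions = list(estimator.predict(input_fn=eval_input_fn))  # prediction

    # Export for serving: describe how a serving request maps to model features.
    def serving_input_fn():
        inputs = {"num_rooms": tf.placeholder(tf.float32, [None])}
        return tf.estimator.export.ServingInputReceiver(inputs, inputs)

    estimator.export_saved_model("exported_model", serving_input_fn)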

This lab focuses on interacting with Jupyter and AI Platform. Concepts and code blocks that are not central to those topics are glossed over and are simply provided for you to execute in your Jupyter notebook.

Set up

Before you click the Start Lab button

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.

This hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito or private browser window to run this lab. This prevents conflicts between your personal account and the Student account, which may incur extra charges on your personal account.
  • Time to complete the lab. Remember, once you start, you cannot pause a lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab to avoid extra charges to your account.

How to start your lab and sign in to the Google Cloud Console

  1. Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is the Lab Details panel with the following:

    • The Open Google Console button
    • Time remaining
    • The temporary credentials that you must use for this lab
    • Other information, if needed, to step through this lab
  2. Click Open Google Console. The lab spins up resources, and then opens another tab that shows the Sign in page.

    Tip: Arrange the tabs in separate windows, side-by-side.

    Note: If you see the Choose an account dialog, click Use Another Account.
  3. If necessary, copy the Username from the Lab Details panel and paste it into the Sign in dialog. Click Next.

  4. Copy the Password from the Lab Details panel and paste it into the Welcome dialog. Click Next.

    Important: You must use the credentials from the left panel. Do not use your Google Cloud Skills Boost credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.
  5. Click through the subsequent pages:

    • Accept the terms and conditions.
    • Do not add recovery options or two-factor authentication (because this is a temporary account).
    • Do not sign up for free trials.

After a few moments, the Cloud Console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking the Navigation menu at the top-left.

Create Storage Bucket

Create a bucket using the Cloud Console:

  1. Click on the Navigation menu, and select Cloud Storage.

  2. Click on Create bucket.

  3. Set a unique name (use your project ID because it is unique) and then choose a regional bucket, setting the region to us-central1. Then, click Create.
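
If you prefer to script this step rather than use the Console, a minimal sketch using the google-cloud-storage Python client library (for example, from Cloud Shell or the notebook terminal) is shown below; the project ID is a placeholder.

    from google.cloud import storage

    # Placeholder: replace with your lab project ID. Using the project ID as the
    # bucket name keeps the name globally unique, as suggested above.
    PROJECT_ID = "your-project-id"

    client = storage.Client(project=PROJECT_ID)

    # Create a regional bucket in us-central1 to match the lab instructions.
    bucket = client.create_bucket(PROJECT_ID, location="us-central1")
    print("Created bucket:", bucket.name)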

Click Check my progress to verify the task you performed. If you have completed the task successfully, you will be granted an assessment score.

Create Storage Bucket

Launch Vertex AI Notebooks instance

  1. In the Google Cloud Console, on the Navigation Menu, click Vertex AI > Workbench.

  2. On the Notebook instances page, click New Notebook > TensorFlow Enterprise > TensorFlow Enterprise 1.15 (with LTS) > Without GPUs.

  3. In the New notebook instance dialog, confirm the name of the deep learning VM, and then click Create. The new VM will take 2-3 minutes to start.

  4. Click Open JupyterLab. A JupyterLab window will open in a new tab.

Click Check my progress to verify the task you performed. If you have completed the task successfully, you will be granted an assessment score.

Create the AI Platform notebook instance

Download lab notebook

To clone the training-data-analyst repository in your JupyterLab instance:

  1. In JupyterLab, click the Terminal icon to open a new terminal.

  2. At the command-line prompt, type the following command and press Enter:

git clone https://github.com/GoogleCloudPlatform/training-data-analyst

  3. To confirm that you have cloned the repository, in the left panel, double-click the training-data-analyst folder to see its contents.

Click Check my progress to verify the task you performed. If you have completed the task successfully, you will be granted an assessment score.

Download lab notebook

Open and execute the housing prices notebook

In the JupyterLab file browser, navigate to training-data-analyst > blogs > housing_prices and open cloud-ml-housing-prices.ipynb to begin the lab. Now you're ready to start!

In the top ribbon, click Edit > Clear All Outputs.

In the top-right corner of the notebook, if the kernel is set to Python 2, click it and change it to Python 3.

From here, read the instructions in the notebook to complete the lab.

Execute the cells one by one and observe the results. A convenient way to progress through the cells is to click in a cell and press Shift + Enter, waiting for each cell to complete before moving on.

Read the instructions and the comments in the code blocks carefully. You will be asked to edit some of the code blocks before running them. For example, you will set environment variables in the notebook, so add your bucket name and project ID before running that cell.
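
As an illustration of what such a cell looks like, the sketch below sets values through os.environ; the variable names and values are placeholders, so use the exact names the notebook defines.

    import os

    # Placeholder values: substitute your own bucket name, project ID, and region.
    os.environ["BUCKET"] = "your-bucket-name"   # the bucket you created earlier
    os.environ["PROJECT"] = "your-project-id"   # the lab project ID
    os.environ["REGION"] = "us-central1"        # the region used in this lab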

Click Check my progress to verify the task you performed. If you have completed the task successfully, you will be granted an assessment score.

Train and deploy the Model for Predictions

Congratulations!

You have used TensorFlow's high-level Estimator API to run TensorFlow 1.x code for distributed training in the cloud, evaluated the results, and then deployed the model to the cloud for online prediction.
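
As a follow-up illustration (not one of the graded lab steps), once a model version is deployed on AI Platform you can request online predictions from Python with the Google API client library. The project ID, model name, and input fields below are placeholders and must match the model you actually deployed.

    from googleapiclient import discovery

    # Placeholders: substitute your project ID and the model name you deployed.
    PROJECT_ID = "your-project-id"
    MODEL_NAME = "housing_prices"

    service = discovery.build("ml", "v1")
    name = "projects/{}/models/{}".format(PROJECT_ID, MODEL_NAME)

    # Each instance must match the serving input signature of the exported model.
    response = service.projects().predict(
        name=name,
        body={"instances": [{"num_rooms": 3.0}]},
    ).execute()

    print(response.get("predictions", response.get("error")))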

Finish Your Quest

This self-paced lab is part of the Qwiklabs Data Engineering and Intermediate ML: TensorFlow on GCP Quests. A Quest is a series of related labs that form a learning path. Completing a Quest earns you a badge to recognize your achievement. You can make your badge (or badges) public and link to them in your online resume or social media account. Enroll in one of the above Quests and get immediate completion credit if you've taken this lab. See other available Qwiklabs Quests.

Take Your Next Lab

Continue your Quest with Cloud Composer: Copy BigQuery Tables Across Different Locations.

Next steps / learn more

Check out the official documentation for TensorFlow Estimators and AI Platform.

Check out this talk from Google Cloud Next '17 on advanced data science on Google Cloud (43:15).

Google Cloud training and certification

...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.

Manual Last Updated December 8, 2021
Lab Last Tested December 8, 2021

Copyright 2022 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.