TPUs are very fast, and the stream of training data must keep up with their training speed. In this lab, you will learn how to load data from Cloud Storage with the tf.data.Dataset API to feed your TPU.
Objectives
You will learn:
To use the tf.data.Dataset API to load training data.
To use TFRecord format to load training data efficiently from Cloud Storage.
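As a preview of the pattern this lab builds, here is a minimal, self-contained sketch that writes a few `tf.train.Example` records to a TFRecord file and streams them back through a `tf.data` pipeline. The file name `sample.tfrecord` and the feature name `value` are illustrative; in the lab the records live in a Cloud Storage bucket (`gs://...`).

```python
import tensorflow as tf

path = "sample.tfrecord"  # in the lab this would be a gs:// path

# Write a few tf.train.Example records to the TFRecord file.
with tf.io.TFRecordWriter(path) as writer:
    for i in range(3):
        example = tf.train.Example(features=tf.train.Features(feature={
            "value": tf.train.Feature(int64_list=tf.train.Int64List(value=[i])),
        }))
        writer.write(example.SerializeToString())

# Read them back as a tf.data pipeline.
def parse(record):
    return tf.io.parse_single_example(
        record, {"value": tf.io.FixedLenFeature([], tf.int64)})["value"]

dataset = (tf.data.TFRecordDataset(path)
           .map(parse, num_parallel_calls=tf.data.AUTOTUNE)
           .batch(2)
           .prefetch(tf.data.AUTOTUNE))

for batch in dataset:
    print(batch.numpy())
```

Because `TFRecordDataset` accepts `gs://` paths directly, the same pipeline works unchanged against Cloud Storage, which is what keeps a TPU fed at full speed.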
Setup and requirements
For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.
Sign in to Qwiklabs using an incognito window.
Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
There is no pause feature. You can restart if needed, but you have to start at the beginning.
When ready, click Start lab.
Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.
Click Open Google Console.
Click Use another account and copy/paste credentials for this lab into the prompts.
If you use other credentials, you'll receive errors or incur charges.
Accept the terms and skip the recovery resource page.
Task 1. Create and launch a Vertex AI Workbench notebook
To create and launch a Vertex AI Workbench notebook:
In the Navigation menu, click Vertex AI > Workbench.
On the User-Managed Notebooks page, click Enable Notebooks API (if it isn't enabled yet), then click Create New.
In the New instance menu, under Environment, select the latest version of TensorFlow Enterprise 2.11 (Intel® MKL-DNN/MKL).
Name the notebook.
Set Region to and Zone to .
Leave the remaining fields at their default and click Create.
After a few minutes, the Workbench page lists your instance name, followed by an Open JupyterLab link.
Click Open JupyterLab to open JupyterLab in a new tab. If you get a message saying beatrix jupyterlab needs to be included in the build, just ignore it.
Task 2. Clone course repo within your Vertex AI Notebooks instance
To clone the training-data-analyst notebook in your JupyterLab instance:
In JupyterLab, to open a new terminal, click the Terminal icon.
At the command-line prompt, run the following command:
To confirm that you have cloned the repository, double-click on the training-data-analyst directory and ensure that you can see its contents.
The files for all the Jupyter notebook-based labs throughout this course are available in this directory.
Task 3. TPU-Speed Data Pipelines: tf.data.Dataset and TFRecords
In the notebook interface, navigate to training-data-analyst > courses > machine_learning > deepdive2 > production_ml > labs, and open tpu_speed_data_pipelines.ipynb.
In the notebook interface, click Edit > Clear All Outputs.
Carefully read through the notebook instructions and fill in lines marked with #TODO where you need to complete the code.
Tip: To run the current cell, click the cell and press SHIFT+ENTER. Other cell commands are listed in the notebook UI under Run.
Hints may also be provided for the tasks to guide you. Highlight the text to read the hints, which are in white text.
To see the complete solution, navigate to training-data-analyst > courses > machine_learning > deepdive2 > production_ml > solutions, and open tpu_speed_data_pipelines.ipynb.
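The core TPU-feeding pattern the notebook walks through can be sketched as follows. This is a hedged, self-contained illustration: the shard names (`part-*.tfrecord`), feature name `x`, and batch size are made up for the example, and two tiny local shards are created so the sketch runs on its own; in the lab, the files are sharded objects under a `gs://` path.

```python
import tensorflow as tf

# Create two tiny shard files so the sketch is self-contained;
# in the lab these would be gs:// objects.
for shard in range(2):
    with tf.io.TFRecordWriter(f"part-{shard}.tfrecord") as w:
        for i in range(4):
            w.write(tf.train.Example(features=tf.train.Features(feature={
                "x": tf.train.Feature(
                    float_list=tf.train.FloatList(value=[float(i)])),
            })).SerializeToString())

# Interleave reads across shards so multiple files are consumed in
# parallel, then batch with a fixed size -- TPUs require static
# shapes, hence drop_remainder=True.
files = tf.data.Dataset.list_files("part-*.tfrecord", shuffle=False)
ds = files.interleave(tf.data.TFRecordDataset,
                      cycle_length=2,
                      num_parallel_calls=tf.data.AUTOTUNE)
ds = ds.map(lambda r: tf.io.parse_single_example(
        r, {"x": tf.io.FixedLenFeature([], tf.float32)})["x"],
    num_parallel_calls=tf.data.AUTOTUNE)
ds = ds.batch(4, drop_remainder=True).prefetch(tf.data.AUTOTUNE)
```

Parallel file reads plus `prefetch(tf.data.AUTOTUNE)` are what let the input pipeline keep pace with the TPU; the notebook measures the effect of each of these choices.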
End your lab
When you have completed your lab, click End Lab. Qwiklabs removes the resources you’ve used and cleans the account for you.
You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.
The number of stars indicates the following:
1 star = Very dissatisfied
2 stars = Dissatisfied
3 stars = Neutral
4 stars = Satisfied
5 stars = Very satisfied
You can close the dialog box if you don't want to provide feedback.
For feedback, suggestions, or corrections, please use the Support tab.
Copyright 2022 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.