TPUs are very fast, and the stream of training data must keep up with their training speed. In this lab, you will learn how to load data from Cloud Storage with the tf.data.Dataset API to feed your TPU.
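For orientation, the core pattern you will build looks like the sketch below. This is a minimal, hypothetical example — the bucket path, feature names, and batch size are placeholders, not the lab's actual values — of streaming TFRecord shards from Cloud Storage through a tf.data.Dataset:

import tensorflow as tf

# Hypothetical parsing function: decodes one serialized tf.train.Example.
# The feature names and types here are placeholders, not the lab's schema.
def parse_fn(serialized_example):
    features = tf.io.parse_single_example(
        serialized_example,
        {
            "image": tf.io.FixedLenFeature([], tf.string),
            "label": tf.io.FixedLenFeature([], tf.int64),
        },
    )
    image = tf.io.decode_jpeg(features["image"], channels=3)
    return image, features["label"]

# List TFRecord shards in a (hypothetical) Cloud Storage bucket.
filenames = tf.io.gfile.glob("gs://your-bucket/data/*.tfrec")

dataset = (
    tf.data.TFRecordDataset(filenames, num_parallel_reads=tf.data.AUTOTUNE)
    .map(parse_fn, num_parallel_calls=tf.data.AUTOTUNE)
    .batch(128, drop_remainder=True)   # TPUs require fixed batch shapes
    .prefetch(tf.data.AUTOTUNE)        # overlap data loading with training
)

The notebook in this lab builds this pipeline step by step and measures its throughput.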
Objectives
You will learn:
To use the tf.data.Dataset API to load training data.
To use TFRecord format to load training data efficiently from Cloud Storage.
Setup and requirements
For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.
Sign in to Qwiklabs using an incognito window. This avoids conflicts between your personal account and the temporary student account, which could result in extra charges billed to your personal account.
Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
There is no pause feature. You can restart if needed, but you have to start at the beginning.
When ready, click Start lab.
Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.
Click Open Google Console.
Click Use another account and copy/paste credentials for this lab into the prompts.
If you use other credentials, you'll receive errors or incur charges.
Accept the terms and skip the recovery resource page.
Task 1. Launch a Vertex AI Workbench notebook
To create and launch a Vertex AI Workbench notebook:
In the Navigation menu, click Vertex AI > Workbench.
On the User-Managed Notebooks page, click Enable Notebooks API (if it isn't enabled yet), then click Create New.
In the New instance menu, under Environment, choose the latest version of TensorFlow Enterprise 2.11 (Intel® MKL-DNN/MKL).
Name the notebook.
Set Region and Zone to the values specified for your lab.
Leave the remaining fields at their default and click Create.
After a few minutes, the Workbench page lists your instance, with an Open JupyterLab link next to it.
Click Open JupyterLab to open JupyterLab in a new tab. If you get a message saying beatrix jupyterlab needs to be included in the build, just ignore it.
Task 2. Clone course repo within your Vertex AI Notebooks instance
To clone the training-data-analyst notebook in your JupyterLab instance:
In JupyterLab, to open a new terminal, click the Terminal icon.
At the command-line prompt, run the following command:
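git clone https://github.com/GoogleCloudPlatform/training-data-analyst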
To confirm that you have cloned the repository, double-click on the training-data-analyst directory and ensure that you can see its contents.
The files for all the Jupyter notebook-based labs throughout this course are available in this directory.
Task 3. TPU-Speed Data Pipelines: tf.data.Dataset and TFRecords
In the notebook interface, navigate to training-data-analyst > courses > machine_learning > deepdive2 > production_ml > labs, and open tpu_speed_data_pipelines.ipynb.
In the notebook interface, click Edit > Clear All Outputs.
Carefully read through the notebook instructions and complete the code in the cells marked with #TODO.
Tip: To run the current cell, click the cell and press SHIFT+ENTER. Other cell commands are listed in the notebook UI under Run.
Hints may also be provided to guide you through the tasks; they are written in white text, so highlight them to read them.
To see the complete solution, navigate to training-data-analyst > courses > machine_learning > deepdive2 > production_ml > solutions, and open tpu_speed_data_pipelines.ipynb.
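For reference, writing TFRecords is the mirror image of reading them. The sketch below is a minimal, hypothetical example — the helper name, feature names, and dummy records are illustrative, not the notebook's code — of serializing examples into a TFRecord shard with tf.io.TFRecordWriter:

import tensorflow as tf

# Hypothetical helper: wraps raw image bytes and an integer label
# in a tf.train.Example protocol buffer.
def to_example(image_bytes, label):
    return tf.train.Example(features=tf.train.Features(feature={
        "image": tf.train.Feature(bytes_list=tf.train.BytesList(value=[image_bytes])),
        "label": tf.train.Feature(int64_list=tf.train.Int64List(value=[label])),
    }))

# Write two dummy records to a single TFRecord shard.
with tf.io.TFRecordWriter("shard-00000.tfrec") as writer:
    for image_bytes, label in [(b"\x00" * 16, 0), (b"\xff" * 16, 1)]:
        writer.write(to_example(image_bytes, label).SerializeToString())

The solution notebook referenced above contains the lab's actual implementation, including how a large dataset is split across many such shards so reads can be parallelized.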
End your lab
When you have completed your lab, click End Lab. Qwiklabs removes the resources you’ve used and cleans the account for you.
You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.
The number of stars indicates the following:
1 star = Very dissatisfied
2 stars = Dissatisfied
3 stars = Neutral
4 stars = Satisfied
5 stars = Very satisfied
You can close the dialog box if you don't want to provide feedback.
For feedback, suggestions, or corrections, please use the Support tab.
Copyright 2022 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.