
Build an LLM and RAG-based Chat Application with AlloyDB and Vertex AI


Lab · 1 hour 30 minutes · 5 Credits · Intermediate

Note: This lab may incorporate AI tools to support your learning.

Overview

One of the best tools for improving the quality of responses from large language models (LLMs) is retrieval-augmented generation (RAG). RAG is the pattern of retrieving relevant, often non-public data and using it to augment the prompt you send to the LLM. This allows the LLM to generate more accurate responses grounded in the data included in the prompt.

You'll use AlloyDB, Google Cloud's scalable and performant PostgreSQL-compatible database, to store and search a special kind of data called vector embeddings. Vector embeddings can be retrieved using semantic search, which finds the stored data that best matches a user's natural language query. The retrieved data is then passed to the LLM in the prompt.

To retrieve data from the database you'll use MCP Toolbox, a middleware server that exposes database operations as a set of tools. The agent connects to Toolbox to execute these tools. This provides a secure, scalable, and modular way to manage database interactions.

You'll also use Vertex AI, Google Cloud's fully-managed, unified AI development platform for building and using generative AI. Your application uses Gemini Pro, a multimodal foundation model that supports adding image, audio, video, and PDF files in text or chat prompts and supports long-context understanding.

agent app architecture
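The following is a minimal Python sketch of that end-to-end pattern, included only to make the flow concrete. In the lab itself, the retrieval step is handled by MCP Toolbox rather than direct SQL, and the project ID, model IDs, connection settings, and the amenities table and column names below are illustrative assumptions, not the demo's actual code.

    # Conceptual sketch of the RAG flow this lab builds; not the demo app's code.
    # Assumes a pgvector-enabled table amenities(content, embedding), Vertex AI
    # access, and a PostgreSQL driver such as psycopg2.
    import psycopg2
    import vertexai
    from vertexai.language_models import TextEmbeddingModel
    from vertexai.generative_models import GenerativeModel

    vertexai.init(project="my-project", location="us-central1")  # placeholder project

    def answer(question: str) -> str:
        # 1. Turn the user's question into a vector embedding.
        embedding = TextEmbeddingModel.from_pretrained("text-embedding-004") \
            .get_embeddings([question])[0].values

        # 2. Retrieve the closest rows with a pgvector nearest-neighbor search.
        with psycopg2.connect(host="INSTANCE_IP", dbname="assistantdemo",
                              user="postgres", password="PASSWORD") as conn:
            with conn.cursor() as cur:
                cur.execute(
                    "SELECT content FROM amenities "
                    "ORDER BY embedding <=> %s::vector LIMIT 3",
                    ("[" + ",".join(map(str, embedding)) + "]",),
                )
                context = "\n".join(row[0] for row in cur.fetchall())

        # 3. Augment the prompt with the retrieved data and ask Gemini.
        prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
        return GenerativeModel("gemini-2.0-flash").generate_content(prompt).text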

What you will learn

In this lab, you'll learn:

  • How RAG enhances LLM capabilities by retrieving relevant information from a knowledge base.
  • How AlloyDB can be used to find relevant information using semantic search.
  • How you can use Vertex AI and Google's foundation models to provide powerful generative AI capabilities to applications.

Setup and requirements

Before you click the Start Lab button

Note: Read these instructions.

Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.

This Qwiklabs hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.

What you need

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
  • Time to complete the lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab.

Note: If you are using a Pixelbook, open an Incognito window to run this lab.

How to start your lab and sign in to the Console

  1. Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is a panel populated with the temporary credentials that you must use for this lab.

    Credentials panel

  2. Copy the username, and then click Open Google Console. The lab spins up resources, and then opens another tab that shows the Choose an account page.

    Note: Open the tabs in separate windows, side-by-side.
  3. On the Choose an account page, click Use Another Account. The Sign in page opens.

    Choose an account dialog box with Use Another Account option highlighted

  4. Paste the username that you copied from the Connection Details panel. Then copy and paste the password.

Note: You must use the credentials from the Connection Details panel. Do not use your Google Cloud Skills Boost credentials. If you have your own Google Cloud account, do not use it for this lab (to avoid incurring charges).

  5. Click through the subsequent pages:
  • Accept the terms and conditions.
  • Do not add recovery options or two-factor authentication (because this is a temporary account).
  • Do not sign up for free trials.

After a few moments, the Cloud console opens in this tab.

Note: You can view the menu with a list of Google Cloud Products and Services by clicking the Navigation menu at the top-left.

Cloud Console Menu

Activate Google Cloud Shell

Google Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5GB home directory and runs on Google Cloud.

Google Cloud Shell provides command-line access to your Google Cloud resources.

  1. In Cloud console, on the top right toolbar, click the Open Cloud Shell button.

    Highlighted Cloud Shell icon

  2. Click Continue.

It takes a few moments to provision and connect to the environment. When you are connected, you are already authenticated, and the project is set to your PROJECT_ID. For example:

Project ID highlighted in the Cloud Shell Terminal

gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.

  • You can list the active account name with this command:
gcloud auth list

Output:

Credentialed accounts:
 - @.com (active)

Example output:

Credentialed accounts:
 - google1623327_student@qwiklabs.net

  • You can list the project ID with this command:
gcloud config list project

Output:

[core]
project =

Example output:

[core]
project = qwiklabs-gcp-44776a13dea667a6

Note: Full documentation of gcloud is available in the gcloud CLI overview guide.

Task 1. Initialize the environment

In this task, you install Python, MCP Toolbox, and the PostgreSQL client.

SSH to the app VM

A virtual machine (VM) has been created. This VM hosts the application and the MCP Toolbox.

  1. To connect to the VM, in Cloud Shell, run the following command:

    gcloud compute ssh app-vm --zone={{{project_0.default_zone | ZONE }}}

    If asked to authorize, click Authorize.

  2. For each question asked by the gcloud compute ssh command, press Enter or Return to accept the default input.

    After a short wait, you are logged into the VM.

Install Python and Git

  1. To install Python and Git, in the VM, run the following commands:

    sudo apt update
    sudo apt install -y python3.11-venv git
    python3 -m venv .venv
    source ~/.venv/bin/activate
    pip install --upgrade pip

    When the installation is complete, you are left in the virtual Python environment, with a (.venv) prompt.

    If the VM SSH session ever times out or the tab is closed, you can SSH into the VM again and use the command source ~/.venv/bin/activate to restart the virtual Python environment.

  2. To confirm the Python version, run the following command:

    python -V

    Your response should look similar to this:

    (.venv) student@app-vm:~$ python -V
    Python 3.11.2
    (.venv) student@app-vm:~$

Install the PostgreSQL client

  1. To install the PostgreSQL client, in the VM session, run the following commands:

    sudo apt install -y postgresql-client

    Note: The client may already be installed.

Download the demo and MCP Toolbox

  1. To clone the code for this demo, run the following commands:

    cd ~
    git clone https://github.com/GoogleCloudPlatform/cymbal-air-toolbox-demo.git
    cd cymbal-air-toolbox-demo
  2. To download the MCP Toolbox binary, run the following commands:

    export MCP_TOOLBOX_VERSION="{{{project_0.startup_script.gcp_mcp_toolbox_version | TOOLBOX_VERSION}}}"
    curl -O https://storage.googleapis.com/genai-toolbox/v$MCP_TOOLBOX_VERSION/linux/amd64/toolbox
    chmod +x toolbox

Task 2. Create the vector database

In this task, you use the PostgreSQL client to create the vector database.

Create the vector database

An AlloyDB instance has already been created.

  1. To create a new database, run the following command:

    export PROJECT_ID=$(gcloud config get-value project)
    export REGION={{{project_0.default_region | REGION }}}
    export ADBCLUSTER={{{project_0.startup_script.gcp_alloydb_cluster_name | CLUSTER}}}
    export ADBINSTANCE={{{project_0.startup_script.gcp_alloydb_primary_instance | INSTANCE}}}
    export INSTANCE_IP=$(gcloud alloydb instances describe $ADBINSTANCE --cluster=$ADBCLUSTER --region=$REGION --format="value(ipAddress)")
    export PGUSER={{{project_0.startup_script.gcp_alloydb_user | PG_USER}}}
    export PGPASSWORD={{{project_0.startup_script.gcp_alloydb_password | PG_PASSWORD}}}
    export PGDATABASE={{{project_0.startup_script.gcp_alloydb_database | DATABASE}}}
    psql "host=$INSTANCE_IP user=$PGUSER dbname=postgres" -c "CREATE DATABASE $PGDATABASE"

    psql responds with CREATE DATABASE.

    To enable the database to support semantic searches, the entities should be represented by vector embeddings.

  2. To enable vector embeddings in this database, run the following command:

    psql "host=$INSTANCE_IP user=$PGUSER dbname=$PGDATABASE" -c "CREATE EXTENSION vector"

    psql responds with CREATE EXTENSION.
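
    To see what the extension provides, here is a small, throwaway Python sketch (not part of the lab steps). It uses the environment variables you exported above, a made-up table, and a 3-dimensional vector purely for readability, whereas the lab's real tables use 768-dimensional embeddings. It requires a PostgreSQL driver such as psycopg2, which the lab does not install.

    # Toy illustration of the vector extension: a `vector` column plus a
    # distance operator for nearest-neighbor search. Table and data are made up.
    import os
    import psycopg2

    conn = psycopg2.connect(host=os.environ["INSTANCE_IP"],
                            dbname=os.environ["PGDATABASE"],
                            user=os.environ["PGUSER"],
                            password=os.environ["PGPASSWORD"])
    with conn, conn.cursor() as cur:
        cur.execute("CREATE TABLE IF NOT EXISTS vector_demo (id int, embedding vector(3))")
        cur.execute("INSERT INTO vector_demo VALUES (1, '[1,0,0]'), (2, '[0,1,0]')")
        # <=> is pgvector's cosine-distance operator; smaller means more similar.
        cur.execute("SELECT id FROM vector_demo ORDER BY embedding <=> '[0.9,0.1,0]' LIMIT 1")
        print(cur.fetchone())  # (1,)
    conn.close()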

Click Check my progress to verify the objective.

Create the AlloyDB database and enable the vector extension.

Task 3. Populate the example database

In this task, you use MCP Toolbox to populate the vector database in AlloyDB with sample data. This data is used for the chat application.

Examine the data models

  1. To see the data model, run the following command:

    cd ~/cymbal-air-toolbox-demo
    cat models/models.py

    The Python data models are shown here. The model includes airports, flights, amenities within the terminals, policies, and tickets.

  2. To see an example of the airport data, run the following commands:

    head -1 data/airport_dataset.csv; grep SFO data/airport_dataset.csv

    These commands show the CSV header that specifies the column names for the airport dataset, followed by the row for San Francisco International Airport (SFO). The data in the airport model can be retrieved by International Air Transport Association (IATA) code, or by country, city, and airport name. Keyword search is sufficient to find rows in this table, so there are no vector embeddings for this data.

  3. To see an example of the flight data, run the following commands:

    head -1 data/flights_dataset.csv; grep -m10 "SFO" data/flights_dataset.csv

    These commands show the CSV header that specifies the column names for the flights dataset followed by the first 10 rows of flights to or from SFO. The data in the flights model can be retrieved based on the airline and flight number, or by the departure and arrival airport codes.

  4. To see an example of the amenities data, run the following command:

    head -2 data/amenity_dataset.csv

    This command shows the CSV header that specifies the column names for the amenities dataset followed by the first amenity.

    You'll notice that the first amenity has several simple values, including name, description, location, terminal, category, and business hours. The next value is content, which incorporates the name, description, and location. The last value is embedding, the vector embedding for the row.

    The embedding is an array of 768 numbers that is used when performing a semantic search. These embeddings are calculated using an AI model provided by Vertex AI. When a user provides a query, a vector embedding is created from the query, and the rows whose embeddings are closest to the query's embedding can be retrieved.

    The policy data also uses vector embeddings in a similar fashion.

    Note: The calculation of embeddings takes a while, so the embeddings have already been provided. You can examine the run_generate_embeddings.py script to see how embeddings are generated; a simplified sketch of that step follows below.
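
    The sketch below is illustrative only; the model ID, batch size, and pandas-based approach are assumptions rather than the script's exact implementation.

    # Illustrative only: how 768-dimensional embeddings for the amenities CSV
    # could be generated. See data/run_generate_embeddings.py for the real script.
    import pandas as pd
    import vertexai
    from vertexai.language_models import TextEmbeddingModel

    vertexai.init(project="my-project", location="us-central1")  # placeholder project
    model = TextEmbeddingModel.from_pretrained("text-embedding-004")  # 768-dim output

    df = pd.read_csv("data/amenity_dataset.csv")
    embeddings = []
    for start in range(0, len(df), 5):  # embed the content column in small batches
        batch = df["content"][start:start + 5].tolist()
        embeddings.extend(e.values for e in model.get_embeddings(batch))
    df["embedding"] = embeddings  # each value is a list of 768 floats
    df.to_csv("amenity_dataset_with_embeddings.csv", index=False)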

Start the toolbox

  1. To configure the required environment variables for MCP Toolbox, run the following commands:

    export ALLOYDB_POSTGRES_PROJECT=$(gcloud config get-value project)
    export ALLOYDB_POSTGRES_REGION={{{project_0.default_region | REGION }}}
    export ALLOYDB_POSTGRES_CLUSTER={{{project_0.startup_script.gcp_alloydb_cluster_name | CLUSTER}}}
    export ALLOYDB_POSTGRES_INSTANCE={{{project_0.startup_script.gcp_alloydb_primary_instance | PRIMARY_INSTANCE}}}
    export ALLOYDB_POSTGRES_DATABASE={{{project_0.startup_script.gcp_alloydb_database | DATABASE}}}
    export ALLOYDB_POSTGRES_USER={{{project_0.startup_script.gcp_alloydb_user | PG_USER}}}
    export ALLOYDB_POSTGRES_PASSWORD={{{project_0.startup_script.gcp_alloydb_password | PG_PASSWORD}}}
    export ALLOYDB_POSTGRES_IP_TYPE=private

    These MCP Toolbox environment variables are similar to those that were used for the PostgreSQL client.

    The next step is to run the MCP Toolbox.

  2. To run MCP Toolbox in the background for data initialization, run the following command:

    cd ~/cymbal-air-toolbox-demo
    ./toolbox --prebuilt alloydb-postgres &

    The ampersand (&) causes toolbox to run in the background, so you can perform the database initialization in the same SSH session.

Populate the database

  1. To install the Python dependencies of the demo app, run the following command:

    source ~/.venv/bin/activate
    pip install -r requirements.txt
  2. To run the database script, run the following commands:

    export PYTHONPATH=~/cymbal-air-toolbox-demo
    python data/run_database_init.py

    The first command adds the repository to the Python module search path, and the init script populates the database.

  3. To verify the data that was populated, run the following command:

    psql "host=$INSTANCE_IP user=$PGUSER dbname=$PGDATABASE" -c "SELECT COUNT(*) FROM airports"

    The PostgreSQL command returns the number of airports in the airports table.

  4. To end the background Toolbox process, run the following command:

    pkill toolbox
Populate the database with the sample dataset.

Task 4. Create a service account for the MCP toolbox

In this task, you create a service account for the MCP toolbox.

For the chat app, the MCP toolbox is responsible for extracting relevant information from the database based on requests from the AI application. This service account is used as the identity of the MCP toolbox service running in Cloud Run.

Create service account

The SSH user on the VM does not have permission to grant the service account the required roles on the project. You create the service account using a new Cloud Shell tab.

  1. In Cloud Shell, to open a new Cloud Shell tab, click Open a new tab (+).

  2. To create a service account, in the new tab, run the following command:

    gcloud iam service-accounts create toolbox-identity

    This service account is created.

    Note: If an error is returned that you do not currently have an active account selected, this may just be a delay in propagation. Run the command again.
  3. To grant the service account the necessary privileges, run the following command:

    export PROJECT_ID=$(gcloud config get-value project)
    gcloud projects add-iam-policy-binding $PROJECT_ID \
      --member="serviceAccount:toolbox-identity@$PROJECT_ID.iam.gserviceaccount.com" \
      --role="roles/alloydb.client"
    gcloud projects add-iam-policy-binding $PROJECT_ID \
      --member="serviceAccount:toolbox-identity@$PROJECT_ID.iam.gserviceaccount.com" \
      --role="roles/serviceusage.serviceUsageConsumer"
    gcloud projects add-iam-policy-binding $PROJECT_ID \
      --member="serviceAccount:toolbox-identity@$PROJECT_ID.iam.gserviceaccount.com" \
      --role="roles/secretmanager.secretAccessor"

    This service account is granted the following roles:

    • roles/alloydb.client allows the toolbox to access AlloyDB databases.
    • roles/serviceusage.serviceUsageConsumer allows the toolbox to consume services.
    • roles/secretmanager.secretAccessor allows the toolbox to retrieve secrets stored in Secret Manager.
  4. To close the new tab, run the following command:

    exit
Create the service account toolbox-identity.

Task 5. Register the OAuth consent screen

In this task, you register the OAuth consent screen that is presented to users who are logging in.

When you use OAuth 2.0 for authorization, Google displays a consent screen to capture the user's consent to share data with the application.

  1. In the Google Cloud console, select the Navigation menu (Navigation menu icon), and then select APIs & Services > OAuth consent screen.

  2. Click Get Started.

  3. For App name, enter Cymbal Air.

  4. Click User support email, then click the student email, and then click Next.

  5. For Audience, select Internal, and then click Next.

    Users with access to the project should be able to log in to the app.

  6. On the left panel of the lab instructions, copy the Username.

    Copy username

  7. For Contact information, paste the copied username.

  8. Click Next.

  9. Select the checkbox to agree to the User Data Policy, then click Continue, and then click Create.

    The consent screen is now set up.

Task 6. Create a client ID for the application

In this task, you create a client ID for the application.

The application requires a client ID to use Google's OAuth service. You configure the allowed origins that can make this request, and a redirect URI where the web app is redirected after the user has consented to log in.

  1. In the Google Cloud console, select the Navigation menu (Navigation menu icon), and then select APIs & Services > Credentials.

  2. Click + Create Credentials, and then click OAuth client ID.

    A client ID is used to identify a single app to Google's OAuth servers.

  3. For Application type, select Web application.

  4. For Name, enter Cymbal Air.

    You can generate the JavaScript origin and redirect URI using Cloud Shell.

  5. In Cloud Shell, to open a new Cloud Shell tab, click Open a new tab (+).

  6. To get the origin and redirect URI, in the new tab, run the following commands:

    echo "origin:"; echo "https://8080-$WEB_HOST"; echo "redirect:"; echo "https://8080-$WEB_HOST/login/google"
  7. For Authorized JavaScript origins, click + Add URI.

    Note: Select the Add URI button under Authorized Javascript origins, not under Authorized redirect URIs.
  8. Copy the origin URI that was created by the echo command, and then, for URIs 1, paste in the URI.

  9. For Authorized redirect URIs, click + Add URI.

    Note: This is the second Add URI button, under Authorized redirect URIs.
  10. Copy the redirect URI that was created by the echo command, and then, for URIs 1, paste in the URI.

  11. To create the environment variable, switch to the VM SSH Cloud Shell tab, and then paste the following command without pressing Enter:

    export CLIENT_ID=

    Note: Make sure that you are creating the CLIENT_ID environment variable inside the virtual machine session.
  12. In the Credentials window, click Create.

    The client ID and client secret are created. For this test application, you only use the client ID.

  13. Click Copy client ID (Copy client ID icon).

    The client ID is copied to the clipboard.

    Note: The client ID can also be copied from the Credentials page.
  14. In the VM SSH Cloud Shell tab, paste the client ID, and then press Enter.

    The export should look similar to this:

    export CLIENT_ID=937631684809-q7hs2r191jbks7f7dopih2uafuknb92h.apps.googleusercontent.com
Create a client ID for the application.

Task 7. Deploy the toolbox to Cloud Run

In this task, you deploy MCP toolbox to Cloud Run.

Build the configuration file for the toolbox

A sample tools.yaml file is provided for the toolbox, but several settings need to be modified.

  1. To see the configuration settings, run the following command:

    head -20 ~/cymbal-air-toolbox-demo/tools.yaml

    This YAML file contains configuration settings for the database and for OAuth.

    It should look similar to this:

    sources:
      my-pg-instance:
        kind: alloydb-postgres
        project: retrieval-app-testing
        region: us-central1
        cluster: my-alloydb-cluster
        instance: my-alloydb-instance
        database: assistantdemo
        user: postgres
        password: postgres
    authServices:
      my_google_service:
        kind: google
        clientId: 706535509072-qa5v22ur8ik8o513b0538ufo0ne9jfn5.apps.googleusercontent.com
  2. To update the settings to match your environment, run the following commands:

    export PROJECT="$(gcloud config get-value project)" export REGION="{{{project_0.default_region | REGION }}}" export CLUSTER="{{{project_0.startup_script.gcp_alloydb_cluster_name | CLUSTER}}}" export INSTANCE="{{{project_0.startup_script.gcp_alloydb_primary_instance | PRIMARY_INSTANCE}}}" export DATABASE="{{{project_0.startup_script.gcp_alloydb_database | DATABASE}}}" export USER="{{{project_0.startup_script.gcp_alloydb_user | PG_USER}}}" export PASSWORD="{{{project_0.startup_script.gcp_alloydb_password | PG_PASSWORD}}}" export IP_TYPE="private" sed \ -e "s/^\( *project:\).*$/\1 $PROJECT/" \ -e "s/^\( *region:\).*$/\1 $REGION/" \ -e "s/^\( *cluster:\).*$/\1 $CLUSTER/" \ -e "s/^\( *instance:\).*$/\1 $INSTANCE/" \ -e "s/^\( *database:\).*$/\1 $DATABASE/" \ -e "s/^\( *user:\).*$/\1 $USER/" \ -e "s/^\( *password:\).*$/\1 $PASSWORD\\n ipType: $IP_TYPE/" \ -e "s/^\( *clientId:\).*$/\1 $CLIENT_ID/" \ ~/cymbal-air-toolbox-demo/tools.yaml > ~/tools.yaml

    These commands use sed to substitute your settings into the example tools.yaml file and write the result to ~/tools.yaml. The ipType line is also added below the password line, because the default IP type is public.

  3. To see the file updates, run the following command:

    head -20 ~/tools.yaml

    The top of the settings file now contains your values.

    Sensitive settings (especially the database password and the client ID) should be secured as secrets.

Create a secret for the tools.yaml file

Instead of checking sensitive information into a code repository or as part of the running app, Secret Manager can be used to secure your secrets.

  1. To create a secret, run the following command:

    cd ~
    gcloud secrets create tools --data-file=tools.yaml
  2. To validate that the secret was stored, run the following command:

    gcloud secrets describe tools

    You should now see that there is a secret named tools.
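
    In the next section, Cloud Run mounts this secret into the toolbox container as a file. An application can also read a secret programmatically; here is a minimal sketch using the Secret Manager Python client (the google-cloud-secret-manager package), assuming a PROJECT_ID environment variable and credentials with the secretAccessor role:

    # Minimal sketch: read the latest version of the `tools` secret at runtime.
    # Requires the google-cloud-secret-manager package and suitable credentials.
    import os
    from google.cloud import secretmanager

    client = secretmanager.SecretManagerServiceClient()
    name = f"projects/{os.environ['PROJECT_ID']}/secrets/tools/versions/latest"
    response = client.access_secret_version(request={"name": name})
    print(response.payload.data.decode("utf-8"))  # the tools.yaml contents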

Deploy the toolbox to Cloud Run

  1. To deploy the toolbox to Cloud Run, in the VM SSH Cloud Shell tab, run the following commands:

    export REGION={{{project_0.default_region | REGION }}}
    export MCP_TOOLBOX_VERSION="{{{project_0.startup_script.gcp_mcp_toolbox_version | TOOLBOX_VERSION}}}"
    export IMAGE="us-central1-docker.pkg.dev/database-toolbox/toolbox/toolbox:$MCP_TOOLBOX_VERSION"
    gcloud run deploy toolbox \
      --image $IMAGE \
      --service-account toolbox-identity \
      --region $REGION \
      --set-secrets "/app/tools.yaml=tools:latest" \
      --args="--tools-file=/app/tools.yaml","--address=0.0.0.0","--port=8080" \
      --network default \
      --subnet default \
      --no-allow-unauthenticated \
      --quiet

    Wait until the deployment completes.

  2. To verify the service, run the following command:

    curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" $(gcloud run services list --filter="(toolbox)" --format="value(URL)")

    If you see the "Hello, World!" message, the service is up and serving requests.

Deploy the toolbox service.

Task 8. Run the sample application

In this task, you run a sample chat application that uses the retrieval service.

Run the application

  1. To return to the root of the chat application, in the VM SSH Cloud Shell tab, run the following commands:

    source ~/.venv/bin/activate
    cd ~/cymbal-air-toolbox-demo

    Before starting the application, you need to set up some environment variables. The basic functionality of the application, including querying flights and returning airport amenities, requires an environment variable named TOOLBOX_URL to contain the URL of the toolbox service running on Cloud Run.

  2. To specify the URL of the toolbox service, run the following commands:

    export TOOLBOX_URL=$(gcloud run services list --filter="(toolbox)" --format="value(URL)")
    echo $TOOLBOX_URL

    The toolbox URL is used by the local application to access databases through MCP Toolbox.

  3. To run the application, run the following command:

    python run_app.py

    Your response should look similar to this:

    (.venv) student-03-d87d6b142a95@app-vm:~/cymbal-air-toolbox-demo$ python run_app.py
    INFO: Started server process [26127]
    INFO: Waiting for application startup.
    Loading application...
    INFO: Application startup complete.
    INFO: Uvicorn running on http://0.0.0.0:8081 (Press CTRL+C to quit)

    The application is now running.

Connect to the VM

You have several ways to connect to the application running on the VM. For example, you could open port 8081 on the VM using firewall rules in the VPC, or create a load balancer with a public IP. Here you use an SSH tunnel to the VM, forwarding Cloud Shell port 8080 to port 8081 on the VM.

  1. In Cloud Shell, to open a new Cloud Shell tab, click Open a new tab (+).

  2. To create an SSH tunnel to the VM port, in the new tab, run the following command:

    gcloud compute ssh app-vm --zone={{{project_0.default_zone | ZONE }}} -- -L localhost:8080:localhost:8081

    The gcloud command connects localhost port 8080 in Cloud Shell with port 8081 on the VM. You can ignore the error "Cannot assign requested address."

  3. To run the application in the web browser, click Web Preview, and then select Preview on port 8080.

    Web Preview on port 8080

    A new tab is opened in the browser, and the application is running. The Cymbal Air application prompts "Welcome to Cymbal Air! How may I assist you?"

  4. Enter the following query:

    When is the next flight to Los Angeles?

    The application responds with the next flight from SFO to LAX, or asks you for clarifying information.

  5. Enter the following query:

    For that flight, which restaurants are near the departure gate?

    The chat app may ask you to clarify which flight, but the app can understand the context and respond with restaurants near the departure gate in SFO.

Task 9. Log in to the application (optional)

In this task, you log into the application to book the flight.

  1. Click Sign in.

    A pop-up window opens.

  2. In the pop-up window, select the student.

  3. To allow Cymbal Air to access the info about the student user, click Continue.

    The student account is logged in.

  4. Enter the following query:

    Please book that flight.

    The application presents the flight that was being discussed.

  5. Click Looks good to me. Book it.

    The flight is booked.

  6. Enter the following query:

    Which flights have I booked?

    The flight you just booked is shown.

    The chat app can help answer user questions like:

    • Are there any luxury shops around gate D50?
    • Where can I get coffee near gate A6?

    The application uses the latest Google foundation models to generate responses and augment them with information about flights and amenities from the operational AlloyDB database. You can read more about this demo application on the GitHub page of the project.

Congratulations!

You've successfully built a chat application that leverages large language models (LLMs) and retrieval augmented generation (RAG) to create engaging and informative conversations.

Next steps/learn more

End your lab

When you have completed your lab, click End Lab. Google Cloud Skills Boost removes the resources you’ve used and cleans the account for you.

You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.

The number of stars indicates the following:

  • 1 star = Very dissatisfied
  • 2 stars = Dissatisfied
  • 3 stars = Neutral
  • 4 stars = Satisfied
  • 5 stars = Very satisfied

You can close the dialog box if you don't want to provide feedback.

For feedback, suggestions, or corrections, please use the Support tab.

Manual Last Updated October 13, 2025

Lab Last Tested October 13, 2025

Copyright 2025 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.
