
Building Applications with Eventarc on Google Cloud


Building Eventarc Receivers

Lab · 1 hour 30 minutes · 5 Credits · Intermediate
Note: This lab may incorporate AI tools to support your learning.

Overview

Serverless computing on Google Cloud lets you develop and deploy highly scalable applications on a fully managed serverless platform. Services are automatically scaled up and down depending on traffic.

Eventarc lets you build event-driven architectures without having to implement, customize, or maintain the underlying infrastructure. Eventarc offers a standardized solution to manage the flow of state changes, called events, between decoupled services.

An Eventarc event receiver service is a service designed to consume Eventarc events. Eventarc delivers events formatted using the CloudEvents format.

In this lab, you create a Cloud Functions service that is triggered when a file is copied into a specific Cloud Storage bucket. You containerize the service and deploy it to Google Kubernetes Engine (GKE). Finally, you modify the service to directly use a CloudEvents SDK and Google Cloud type library to consume the event.

What you will learn

In this lab, you will learn to:

  • Deploy an event receiver service to Cloud Functions 2nd generation.
  • Containerize and deploy an event receiver service to GKE.
  • Create an event receiver by using a CloudEvents SDK.

Setup

For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.

  1. Sign in to Qwiklabs using an incognito window.

  2. Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
    There is no pause feature. You can restart if needed, but you have to start at the beginning.

  3. When ready, click Start lab.

  4. Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.

  5. Click Open Google Console.

  6. Click Use another account and copy/paste credentials for this lab into the prompts.
    If you use other credentials, you'll receive errors or incur charges.

  7. Accept the terms and skip the recovery resource page.

Activate Google Cloud Shell

Google Cloud Shell is a virtual machine loaded with development tools. It offers a persistent 5 GB home directory and runs on Google Cloud.

Google Cloud Shell provides command-line access to your Google Cloud resources.

  1. In Cloud console, on the top right toolbar, click the Open Cloud Shell button.

  2. Click Continue.

It takes a few moments to provision and connect to the environment. When you are connected, you are already authenticated, and the project is set to your PROJECT_ID.

gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.

  • You can list the active account name with this command:

    gcloud auth list

    Output:

    Credentialed accounts:
     - <myaccount>@<mydomain>.com (active)

    Example output:

    Credentialed accounts:
     - google1623327_student@qwiklabs.net

  • You can list the project ID with this command:

    gcloud config list project

    Output:

    [core]
    project = <project_ID>

    Example output:

    [core]
    project = qwiklabs-gcp-44776a13dea667a6

Note: Full documentation of gcloud is available in the gcloud CLI overview guide.

Task 1. Configure prerequisites

In this task, you enable APIs that are required for the event-driven architecture for this lab. You create the cloud storage bucket that will be used by the Eventarc triggers. Finally, you create a service account that will be used for the Eventarc triggers and modify service account permissions.

Enable APIs

  1. To enable the required APIs, in Cloud Shell, run the following command:

    gcloud services enable \
      pubsub.googleapis.com \
      logging.googleapis.com \
      eventarc.googleapis.com \
      cloudfunctions.googleapis.com \
      run.googleapis.com \
      container.googleapis.com \
      cloudbuild.googleapis.com \
      artifactregistry.googleapis.com

    This application uses several Google Cloud services, and you must enable each of the APIs for these services.

    The APIs being enabled are:

    • The Pub/Sub API manages Pub/Sub topics and subscriptions and publishes Pub/Sub messages. Pub/Sub is used to manage the event transport in Eventarc. In this lab, a Pub/Sub topic is used by Eventarc to transport events.
    • The Cloud Logging API writes log entries and manages Cloud Logging configuration. Cloud Logging log entries are used by Eventarc to generate many types of events, including IAM events.
    • The Eventarc API manages Eventarc configuration. In this lab, Eventarc is used to create triggers that deliver events to the event destination service.
    • The Cloud Functions API creates and manages Cloud Functions services. In this lab, a Cloud Functions service will act as an event receiver.
    • The Cloud Run API creates and manages Cloud Run services. Cloud Functions 2nd generation services run in Cloud Run.
    • The Kubernetes Engine API builds and manages container-based applications. In this lab, an event receiver service will be deployed to GKE.
    • The Cloud Build API manages application builds.
    • The Artifact Registry API manages build artifacts and registries.
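
    If you want to confirm that the APIs are enabled before continuing, you can list the project's enabled services. This is an optional check, not one of the lab's graded steps, and the grep pattern below is just an illustrative filter:

    gcloud services list --enabled | grep -E 'eventarc|run|functions|container'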

Create Cloud Storage bucket

Uploading a file to a Cloud Storage bucket will cause events to be triggered in this lab.

  1. To create the bucket, run the following command:

    gcloud storage buckets create gs://${GOOGLE_CLOUD_PROJECT}-bucket \
      --location={{{ project_0.default_region | REGION_PLACEHOLDER }}}

    This command creates a Cloud Storage bucket in the lab's default region.
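
    If you want to confirm that the bucket was created, you can list the buckets in the project. This is an optional check, not a graded lab step:

    gcloud storage ls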

Create the trigger service account and add roles

  1. To create the service account for the trigger, in Cloud Shell, run the following command:

    export FUNC_TRIGGER_SA=bucket-trigger-func-sa
    gcloud iam service-accounts create ${FUNC_TRIGGER_SA}
  2. To enable the trigger to receive Eventarc events, run the following command:

    export FUNC_TRIGGER_SA=bucket-trigger-func-sa
    gcloud projects add-iam-policy-binding ${GOOGLE_CLOUD_PROJECT} \
      --member=serviceAccount:${FUNC_TRIGGER_SA}@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com \
      --role='roles/eventarc.eventReceiver'

Grant Cloud Storage permission to create events

Services that generate direct events must have the pubsub.publisher role.

  1. To grant the Cloud Storage service account permission to create events, run the following command:

    export CLOUD_STORAGE_SA="$(gsutil kms serviceaccount -p ${GOOGLE_CLOUD_PROJECT})"
    gcloud projects add-iam-policy-binding ${GOOGLE_CLOUD_PROJECT} \
      --member="serviceAccount:${CLOUD_STORAGE_SA}" \
      --role="roles/pubsub.publisher"

Grant Pub/Sub permission to create tokens

  1. To grant the Pub/Sub service account permission to create tokens, run the following commands:

    export PROJECT_NUMBER=$(gcloud projects describe "$GOOGLE_CLOUD_PROJECT" \
      --format "value(projectNumber)")
    export PUBSUB_SA="service-${PROJECT_NUMBER}@gcp-sa-pubsub.iam.gserviceaccount.com"
    gcloud beta services identity create --project ${GOOGLE_CLOUD_PROJECT} --service pubsub
    gcloud projects add-iam-policy-binding ${GOOGLE_CLOUD_PROJECT} \
      --member="serviceAccount:${PUBSUB_SA}" \
      --role='roles/iam.serviceAccountTokenCreator'

    The gcloud beta services identity create command immediately creates the Pub/Sub service account, which is otherwise created later when Pub/Sub is actually used.

    The gcloud projects add-iam-policy-binding command grants the Pub/Sub service account the permission to create tokens. Tokens will be used to call your authenticated event destination services.
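
    If you want to verify the binding, one optional way (assuming the PUBSUB_SA variable from the step above is still set) is to read back the project's IAM policy, filtered to the Pub/Sub service agent:

    gcloud projects get-iam-policy ${GOOGLE_CLOUD_PROJECT} \
      --flatten="bindings[].members" \
      --filter="bindings.members:${PUBSUB_SA}" \
      --format="table(bindings.role)"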

Click Check my progress to verify the objective: Configure prerequisites.

Task 2. Create a Cloud Functions event receiver service

In this task, you create a Cloud Functions 2nd generation event receiver service.

Cloud Functions 2nd generation lets you run your code without managing containers or servers. Code using the 2nd generation version of Cloud Functions runs on Cloud Run.

Create the service

  1. In the Google Cloud console, navigate to Cloud Run Functions.

    You are redirected to Cloud Run in the console.

  2. Click Write a Function.

  3. For Service Name, type my-function.

  4. For Region, select the lab's default region.

  5. Click + Add trigger, and then select Cloud Storage trigger.

  6. For Event type, select google.cloud.storage.object.v1.finalized.

  7. For Bucket, click Browse.

  8. Click the bucket whose name ends with -bucket, and then click Select.

    An event will be generated when the creation of an object in the bucket completes.

  9. For Service account, select the bucket-trigger-func-sa service account.

Note: If you encounter a permissions error for the default service account, grant the Eventarc Event Receiver role to the Compute Engine default service account.
  10. Click Save Trigger.

  11. Scroll to the bottom and expand Containers, Volumes, Networking, Security.

  12. Under Revision scaling, for Maximum number of instances, enter 2.

  13. Leave the remaining settings as their defaults, and click Create.

    You are taken to the initial code for the function. The function will be implemented using Node.js. There are two files used in the default code: index.js and package.json.

    package.json:

    { "dependencies": { "@google-cloud/functions-framework": "^3.0.0" } }

    The default code in index.js requires a single dependency: the Functions Framework. The open source Functions Framework lets you write lightweight functions that run in many different environments, including:

    • Cloud Functions
    • Your local development machine
    • Cloud Run
    • Knative-based environments

    The Functions Framework for each language runs an HTTP server using a common pattern for the given language. For example, the Functions Framework for Node.js uses Express, a very popular web framework for Node.js. For a different language, the web server package would be something different.

    The functions-framework dependency includes Express, so you do not have to include Express separately in your package.json file.

    index.js:

    const functions = require('@google-cloud/functions-framework');

    // Register a CloudEvent callback with the Functions Framework that will
    // be triggered by Cloud Storage.
    functions.cloudEvent('helloGCS', cloudEvent => {
      console.log(`Event ID: ${cloudEvent.id}`);
      console.log(`Event Type: ${cloudEvent.type}`);

      const file = cloudEvent.data;
      console.log(`Bucket: ${file.bucket}`);
      console.log(`File: ${file.name}`);
      console.log(`Metageneration: ${file.metageneration}`);
      console.log(`Created: ${file.timeCreated}`);
      console.log(`Updated: ${file.updated}`);
    });

    The default code is customized for the type of event that the trigger will generate.

    For this trigger that uses Cloud Storage direct events, helloGCS is the entry point for calls to the function.

    Inside the cloudEvent callback, the cloudEvent object contains all of the information in the event, and you can add code that immediately acts on the event data. The default code logs two standard fields in a CloudEvent:

    • id (a unique ID for events from the event source)
    • type (the event type)

    Data for Cloud Storage direct events is also logged. We will look at that data in a moment.

  14. To log the entire CloudEvent object, in index.js, add the following line of code as the last line of the callback:

    console.log(`cloudEvent: ${JSON.stringify(cloudEvent)}`);

    This logs the full cloudEvent object as a JSON string.

    The cloudEvent callback should now look similar to this:

    functions.cloudEvent('helloGCS', cloudEvent => {
      console.log(`Event ID: ${cloudEvent.id}`);
      console.log(`Event Type: ${cloudEvent.type}`);

      const file = cloudEvent.data;
      console.log(`Bucket: ${file.bucket}`);
      console.log(`File: ${file.name}`);
      console.log(`Metageneration: ${file.metageneration}`);
      console.log(`Created: ${file.timeCreated}`);
      console.log(`Updated: ${file.updated}`);
      console.log(`cloudEvent: ${JSON.stringify(cloudEvent)}`);
    });
  15. Click Save and redeploy.

    Note: If you get an error indicating that the trigger can't be created because you recently started to use Eventarc, close the error message and deploy the function again. It may take a couple of attempts. After the function successfully deploys, you should not see the error again.

    You are taken to the Function details page for my-function. The deployment progress is shown on this page.

  16. Wait until the service and trigger have deployed.

    It can take a few minutes for the deployment to complete.

  17. To let the trigger invoke the Cloud Functions service, in Cloud Shell, run the following command:

    export SERVICE_REGION=REGION_PLACEHOLDER
    export FUNC_TRIGGER_SA=bucket-trigger-func-sa
    gcloud run services add-iam-policy-binding my-function \
      --region=${SERVICE_REGION} \
      --member="serviceAccount:${FUNC_TRIGGER_SA}@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com" \
      --role="roles/run.invoker"
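
    To optionally confirm the binding, you can read back the IAM policy of the underlying Cloud Run service. This sketch assumes SERVICE_REGION is still set from the command above and is not a graded step:

    gcloud run services get-iam-policy my-function --region=${SERVICE_REGION}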

Click Check my progress to verify the objective: Create a Cloud Functions event receiver service.

Task 3. Test the function

In this task, you copy a file to the Cloud Storage bucket and validate the logged event data.

Copy a file to the Cloud Storage bucket

  1. To copy a new file to the Cloud Storage bucket, in Cloud Shell, run the following commands:

    cat << EOF >> ~/testfile1.txt
    Test file 1
    EOF
    gsutil cp ~/testfile1.txt gs://${GOOGLE_CLOUD_PROJECT}-bucket

    The cat command inserts the lines before EOF into the testfile1.txt file.

  2. On the my-function details page, click the Logs tab.

    You should see one or more calls to the function. Look for a log of an Event ID. You can drag the scroll bar to search for newer entries in the log.

    Note: If you see POST 403 calls, the permissions required to invoke the function may not have propagated to the trigger service account yet. If you enabled retry on failure, you should see a successful delivery after a few 403 calls. You may run the commands to copy the file to the bucket again.

    You should see log entries containing the event details.

    The google.cloud.storage.object.v1.finalized event type indicates that a new file was copied to the specified bucket.

  3. Expand the log entry for the cloudEvent object.

    You can see the entire CloudEvent object logged here.

Click Check my progress to verify the objective: Test the function.

Task 4. Containerize the code

In this task, you retrieve the code from the Cloud Function and containerize the code.

Download the source code

Cloud Functions retains the source code for functions in a Cloud Storage bucket.

  1. To retrieve the latest source code for my-function, in Cloud Shell, run the following commands:

    export PROJECT_NUMBER=$(gcloud projects describe "$GOOGLE_CLOUD_PROJECT" \
      --format "value(projectNumber)")
    export SERVICE_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
    export GOOGLE_CLOUD_PROJECT=$(gcloud config get-value project)
    gsutil cp -r gs://run-sources-${GOOGLE_CLOUD_PROJECT}-${SERVICE_REGION}/services/my-function ~

    A ZIP file of the function code has been placed in the my-function subdirectory.

  2. To extract the code, run the following commands:

    cd ~/my-function
    unzip *.zip
    rm *.zip
    ls

    The code in the ZIP file is extracted, and then the ZIP file is deleted.

Create the containerized application

The Functions Framework can also be used in containerized applications. You will start with the exact same code that ran in Cloud Functions. Later, you will modify it to use CloudEvents more directly.

  1. To create a folder for the GKE application, in Cloud Shell, run the following commands:

    mkdir ~/my-gke-receiver
    cp ~/my-function/* ~/my-gke-receiver
    cd ~/my-gke-receiver
    ls

    Next, you update the package.json file for use in a container.

  2. To edit package.json using nano, run the following command:

    nano package.json
  3. Inside the outer curly braces, insert the following configuration at the top:

    "name": "my-gke-receiver", "version": "1.0.0", "description": "GKE event receiver (Functions Framework)", "main": "index.js", "scripts": { "start": "functions-framework --port=8080 --target=helloGCS" },

    You may be used to seeing a start script that runs the main Node file directly (node index.js). The start script for this container instead starts the Functions Framework, using the helloGCS function in index.js as the target and exposing it on port 8080.

    After you add that configuration, the file should look similar to this:

    { "name": "my-gke-receiver", "version": "1.0.0", "description": "GKE event receiver (Functions Framework)", "main": "index.js", "scripts": { "start": "functions-framework --port=8080 --target=helloGCS" }, "dependencies": { "@google-cloud/functions-framework": "^3.0.0" } }
  4. To save and close the file, press CTRL-X, and then press Y and Enter.

    Next, you create a Dockerfile for the container.

  5. Run the following command:

    nano Dockerfile
  6. Add the following content to the empty Dockerfile:

    FROM node:18-slim
    WORKDIR /usr/src/app
    COPY package*.json ./
    RUN npm install --production
    COPY . ./
    CMD [ "npm", "start" ]

    A simple Dockerfile is required to containerize the application. An optional local test of the container is sketched at the end of this section.

  7. To save and close the file, press CTRL-X, and then press Y and Enter.
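
Optional local test: before pushing the image, you can build and run the container in Cloud Shell to confirm that the Functions Framework starts. This is a sketch, not a graded step; the image tag my-gke-receiver-test is just an illustrative local name:

    docker build -t my-gke-receiver-test .
    docker run --rm -p 8080:8080 my-gke-receiver-test

Then, in a second Cloud Shell tab, you can send a minimal binary-mode CloudEvent (the header and body values here are placeholders):

    curl -X POST http://localhost:8080/ \
      -H "ce-id: 1234" \
      -H "ce-specversion: 1.0" \
      -H "ce-type: google.cloud.storage.object.v1.finalized" \
      -H "ce-source: //storage.googleapis.com/projects/_/buckets/test-bucket" \
      -H "Content-Type: application/json" \
      -d '{"bucket":"test-bucket","name":"testfile.txt"}'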

Create a repository in Artifact Registry

Artifact Registry is the next generation of Container Registry. You can store build artifacts inside an Artifact Registry repository.

  1. To create an Artifact Registry repository for Docker images, in Cloud Shell, run the following command:

    export REPO_NAME=eventarc-apps-repo
    export REPO_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
    gcloud artifacts repositories create ${REPO_NAME} \
      --location=${REPO_REGION} --repository-format=docker
  2. To retrieve the repository details, run the following command:

    export REPO_NAME=eventarc-apps-repo
    export REPO_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
    gcloud artifacts repositories describe ${REPO_NAME} \
      --location={{{ project_0.default_region | REGION_PLACEHOLDER }}}

Build the container image

  1. To build the container image of the service, in Cloud Shell, run the following commands:

    export REPO_NAME=eventarc-apps-repo
    export REPO_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
    export SERVICE_NAME=my-gke-receiver
    cd ~/my-gke-receiver
    gcloud builds submit . \
      --tag ${REPO_REGION}-docker.pkg.dev/${GOOGLE_CLOUD_PROJECT}/${REPO_NAME}/${SERVICE_NAME}

    The container image is built using the Dockerfile and stored in Artifact Registry.
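
    To optionally confirm that the image was stored, you can list the images in the repository (assuming the variables from the command above are still set):

    gcloud artifacts docker images list \
      ${REPO_REGION}-docker.pkg.dev/${GOOGLE_CLOUD_PROJECT}/${REPO_NAME}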

Click Check my progress to verify the objective: Containerize the code.

Task 5. Create a GKE cluster

In this task, you create a GKE Autopilot cluster to run the event receiver service.

Note: If you need to run small services like this in your own project, Cloud Run may be a more economical solution than running a service in a GKE cluster.
  1. In Cloud Shell, run the following command:

    export CLUSTER_NAME=eventarc-cluster
    export CLUSTER_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
    gcloud container clusters create-auto ${CLUSTER_NAME} --region=${CLUSTER_REGION}

    The command will return when the cluster has been created. Cluster creation can take several minutes to complete.

  2. Wait for the cluster creation to complete.
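
    To optionally confirm the cluster's status, you can list the clusters in the project:

    gcloud container clusters list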

Click Check my progress to verify the objective: Create a GKE cluster.

Task 6. Deploy the GKE event receiver service

In this task, you deploy the event receiver service to GKE.

  1. To get authentication credentials to interact with the cluster using kubectl, in Cloud Shell, run the following command:

    export CLUSTER_NAME=eventarc-cluster
    export CLUSTER_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
    gcloud container clusters get-credentials ${CLUSTER_NAME} \
      --region=${CLUSTER_REGION}

    Once this command has been run, you can use kubectl to manage the cluster.

  2. To deploy the container to Kubernetes as a deployment on GKE, run the following command:

    export REPO_NAME=eventarc-apps-repo
    export REPO_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
    export SERVICE_NAME=my-gke-receiver
    kubectl create deployment ${SERVICE_NAME} \
      --image=${REPO_REGION}-docker.pkg.dev/${GOOGLE_CLOUD_PROJECT}/${REPO_NAME}/${SERVICE_NAME}

    Note: You may get a warning that Autopilot set the default resource requests for the deployment. In a production environment, you would want to specify the CPU and memory requirements for your deployments.

    Next, you expose the deployment as a service in GKE with a stable IP address that is accessible within the cluster.

  3. To expose the deployment as an internal Kubernetes service, run the following command:

    export SERVICE_NAME=my-gke-receiver
    kubectl expose deployment ${SERVICE_NAME} \
      --type ClusterIP --port 80 --target-port 8080
  4. To check the pod's status, run the following command:

    kubectl get pods

    Repeat the command until the my-gke-receiver pod's status is Running.

  5. To verify that the service is exposed, run the following command:

    export SERVICE_NAME=my-gke-receiver
    kubectl get svc/${SERVICE_NAME}

    You should see that a cluster IP address has been assigned. There is no external IP address, so the service is not available from the internet.
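
    To optionally confirm that the service is backed by the pod, you can list the service's endpoints (assuming SERVICE_NAME is still set from the previous command):

    kubectl get endpoints ${SERVICE_NAME}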

Click Check my progress to verify the objective: Deploy the GKE event receiver service.

Task 7. Create the Eventarc trigger for the GKE event destination

In this task, you enable Eventarc to manage GKE clusters and create a trigger that calls the GKE service.

Enable Eventarc to manage GKE clusters

  1. To enable Eventarc to manage GKE clusters, in Cloud Shell, run the following command:

    gcloud eventarc gke-destinations init
  2. Enter Y and press Enter.

    Eventarc creates a separate Event Forwarder in your cluster for each trigger that targets a GKE service. The Event Forwarder receives events from Pub/Sub and forwards them as HTTP requests to the GKE event receiver within the cluster.
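
    If you are curious, after the trigger is created later in this task you can look for the forwarder pods in the cluster. This optional check greps across all namespaces because the forwarder's namespace is managed by Eventarc:

    kubectl get pods --all-namespaces | grep -i forwarder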

Create service account and add roles

  1. To create the service account for the GKE trigger, in Cloud Shell, run the following command:

    export GKE_TRIGGER_SA=bucket-trigger-gke-sa
    gcloud iam service-accounts create ${GKE_TRIGGER_SA}
  2. To enable the Eventarc trigger to receive events, run the following command:

    export GKE_TRIGGER_SA=bucket-trigger-gke-sa
    gcloud projects add-iam-policy-binding ${GOOGLE_CLOUD_PROJECT} \
      --member=serviceAccount:${GKE_TRIGGER_SA}@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com \
      --role='roles/eventarc.eventReceiver'
  3. To enable the trigger to receive events from Pub/Sub, run the following command:

    export GKE_TRIGGER_SA=bucket-trigger-gke-sa
    gcloud projects add-iam-policy-binding ${GOOGLE_CLOUD_PROJECT} \
      --member=serviceAccount:${GKE_TRIGGER_SA}@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com \
      --role='roles/pubsub.subscriber'
  4. To enable the creation of GKE metrics, run the following command:

    export GKE_TRIGGER_SA=bucket-trigger-gke-sa
    gcloud projects add-iam-policy-binding ${GOOGLE_CLOUD_PROJECT} \
      --member=serviceAccount:${GKE_TRIGGER_SA}@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com \
      --role='roles/monitoring.metricWriter'

Create the Eventarc trigger

  1. To create the Eventarc trigger for the GKE service, run the following command:

    export TRIGGER_NAME=trigger-storage-gke
    export GKE_TRIGGER_SA=bucket-trigger-gke-sa
    export CLUSTER_NAME=eventarc-cluster
    export CLUSTER_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
    export SERVICE_NAME=my-gke-receiver
    export BUCKET_NAME=${GOOGLE_CLOUD_PROJECT}-bucket
    export BUCKET_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
    gcloud eventarc triggers create ${TRIGGER_NAME} \
      --destination-gke-cluster=${CLUSTER_NAME} \
      --destination-gke-location=${CLUSTER_REGION} \
      --destination-gke-namespace=default \
      --destination-gke-service=${SERVICE_NAME} \
      --destination-gke-path=/ \
      --event-filters="type=google.cloud.storage.object.v1.finalized" \
      --event-filters="bucket=${BUCKET_NAME}" \
      --location=${BUCKET_REGION} \
      --service-account=${GKE_TRIGGER_SA}@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com

    Eventarc automatically manages the Event Forwarder for the trigger.
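
    To optionally confirm the trigger's configuration, you can describe it (assuming the variables from the command above are still set):

    gcloud eventarc triggers describe ${TRIGGER_NAME} --location=${BUCKET_REGION}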

Test the trigger

  1. To copy a new file to the Cloud Storage bucket, run the following commands:

    cat << EOF >> ~/testfile2.txt
    Test file 2
    EOF
    gsutil cp ~/testfile2.txt gs://${GOOGLE_CLOUD_PROJECT}-bucket

    The GKE service will log the information to the Kubernetes pod logs.

  2. To retrieve the logs for the pods in the GKE deployment, run the following command:

    kubectl logs deployment/my-gke-receiver

    You should see the same type of information that was logged by the Cloud Function:

    > my-gke-receiver@1.0.0 start
    > functions-framework --target=helloGCS

    Serving function...
    Function: helloGCS
    Signature type: cloudevent
    URL: http://localhost:8080/
    Event ID: 7114263859894439
    Event Type: google.cloud.storage.object.v1.finalized
    Bucket: qwiklabs-gcp-04-a37540cd8ebd-bucket
    File: testfile2.txt
    Metageneration: 1
    Created: 2023-03-08T04:05:31.333Z
    Updated: 2023-03-08T04:05:31.333Z
    cloudEvent: {"bucket":"qwiklabs-gcp-04-a37540cd8ebd-bucket","id":"7114263859894439","source":"//storage.googleapis.com/projects/_/buckets/qwiklabs-gcp-04-a37540cd8ebd-bucket","specversion":"1.0","subject":"objects/testfile2.txt","time":"2023-03-08T04:05:31.333720Z","type":"google.cloud.storage.object.v1.finalized","data":{"kind":"storage#object","id":"qwiklabs-gcp-04-a37540cd8ebd-bucket/testfile2.txt/1678248331297176","selfLink":"https://www.googleapis.com/storage/v1/b/qwiklabs-gcp-04-a37540cd8ebd-bucket/o/testfile2.txt","name":"testfile2.txt","bucket":"qwiklabs-gcp-04-a37540cd8ebd-bucket","generation":"1678248331297176","metageneration":"1","contentType":"text/plain","timeCreated":"2023-03-08T04:05:31.333Z","updated":"2023-03-08T04:05:31.333Z","storageClass":"STANDARD","timeStorageClassUpdated":"2023-03-08T04:05:31.333Z","size":"12","md5Hash":"G9VPmBzli+cVBc3Omd4Iaw==","mediaLink":"https://storage.googleapis.com/download/storage/v1/b/qwiklabs-gcp-04-a37540cd8ebd-bucket/o/testfile2.txt?generation=1678248331297176&alt=media","crc32c":"+axlJQ==","etag":"CJiDh526y/0CEAE="}}

Click Check my progress to verify the objective: Create the Eventarc trigger for the GKE event destination.

Task 8. Update the GKE service to use CloudEvents

In this task, you modify the GKE service to use CloudEvents directly instead of the Functions Framework.

Update the code

  1. In Cloud Shell, run the following commands:

    mkdir ~/my-gke-cloudevents-receiver
    cp ~/my-gke-receiver/* ~/my-gke-cloudevents-receiver
    cd ~/my-gke-cloudevents-receiver
    ls
  2. Run the following command:

    cat package.json

    The previous service started the functions-framework.

  3. To replace package.json, run the following commands:

    rm package.json
    nano package.json
  4. Set the package.json contents to:

    { "name": "my-gke-cloudevents-receiver", "version": "1.0.0", "description": "GKE event receiver (CloudEvents)", "main": "index.js", "scripts": { "start": "node index.js" }, "dependencies": { "cloudevents": "^4.0.0", "express": "^4.18.2", "@google/events": "^5.4.0" } }

    Instead of using the functions-framework, the new service will be a standard Node app.

    The Functions Framework dependency has been replaced with dependencies for the cloudevents, express, and @google/events packages.

  5. To save and close the file, press CTRL-X, and then press Y and Enter.

  6. To replace index.js, run the following commands:

    rm index.js
    nano index.js
  7. Set the index.js contents to:

    const express = require("express");
    const { HTTP } = require("cloudevents");
    const { toStorageObjectData } = require('@google/events/cloud/storage/v1/StorageObjectData');

    var app = express();
    app.use(express.json());

    app.post("/", function (req, res, next) {
      // Reconstruct the CloudEvent from the HTTP headers and body.
      const cloudEvent = HTTP.toEvent({ headers: req.headers, body: req.body });
      console.log(`cloudEvent: ${JSON.stringify(cloudEvent)}`);
      console.log(`Event ID: ${cloudEvent.id}`);
      console.log(`Event Type: ${cloudEvent.type}`);

      // Parse the event payload using the Google Cloud event type library.
      const file = toStorageObjectData(req.body);
      console.log(`Bucket: ${file.bucket}`);
      console.log(`File: ${file.name}`);
      console.log(`Metageneration: ${file.metageneration}`);
      console.log(`Created: ${file.timeCreated}`);
      console.log(`Updated: ${file.updated}`);

      // Acknowledge receipt so the event delivery is not retried.
      res.status(200).send();
    });

    const port = parseInt(process.env.PORT) || 8080;
    app.listen(port, () => {
      console.log("service listening on port", port);
    });

    The app now uses Express and the JavaScript SDK for CloudEvents directly.

    The handler runs when an HTTP POST request is received at the root path. The cloudEvent object is created by passing the HTTP headers and body to the CloudEvents SDK. The handler then returns an empty response to acknowledge the event so the delivery is not retried.

    The Express app listens on port 8080. (An optional way to invoke the service manually is sketched after the final step of this task.)

  8. To save and close the file, press CTRL-X, and then press Y and Enter.

    The Dockerfile does not need to be changed.

  9. To build the updated container image, run the following commands:

    export REPO_NAME=eventarc-apps-repo
    export REPO_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
    export SERVICE_NAME=my-gke-receiver
    cd ~/my-gke-cloudevents-receiver
    gcloud builds submit . \
      --tag ${REPO_REGION}-docker.pkg.dev/${GOOGLE_CLOUD_PROJECT}/${REPO_NAME}/${SERVICE_NAME}
  10. To delete the existing deployment and service, run the following commands:

    export SERVICE_NAME=my-gke-receiver
    kubectl delete svc/${SERVICE_NAME}
    kubectl delete deployment/${SERVICE_NAME}
  11. To redeploy the container to Kubernetes and expose the deployment, run the following commands:

    export REPO_NAME=eventarc-apps-repo
    export REPO_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
    export SERVICE_NAME=my-gke-receiver
    kubectl create deployment ${SERVICE_NAME} \
      --image=${REPO_REGION}-docker.pkg.dev/${GOOGLE_CLOUD_PROJECT}/${REPO_NAME}/${SERVICE_NAME}
    kubectl expose deployment ${SERVICE_NAME} \
      --type ClusterIP --port 80 --target-port 8080
  12. To check the pod's status, run the following command:

    kubectl get pods

    Repeat the command until the pod's status is Running.

  13. To verify that the service is exposed, run the following command:

    kubectl get svc
  14. To copy a new file to the Cloud Storage bucket, run the following commands:

    cat << EOF >> ~/testfile3.txt
    Test file 3
    EOF
    gsutil cp ~/testfile3.txt gs://${GOOGLE_CLOUD_PROJECT}-bucket
  15. To retrieve the logs for the GKE deployment, run the following command:

    kubectl logs deployment/my-gke-receiver

    You should see the same type of information that was logged by the Cloud Function, but logged by a standard Node app:

    > my-gke-cloudevents-receiver@1.0.0 start
    > node index.js

    service listening on port 8080
    Event ID: 7116070522297099
    Event Type: google.cloud.storage.object.v1.finalized
    Bucket: qwiklabs-gcp-04-a37540cd8ebd-bucket
    File: testfile3.txt
    Metageneration: 1
    Created: 2023-03-08T08:53:51.309Z
    Updated: 2023-03-08T08:53:51.309Z
    cloudEvent: {"id":"7116070522297099","time":"2023-03-08T08:53:51.309Z","type":"google.cloud.storage.object.v1.finalized","source":"//storage.googleapis.com/projects/_/buckets/qwiklabs-gcp-04-a37540cd8ebd-bucket","specversion":"1.0","datacontenttype":"application/json","subject":"objects/testfile3.txt","bucket":"qwiklabs-gcp-04-a37540cd8ebd-bucket","data":{"kind":"storage#object","id":"qwiklabs-gcp-04-a37540cd8ebd-bucket/testfile3.txt/1678265631298417","selfLink":"https://www.googleapis.com/storage/v1/b/qwiklabs-gcp-04-a37540cd8ebd-bucket/o/testfile3.txt","name":"testfile3.txt","bucket":"qwiklabs-gcp-04-a37540cd8ebd-bucket","generation":"1678265631298417","metageneration":"1","contentType":"text/plain","timeCreated":"2023-03-08T08:53:51.309Z","updated":"2023-03-08T08:53:51.309Z","storageClass":"STANDARD","timeStorageClassUpdated":"2023-03-08T08:53:51.309Z","size":"180","md5Hash":"306jGDG6MneNu/53Hv6QJw==","mediaLink":"https://storage.googleapis.com/download/storage/v1/b/qwiklabs-gcp-04-a37540cd8ebd-bucket/o/testfile3.txt?generation=1678265631298417&alt=media","crc32c":"4Bln1g==","etag":"CPGmq9b6y/0CEAE="}}
    file: {"kind":"storage#object","id":"qwiklabs-gcp-04-a37540cd8ebd-bucket/testfile3.txt/1678265631298417","selfLink":"https://www.googleapis.com/storage/v1/b/qwiklabs-gcp-04-a37540cd8ebd-bucket/o/testfile3.txt","name":"testfile3.txt","bucket":"qwiklabs-gcp-04-a37540cd8ebd-bucket","generation":"1678265631298417","metageneration":"1","contentType":"text/plain","timeCreated":"2023-03-08T08:53:51.309Z","updated":"2023-03-08T08:53:51.309Z","storageClass":"STANDARD","timeStorageClassUpdated":"2023-03-08T08:53:51.309Z","size":"180","md5Hash":"306jGDG6MneNu/53Hv6QJw==","mediaLink":"https://storage.googleapis.com/download/storage/v1/b/qwiklabs-gcp-04-a37540cd8ebd-bucket/o/testfile3.txt?generation=1678265631298417&alt=media","crc32c":"4Bln1g==","etag":"CPGmq9b6y/0CEAE="}
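
    As mentioned earlier, you can also invoke the receiver manually without uploading another file. This optional sketch forwards a local port to the service and posts a synthetic binary-mode CloudEvent; all header and body values below are placeholders:

    # Forward local port 8080 to the service's port 80.
    kubectl port-forward svc/my-gke-receiver 8080:80 &
    # Post a minimal CloudEvent to the receiver.
    curl -X POST http://localhost:8080/ \
      -H "ce-id: 5678" \
      -H "ce-specversion: 1.0" \
      -H "ce-type: google.cloud.storage.object.v1.finalized" \
      -H "ce-source: //storage.googleapis.com/projects/_/buckets/test-bucket" \
      -H "Content-Type: application/json" \
      -d '{"bucket":"test-bucket","name":"manual-test.txt"}'
    # Stop the port-forward when you are done.
    kill %1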

Click Check my progress to verify the objective: Update the GKE service to use CloudEvents.

Congratulations!

In this lab, you created a Cloud Functions service that was triggered when a file was copied into a specific Cloud Storage bucket. You containerized the service and deployed it to GKE. Finally, you modified the service to directly use a CloudEvents SDK and a Google Cloud type library.

Next Steps / Learn More

For more information, view the Eventarc documentation.

Copyright 2022 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.
