Serverless computing on Google Cloud lets you develop and deploy highly scalable applications on a fully managed serverless platform. Services are automatically scaled up and down depending on traffic.
Eventarc lets you build event-driven architectures without having to implement, customize, or maintain the underlying infrastructure. Eventarc offers a standardized solution to manage the flow of state changes, called events, between decoupled services.

In this lab, you implement an event receiver service in Cloud Functions, and then containerize it and deploy it to GKE. For each deployed service, you create a trigger that sends events when files are copied into a Cloud Storage bucket.
Deploy an event receiver service to Cloud Functions 2nd generation.
Containerize and deploy an event receiver service to GKE.
Create an event receiver by using a CloudEvents SDK.
Setup
For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.
Sign in to Qwiklabs using an incognito window.
Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
There is no pause feature. You can restart if needed, but you have to start at the beginning.
When ready, click Start lab.
Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.
Click Open Google Console.
Click Use another account and copy/paste credentials for this lab into the prompts.
If you use other credentials, you'll receive errors or incur charges.
Accept the terms and skip the recovery resource page.
Activate Google Cloud Shell
Google Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5GB home directory and runs on Google Cloud.
Google Cloud Shell provides command-line access to your Google Cloud resources.
In Cloud console, on the top right toolbar, click the Open Cloud Shell button.
Click Continue.
It takes a few moments to provision and connect to the environment. When you are connected, you are already authenticated, and the project is set to your PROJECT_ID. For example:
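Output similar to the following, with your own project ID:

Your Cloud Platform project in this session is set to qwiklabs-gcp-44776a13dea667a6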
gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.
You can list the active account name with this command:
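The standard command for this is:

gcloud auth list

You can list the project ID with this command:

gcloud config list project

Example output: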
[core]
project = qwiklabs-gcp-44776a13dea667a6
Note: Full documentation of gcloud is available in the gcloud CLI overview guide.
Task 1. Configure prerequisites
In this task, you enable APIs that are required for the event-driven architecture for this lab. You create the cloud storage bucket that will be used by the Eventarc triggers. Finally, you create a service account that will be used for the Eventarc triggers and modify service account permissions.
Enable APIs
To enable the required APIs, in Cloud Shell, run the following command:
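A representative command, covering the APIs described below (the exact list in your lab may differ):

gcloud services enable \
  pubsub.googleapis.com \
  logging.googleapis.com \
  eventarc.googleapis.com \
  cloudfunctions.googleapis.com \
  run.googleapis.com \
  container.googleapis.com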
This application uses several Google Cloud services, and you must enable each of the APIs for these services.
The APIs being enabled are:
The Pub/Sub API manages Pub/Sub topics and subscriptions and publishes Pub/Sub messages. Pub/Sub is used to manage the event transport in Eventarc. In this lab, a Pub/Sub topic is used by Eventarc to generate events when messages are published to the topic.
The Cloud Logging API writes log entries and manages Cloud Logging configuration. Cloud Logging log entries are used by Eventarc to generate many types of events, including IAM events.
The Eventarc API manages Eventarc configuration. In this lab, Eventarc is used to create triggers that deliver events to the event destination service.
The Cloud Functions API creates and manages Cloud Functions services. In this lab, a Cloud Functions service will act as an event receiver.
The Cloud Run API creates and manages Cloud Run services. Cloud Functions 2nd generation services run in Cloud Run.
The Kubernetes Engine API builds and manages container-based applications. In this lab, an event receiver service will be deployed to GKE.
The gcloud beta services identity create command immediately creates the Pub/Sub service account, which would otherwise be created automatically when Pub/Sub is first used.
The gcloud projects add-iam-policy-binding command grants the Pub/Sub service account the permission to create tokens. Tokens will be used to call your authenticated event destination services.
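A representative form of these commands (the Pub/Sub service agent address is derived from the project number):

export PROJECT_NUMBER=$(gcloud projects describe ${GOOGLE_CLOUD_PROJECT} --format='value(projectNumber)')

gcloud beta services identity create --service=pubsub.googleapis.com \
  --project=${GOOGLE_CLOUD_PROJECT}

gcloud projects add-iam-policy-binding ${GOOGLE_CLOUD_PROJECT} \
  --member=serviceAccount:service-${PROJECT_NUMBER}@gcp-sa-pubsub.iam.gserviceaccount.com \
  --role=roles/iam.serviceAccountTokenCreator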
Click Check my progress to verify the objective.
Configure prerequisites
Task 2. Create a Cloud Functions event receiver service
Cloud Functions 2nd generation lets you run your code without managing containers or servers. Code using the 2nd generation version of Cloud Functions runs on Cloud Run.
Create the service
In the Google Cloud console, navigate to Cloud Run Functions.
You are redirected to Cloud Run in the console.
Click Write a Function.
For Service Name, type my-function.
For Region, select the region assigned to your lab.
Click + Add trigger, and then select Cloud Storage trigger.
For Event type, select google.cloud.storage.object.v1.finalized.
For Bucket, click Browse.
Click the bucket whose name ends in -bucket, and then click Select.
An event will be generated when the creation of an object in the bucket completes.
For Service account, select the bucket-trigger-func-sa service account.
Note:
If you encounter a permission error for the default service account, grant the Eventarc Event Receiver role to the Compute Engine default service account.
Click Save Trigger.
Scroll to the bottom and expand Containers, Volumes, Networking, Security.
Under Revision scaling, for Maximum number of instances, enter 2.
Leave the remaining settings as their defaults, and click Create.
You are taken to the initial code for the function. The function will be implemented using Node.js. There are two files used in the default code: index.js and package.json.
The default code in index.js requires a single dependency: the Functions Framework. The open source Functions Framework lets you write lightweight functions that run in many different environments, including Cloud Functions, Cloud Run, Knative-based environments, and your local development machine.
The Functions Framework for each language runs an HTTP server using a common pattern for the given language. For example, the Functions Framework for Node.js uses Express, a very popular web framework for Node.js. Frameworks for other languages use a comparable web server package.
The functions-framework dependency includes Express, so you do not have to include Express separately in your package.json file.
index.js:
const functions = require('@google-cloud/functions-framework');
// Register a CloudEvent callback with the Functions Framework that will
// be triggered by Cloud Storage.
functions.cloudEvent('helloGCS', cloudEvent => {
console.log(`Event ID: ${cloudEvent.id}`);
console.log(`Event Type: ${cloudEvent.type}`);
const file = cloudEvent.data;
console.log(`Bucket: ${file.bucket}`);
console.log(`File: ${file.name}`);
console.log(`Metageneration: ${file.metageneration}`);
console.log(`Created: ${file.timeCreated}`);
console.log(`Updated: ${file.updated}`);
});
The default code is customized for the type of event that the trigger will generate.
For this trigger that uses Cloud Storage direct events, helloGCS is the entry point for calls to the function.
Inside the cloudEvent callback, the cloudEvent object contains all of the information in the event, and you can add code that immediately acts on the event data. The default code logs two standard fields in a CloudEvent:
id (a unique ID for events from the event source)
type (the event type)
Data for Cloud Storage direct events is also logged. We will look at that data in a moment.
To log the entire CloudEvent object, in index.js, add the following line of code as the last line of the callback:
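A line that logs the entire object as JSON (the exact line used by the lab may differ):

console.log(JSON.stringify(cloudEvent));

After adding the line, deploy the function (for example, by clicking Deploy).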
Note:
If you get an error indicating that the trigger can't be created because you recently started to use Eventarc, close the error message and deploy the function again. It may take a couple of attempts. After the function successfully deploys, you should no longer see the error.
You are taken to the Function details page for my-function. The deployment progress is shown on this page.
Wait until the service and trigger have deployed.
It can take a few minutes for the deployment to complete.
To cause the trigger to invoke the Cloud Functions service, in Cloud Shell, run the following commands:
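A representative pair of commands, assuming the lab bucket is named ${GOOGLE_CLOUD_PROJECT}-bucket and using illustrative file contents:

cat > testfile1.txt <<EOF
This is a test file.
EOF

gsutil cp testfile1.txt gs://${GOOGLE_CLOUD_PROJECT}-bucket

The gsutil cp command copies the file into the bucket, which fires the trigger.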
The cat command inserts the lines before EOF into the testfile1.txt file.
On the my-function details page, click the Logs tab.
You should see one or more calls to the function. Look for a log entry containing an Event ID. You can drag the scroll bar to find newer entries in the log.
Note:
If you see POST 403 calls, the permissions required to invoke the function may not have propagated to the trigger service account yet. If you enabled retry on failure, you should see a successful delivery after a few 403 calls. If needed, run the commands that copy the file to the bucket again.
You should see some logs that look like this:
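Illustrative entries, based on the function's console.log statements (IDs and names will vary):

Event ID: <unique-event-id>
Event Type: google.cloud.storage.object.v1.finalized
Bucket: <project-id>-bucket
File: testfile1.txt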
The google.cloud.storage.object.v1.finalized event type indicates that a new file was copied to the specified bucket.
Expand the log entry for the cloudEvent object:
You can see the entire CloudEvent object logged here.
Click Check my progress to verify the objective.
Test the function
Task 4. Containerize the code
In this task, you retrieve the code from the Cloud Function and containerize the code.
Download the source code
Cloud Functions retains the source code for functions in a Cloud Storage bucket.
To retrieve the latest source code for my-function, in Cloud Shell, run the following commands:
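One plausible approach, assuming the function source is stored in the default gcf-v2-sources bucket that Cloud Functions 2nd generation uses:

export PROJECT_NUMBER=$(gcloud projects describe ${GOOGLE_CLOUD_PROJECT} --format='value(projectNumber)')

mkdir ~/my-function

gsutil cp gs://gcf-v2-sources-${PROJECT_NUMBER}-{{{ project_0.default_region | REGION_PLACEHOLDER }}}/my-function/function-source.zip ~/my-function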
A ZIP file of the function code has been placed in the my-function subdirectory.
To extract the code, run the following commands:
cd ~/my-function
unzip *.zip
rm *.zip
ls
The code in the ZIP file is extracted, and then the ZIP file is deleted.
Create the containerized application
The Functions Framework can also be used in containerized applications. You will start with the exact same code that ran in Cloud Functions. Later, you will modify it to use CloudEvents more directly.
To create a folder for the GKE application, in Cloud Shell, run the following commands:
mkdir ~/my-gke-receiver
cp ~/my-function/* ~/my-gke-receiver
cd ~/my-gke-receiver
ls
Next, you update the package.json file for use in a container.
To edit package.json using nano, run the following command:
nano package.json
Inside the outer curly braces, insert the following configuration at the top:
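Based on the description below, the inserted configuration is a scripts entry similar to this (the Functions Framework CLI's --target and --port flags select the function and the listening port):

"scripts": {
  "start": "npx functions-framework --target=helloGCS --port=8080"
},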
You may be used to seeing a start script that runs the main Node file directly (node index.js). The start script for this container instead starts the Functions Framework, using the helloGCS function in index.js as the target and exposing it on port 8080.
After you add that configuration, the file should look similar to this:
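A representative result (the dependency version shown is illustrative):

{
  "scripts": {
    "start": "npx functions-framework --target=helloGCS --port=8080"
  },
  "dependencies": {
    "@google-cloud/functions-framework": "^3.0.0"
  }
}

To save and close the file, press CTRL-X, and then press Y and Enter.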
The container image is built using the Dockerfile and stored in Artifact Registry.
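The Dockerfile and build steps are summarized here as a sketch; the base image and the repository-creation details are assumptions. Create a Dockerfile in ~/my-gke-receiver with contents similar to:

FROM node:18-slim
# Install production dependencies and copy in the function code.
WORKDIR /usr/src/app
COPY package*.json ./
RUN npm install --omit=dev
COPY . .
# The start script launches the Functions Framework on port 8080.
CMD [ "npm", "start" ]

A plausible build sequence, using the repository variables that also appear in the deployment step:

export REPO_NAME=eventarc-apps-repo
export REPO_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
export SERVICE_NAME=my-gke-receiver

gcloud artifacts repositories create ${REPO_NAME} \
  --repository-format=docker --location=${REPO_REGION}

gcloud builds submit \
  --tag ${REPO_REGION}-docker.pkg.dev/${GOOGLE_CLOUD_PROJECT}/${REPO_NAME}/${SERVICE_NAME}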
Click Check my progress to verify the objective.
Containerize the code
Task 5. Create a GKE cluster
In this task, you create a GKE Autopilot cluster to run the event receiver service.
Note:
If you need to run small services like this in your own project, Cloud Run may be a more economical solution than running a service in a GKE cluster.
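The cluster-creation command was likely similar to the following (the cluster name my-cluster is an assumption):

gcloud container clusters create-auto my-cluster \
  --region={{{ project_0.default_region | REGION_PLACEHOLDER }}}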
Once this command has been run, you can use kubectl to manage the cluster.
To deploy the container to Kubernetes as a deployment on GKE, run the following command:
export REPO_NAME=eventarc-apps-repo
export REPO_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
export SERVICE_NAME=my-gke-receiver
kubectl create deployment ${SERVICE_NAME} \
--image=${REPO_REGION}-docker.pkg.dev/${GOOGLE_CLOUD_PROJECT}/${REPO_NAME}/${SERVICE_NAME}
Note:
You may get a warning that Autopilot set the default resource requests for the deployment. In a production environment, you would want to specify the CPU and memory requirements for your deployments.
Next, you expose the deployment as a service in GKE with a stable IP address that is accessible within the cluster.
To expose the deployment as an internal Kubernetes service, run the following command:
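A representative command (the service port values are assumptions; the container listens on port 8080):

export SERVICE_NAME=my-gke-receiver

kubectl expose deployment ${SERVICE_NAME} \
  --type=ClusterIP --port=80 --target-port=8080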
To check the pod's status, run the following command:
kubectl get pods
Repeat the command until the my-gke-receiver pod's status is Running.
To verify that the service is exposed, run the following command:
export SERVICE_NAME=my-gke-receiver
kubectl get svc/${SERVICE_NAME}
You should see that a cluster IP address has been assigned. There is no external IP address, so the service is not available from the internet.
Click Check my progress to verify the objective.
Deploy the GKE event receiver service
Task 7. Create the Eventarc trigger for the GKE event destination
In this task, you enable Eventarc to manage GKE clusters and create a trigger that calls the GKE service.
Enable Eventarc to manage GKE clusters
To enable Eventarc to manage GKE clusters, in Cloud Shell, run the following command:
gcloud eventarc gke-destinations init
Enter Y and press Enter.
Eventarc creates a separate Event Forwarder in your cluster for each trigger that targets a GKE service. The Event Forwarder receives events from Pub/Sub and forwards them as HTTP requests to the GKE event receiver within the cluster.
Create service account and add roles
To create the service account for the GKE trigger, in Cloud Shell, run the following command:
export GKE_TRIGGER_SA=bucket-trigger-gke-sa
gcloud iam service-accounts create ${GKE_TRIGGER_SA}
To enable the Eventarc trigger to receive events, run the following command:
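A representative command grants the Eventarc Event Receiver role to the new service account:

gcloud projects add-iam-policy-binding ${GOOGLE_CLOUD_PROJECT} \
  --member=serviceAccount:${GKE_TRIGGER_SA}@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com \
  --role=roles/eventarc.eventReceiver

The trigger itself can then be created; this sketch assumes the trigger name my-gke-trigger, the cluster name my-cluster, the default namespace, and the lab bucket name:

gcloud eventarc triggers create my-gke-trigger \
  --location={{{ project_0.default_region | REGION_PLACEHOLDER }}} \
  --destination-gke-cluster=my-cluster \
  --destination-gke-location={{{ project_0.default_region | REGION_PLACEHOLDER }}} \
  --destination-gke-namespace=default \
  --destination-gke-service=my-gke-receiver \
  --destination-gke-path=/ \
  --event-filters="type=google.cloud.storage.object.v1.finalized" \
  --event-filters="bucket=${GOOGLE_CLOUD_PROJECT}-bucket" \
  --service-account=${GKE_TRIGGER_SA}@${GOOGLE_CLOUD_PROJECT}.iam.gserviceaccount.com

To update the service to use the CloudEvents SDK directly, edit index.js (for example, with nano index.js) and replace its contents. A minimal sketch using Express and the cloudevents npm package follows; it assumes that express and cloudevents are added to the dependencies in package.json:

const express = require('express');
const { HTTP } = require('cloudevents');

const app = express();

// Capture the raw request body as text so it can be handed to the CloudEvents SDK.
app.use(express.text({ type: '*/*' }));

app.post('/', (req, res) => {
  // Build a CloudEvent from the incoming HTTP headers and body.
  const cloudEvent = HTTP.toEvent({ headers: req.headers, body: req.body });
  console.log(`Event ID: ${cloudEvent.id}`);
  console.log(`Event Type: ${cloudEvent.type}`);
  res.status(200).send();
});

app.listen(8080, () => {
  console.log('Listening on port 8080');
});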
The code is executed when a POST / HTTP request is received. The cloudEvent object is created by sending the HTTP headers and body to the CloudEvents SDK.
The Express app listens on port 8080.
To save and close the file, press CTRL-X, and then press Y and Enter.
The Dockerfile does not need to be changed.
To build the updated container image, run the following commands:
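A representative sequence rebuilds the image with the same tag and restarts the deployment so GKE pulls the updated image (the rollout restart step is an assumption):

cd ~/my-gke-receiver

export REPO_NAME=eventarc-apps-repo
export REPO_REGION={{{ project_0.default_region | REGION_PLACEHOLDER }}}
export SERVICE_NAME=my-gke-receiver

gcloud builds submit \
  --tag ${REPO_REGION}-docker.pkg.dev/${GOOGLE_CLOUD_PROJECT}/${REPO_NAME}/${SERVICE_NAME}

kubectl rollout restart deployment ${SERVICE_NAME}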
Click Check my progress to verify the objective.
Update the GKE service to use CloudEvents
Congratulations!
In this lab, you created a Cloud Functions service that was triggered when a file was copied into a specific Cloud Storage bucket. You containerized the service and deployed it to GKE. Finally, you modified the service to directly use a CloudEvents SDK and a Google Cloud type library.
Copyright 2022 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.