
Before you begin
- Labs create a Google Cloud project and resources for a fixed time
- Labs have a time limit and no pause feature. If you end the lab, you'll have to restart from the beginning.
- On the top left of your screen, click Start lab to begin
Objectives:
- Create a Cloud Function (50 points)
- Verify the data has successfully loaded into MongoDB (50 points)
This lab was developed with our partner, MongoDB. Your personal information may be shared with MongoDB, the lab sponsor, if you have opted in to receive product updates, announcements, and offers in your Account Profile.
In this lab, you will construct a pipeline that seamlessly transports real-time data generated by edge devices or sensors to MongoDB using MongoDB Atlas Data API. Through this process, you will establish a Pub/Sub topic that monitors change stream data, an Eventarc trigger that connects the Pub/Sub topic to a Google Cloud Function, and a MongoDB collection to securely store the data. You'll also visualize the data with MongoDB Atlas Charts.
With MongoDB Atlas on Google Cloud, you can integrate with the Google Cloud product suite to create more value. MongoDB on Google Cloud offers broad geographic presence, a strong cost-to-performance ratio, and security.
In this lab, you will complete the objectives listed above. If you are new to MongoDB Atlas, take the Introduction to MongoDB Atlas lab to set up your first MongoDB cluster, or refer to the MongoDB Atlas documentation.
Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.
This Qwiklabs hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.
To complete this lab, you need:
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab.
Note: If you are using a Pixelbook, open an Incognito window to run this lab.
Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is a panel populated with the temporary credentials that you must use for this lab.
Copy the username, and then click Open Google Console. The lab spins up resources, and then opens another tab that shows the Sign in page.
Tip: Open the tabs in separate windows, side-by-side.
In the Sign in page, paste the username that you copied from the Connection Details panel. Then copy and paste the password.
Important: You must use the credentials from the Connection Details panel. Do not use your Qwiklabs credentials. If you have your own Google Cloud account, do not use it for this lab, to avoid incurring charges on your account.
Click through the subsequent pages:
After a few moments, the Cloud Console opens in this tab.
Scenario: A weather station is generating weather data, and a data generator mimics the weather station's sensors. In this section, you will build a real-time monitoring system for the weather data using MongoDB Charts.
On the Deploy your cluster page select M0.
Name the cluster Sandbox.
Under Cloud Provider & Region, select Google Cloud and choose the region nearest to you.
Click on Create Deployment.
Click Close.
Click on Browse Collections and then click on the +Create Database button.
Name your database weather_station and the collection sensor, then click Create. Make sure you use this exact casing and spelling for the database and collection names; otherwise your Cloud Function will fail later.
To insert a sample document into the collection, click on Insert document.
You can either enter the fields one at a time, or add the whole document as JSON. Switch to the JSON view:
Copy and paste the document below into the collection.
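The lab provides the exact document to paste. If it is not visible here, a document shaped like the one below works as a placeholder; the field names and values are hypothetical weather readings, not the lab's own sample:

```json
{
  "device_id": "station-001",
  "timestamp": "2024-06-21T10:00:00Z",
  "temperature_celsius": 22.5,
  "humidity_percent": 48.2
}
```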
Repeat the above step to insert one more document.
In this section you will configure the Data API in MongoDB Atlas. The MongoDB Atlas Data API lets you read and write data in Atlas with standard HTTPS requests. To use the Data API, all you need is an HTTPS client and a valid API key. You will use the Data API to communicate with the weather_station database from Cloud Functions.
Click the Users tab.
Click Create API Key, name your API key weather_station-api-key, and click Generate API Key. Copy the API key, store it locally, and click Close.
Click the Data API tab and select Read and Write.
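With the API key in hand, you can sanity-check the Data API from any HTTPS client. The Python sketch below is an illustration rather than a lab step; the URL App ID, the API key, and the document fields are placeholders you must replace with your own values:

```python
import requests

# Placeholders: use your own Data API URL endpoint and the key created above.
DATA_API_URL = "https://data.mongodb-api.com/app/<your-app-id>/endpoint/data/v1/action/insertOne"
API_KEY = "<your-weather_station-api-key>"

payload = {
    "dataSource": "Sandbox",        # the cluster created earlier
    "database": "weather_station",
    "collection": "sensor",
    "document": {"temperature_celsius": 22.5, "humidity_percent": 48.2},
}

response = requests.post(
    DATA_API_URL,
    headers={"api-key": API_KEY, "Content-Type": "application/json"},
    json=payload,
    timeout=10,
)
# A successful insert returns the insertedId of the new document.
print(response.status_code, response.json())
```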
To get a sense of what's in the data set, you can use Charts to visualize the data. In this section, you will create a Discrete Line chart to understand the correlation between temperature and humidity and how it changes over time.
From the MongoDB Atlas home page, click Charts.
Click Select for Chart builder. (You may instead see a flow like Add dashboard > Name the dashboard > Save > Add chart.)
Select the data source > Sandbox (Cluster) > weather_station > sensor.
You are halfway through!
At this stage you have completed the Atlas setup: you have created an Atlas cluster, created a database and collection to store sensor data, enabled the Data API to write data to the collection, and built a chart to visualize the data.
In the Google Cloud console, use the search bar at the top to search for "Cloud Functions". You will see a previously created function that you'll use later.
Click on Create Function. Use the following configuration:
Property | Value |
---|---|
Environment | 2nd gen |
Function name | weather-sensor-function |
Region | The one allocated to you for the project |
Trigger type | Cloud Pub/Sub |
Select a Cloud Pub/Sub topic | weather-sensor-trigger |
Property | Value |
---|---|
Memory allocated | 512 MiB |
Timeout | 540 seconds |
Autoscaling: Max number of instances | 5 |
Click Next. Enable all required APIs in the prompt that follows.
Change the Runtime to Python 3.9 or Python 3.10. The Entry point will be replaced by hello_pubsub.
Replace the default main.py content with the code snippet below.
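If the lab's exact snippet is not shown here, a minimal sketch of what main.py needs to do looks like the following: decode the Pub/Sub message delivered through Eventarc, then forward it to the Data API insertOne endpoint. The url and api_key values are placeholders, filled in as described in the next step:

```python
import base64
import json

import functions_framework
import requests

# Placeholders: set these from your MongoDB Atlas Data API configuration.
url = "https://data.mongodb-api.com/app/<your-app-id>/endpoint/data/v1/action/insertOne"
api_key = "<your-weather_station-api-key>"


@functions_framework.cloud_event
def hello_pubsub(cloud_event):
    """Triggered by a Pub/Sub message; writes the payload to MongoDB Atlas."""
    # The Pub/Sub payload arrives base64-encoded inside the CloudEvent.
    message = base64.b64decode(cloud_event.data["message"]["data"]).decode("utf-8")

    payload = {
        "dataSource": "Sandbox",
        "database": "weather_station",
        "collection": "sensor",
        "document": json.loads(message),
    }
    headers = {"api-key": api_key, "Content-Type": "application/json"}

    response = requests.post(url, headers=headers, json=payload, timeout=10)
    print(f"Data API response: {response.status_code} {response.text}")
```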
Get the values for url and api_key from the MongoDB Atlas Data API settings under the Setup a MongoDB App Services section. Note that the url should be followed by /action/insertOne.
Add the following requirements to the requirements.txt file:
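If the lab's exact list is not shown here, a function like the sketch above needs at least these packages (versions are indicative):

```text
functions-framework==3.*
requests==2.*
```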
Click Check my progress to verify the objective.
Go back to the list of your Cloud Functions. The Cloud Function named "sensor-data-generator" is preconfigured.
Click Check my progress to verify the objective.
In this lab you learned how to build an IoT use case using MongoDB Atlas with Google Cloud Eventarc, Pub/Sub, and Cloud Functions. You also learned how to use the MongoDB Data API to read from and write to MongoDB, and how to use MongoDB Charts to visualize the data.
To keep learning MongoDB try these labs:
Be sure to check out MongoDB on the Google Cloud Marketplace!
Get $500 in free credits for MongoDB on the Google Cloud Marketplace (applicable only to new customers).
...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.
Manual Last Updated: June 21, 2024
Lab Last Tested: June 21, 2024
Copyright 2024 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.