
Fundamentals of Cloud Logging

1 hour 15 minutes 1 Credit

GSP610


Overview

Cloud Logging is part of the Operations suite of products in Google Cloud. It includes storage for logs, a user interface called the Logs Viewer, and an API to manage logs programmatically. Use Cloud Logging to read and write log entries, search and filter your logs, export your logs, and create logs-based metrics.
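For a quick taste of the API surface before you start, you can write and then read back a log entry from Cloud Shell with the gcloud CLI (a minimal sketch; the log name my-test-log is just an illustration, not part of this lab):

# Write a text entry to a log named my-test-log (the log is created on first write)
gcloud logging write my-test-log "Hello from Cloud Logging"

# Read the most recent entry back using a logs filter
gcloud logging read 'logName:"my-test-log"' --limit=1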

In this hands-on lab, you learn how to use Cloud Logging to accumulate application logs in a single place, filter to find the logs you need, create logs-based metrics for advanced analysis, examine the audit logs use case, and export logs for compliance and/or advanced analysis needs.

What you'll learn

  • Launch an example Google App Engine application to generate logs.

  • Use the Cloud Logging console to interact with the logs generated by the application.

  • Create log-based Cloud Monitoring metrics.

  • Use Cloud Logging to dive deep into Audit Logging.

  • Create an export of logs to BigQuery.

Setup and requirements

Before you click the Start Lab button

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.

This hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito or private browser window to run this lab. This prevents any conflicts between your personal account and the Student account, which may cause extra charges to be incurred to your personal account.
  • Time to complete the lab---remember, once you start, you cannot pause a lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab to avoid extra charges to your account.

How to start your lab and sign in to the Google Cloud Console

  1. Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is the Lab Details panel with the following:

    • The Open Google Console button
    • Time remaining
    • The temporary credentials that you must use for this lab
    • Other information, if needed, to step through this lab
  2. Click Open Google Console. The lab spins up resources, and then opens another tab that shows the Sign in page.

    Tip: Arrange the tabs in separate windows, side-by-side.

    Note: If you see the Choose an account dialog, click Use Another Account.
  3. If necessary, copy the Username from the Lab Details panel and paste it into the Sign in dialog. Click Next.

  4. Copy the Password from the Lab Details panel and paste it into the Welcome dialog. Click Next.

    Important: You must use the credentials from the left panel. Do not use your Google Cloud Skills Boost credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.
  5. Click through the subsequent pages:

    • Accept the terms and conditions.
    • Do not add recovery options or two-factor authentication (because this is a temporary account).
    • Do not sign up for free trials.

After a few moments, the Cloud Console opens in this tab.

Note: You can view the menu with a list of Google Cloud products and services by clicking the Navigation menu at the top-left.

Activate Cloud Shell

Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5GB home directory and runs on Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources.

  1. Click Activate Cloud Shell at the top of the Google Cloud console.

When you are connected, you are already authenticated, and the project is set to your PROJECT_ID. The output contains a line that declares the PROJECT_ID for this session:

Your Cloud Platform project in this session is set to YOUR_PROJECT_ID

gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.

  2. (Optional) You can list the active account name with this command:

gcloud auth list
  3. Click Authorize.

  4. Your output should now look like this:

Output:

ACTIVE: *
ACCOUNT: student-01-xxxxxxxxxxxx@qwiklabs.net

To set the active account, run:
$ gcloud config set account `ACCOUNT`
  5. (Optional) You can list the project ID with this command:

gcloud config list project

Output:

[core] project = <project_ID>

Example output:

[core] project = qwiklabs-gcp-44776a13dea667a6

Note: For full documentation of gcloud in Google Cloud, refer to the gcloud CLI overview guide.

Task 1. Set up your database

First, configure the Firestore database that the bookshelf application uses to store its data.

  1. Open the Navigation menu and select Firestore from the list of services.

  2. On the Get Started page, click Select Native Mode.

  3. Click the Select a location dropdown and choose nam5 (United States).

Note: In a production environment, you'd store your data close to the users and services that need it.
  4. Click Create Database.
Note: Wait for your Database to finish deploying before moving to the next step.

Task 2. Deploy the application

Use the Cloud Shell command line to deploy a sample Google App Engine application called bookshelf. This web application generates logs for you to examine.

  1. Clone the source code repository for the bookshelf application:

git clone https://github.com/GoogleCloudPlatform/getting-started-python
  2. Navigate to the cloned repo:

cd getting-started-python/bookshelf
  3. Install dependencies with pip:

virtualenv -p python3 env
source env/bin/activate
pip3 install Flask==2.0.0 \
  google-cloud-firestore==2.1.1 \
  google-cloud-storage==1.38.0 \
  google-cloud-logging==2.3.0 \
  google-cloud-error-reporting==1.1.2 \
  gunicorn==20.1.0 \
  six==1.16.0

Note: These dependencies are pulled from the bookshelf app's requirements.txt file, with slightly modified versions.
  4. Deploy the bookshelf application:

gcloud app deploy
  5. Type Y when prompted to continue. After a few minutes, the app is fully deployed.

Output

Note: For this lab, you can safely disregard this warning: "WARNING: Found incompatible dependencies: grpcio-status 1.45.0 has requirement grpcio>=1.45.0, but you have grpcio 1.44.8."

Note: If you get a NoCredentialsForAccountException error, rerun gcloud app deploy.

Task 3. Viewing and searching logs

  1. Navigate to Cloud Logs Explorer to configure which logs you view.

  2. Select Navigation menu > Logging > Logs Explorer.

The Cloud Logging console has the following features:

  • Resource selector: filters by resource types
  • Log selector: filters to specific log types of the resources selected
  • Severity selector: filters to specific log levels
  • Histogram
  • A search box for text, label, or regular expression searches, as well as advanced filters

Cloud Logging console

Generate logs

Generate logs to view by visiting the Google App Engine web app (bookshelf) you provisioned earlier.

  1. In a new tab, launch the bookshelf application. The URL for your application is:

https://<PROJECT_ID>.uc.r.appspot.com

  2. Replace <PROJECT_ID> with your Project ID from the left panel of this lab.
Note: Optionally, you may copy and paste the complete URL from Cloud Shell (it was output as target url when you launched the App Engine app).
  3. If you see an Internal Server Error, this is because the Datastore Index is not yet ready. Wait a minute and reload your browser.

Expected Result:

App Engine Bookshelf

When you see the App Engine Bookshelf in your browser tab, your App Engine application is deployed and verified. Let's generate some logs!

Click Check my progress to verify the objective. Deploy the application

  4. Refresh the browser and click Add book. Then fill out the form like the following and click Save:

Add book form

  5. Return to the Cloud Logs Viewer.

Task 4. Filters

The Logs Explorer provides a variety of basic filters and advanced filters to tailor your search.

Basic filters

  1. Still in the Logs Explorer, enable the Show query toggle.

  2. In the first (Resource) dropdown, select GAE Application > default > All version_id as the service for which you want to view the logs. Click Apply. This displays all the logs for your bookshelf app.

GAE Application menu

  3. In the next (Log name) dropdown, select all the log names and click Apply.

Log name dropdown menu

  4. In the next (Severity) dropdown, click Select multiple, then select all the checkboxes and click Apply.

Select severity level dropdown menu

  5. Your Query builder should look like this:

 Query builder

  6. Click the Run Query button in the top right of the Query builder.

Advanced filters

  1. In the Query builder text area, add the following new line:

protoPayload.latency>=0.01s

This line displays all GAE app logs with latency greater than or equal to 0.01 seconds.

  2. Click Run Query and review the updated list of log entries, which shows page loads of 0.01s or longer.

Updated list of log entries

  3. Remove the filters by clearing all the text from the Query builder text area and click Run Query.
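If you prefer the command line, roughly the same advanced filter can be run from Cloud Shell with gcloud logging read (a sketch; gae_app is the resource type behind the GAE Application selector, and the exact latency comparison syntax is worth verifying against the Logging query language docs):

gcloud logging read 'resource.type="gae_app" AND protoPayload.latency>=0.01s' --limit=5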

Task 5. Log based metrics

Log-based metrics are Cloud Monitoring metrics based on the content of log entries. Therefore, your logs don't just sit around and wait for someone to notice problems; Cloud Monitoring automatically monitors the logs for events and information you define in monitoring metrics. Log-based metrics are also a great way to achieve monitoring of your custom applications. If your application can write logs to a VM's filesystem, you can build monitoring on top of them!

Cloud Logging provides two kinds of user-defined logs-based metrics: Counter and Distribution.

Counter metrics

Counter metrics count the number of log entries matching an advanced logs filter. For example, a metric that counts log entries representing certain types of errors from specific resources. Want to be alerted if a lot of your website visitors are receiving HTTP 500 errors? Counter metrics can help.
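As a sketch of what this looks like outside the console, a counter metric can also be created from Cloud Shell; the metric name and filter below are illustrative, not part of this lab:

# Count HTTP 500 responses from an App Engine app (illustrative name and filter)
gcloud logging metrics create http_500_errors \
  --description="Counts HTTP 500 responses" \
  --log-filter='resource.type="gae_app" AND httpRequest.status=500'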

Distribution metrics

Distribution metrics accumulate numeric data from log entries matching a filter, and perform mathematical calculations against them. A common use for distribution metrics is to track latency patterns/trends over time. As each log entry is received, a latency value is extracted from the log entry and added to the distribution. At regular intervals, the accumulated distribution is written to Cloud Monitoring.
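Distribution metrics can likewise be defined from the command line by passing a LogMetric config file to gcloud (a hedged sketch; the field names below follow the Cloud Logging API's LogMetric resource, so verify them against the current documentation before relying on this):

# Illustrative LogMetric config extracting protoPayload.latency from GAE logs
cat > latency_metric.yaml <<'EOF'
filter: resource.type="gae_app"
valueExtractor: EXTRACT(protoPayload.latency)
metricDescriptor:
  metricKind: DELTA
  valueType: DISTRIBUTION
  unit: s
EOF

gcloud logging metrics create my_latency_metric --config-from-file=latency_metric.yaml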

Task 6. Create a counter metric

In this section, you create a counter metric to count the number of successful website hits - in this case, all logs with HTTP response status = 200.

  1. Still in Logs Explorer, in the Resource selector dropdown, select GAE Application > default > All version_id and click Apply.
  2. In the Log name selector dropdown, select all the log names and click Apply.

Your Query builder should look like this:

Query builder

  3. Click the Run Query button. In the Query results, click the status "200" (in any row that has 200) and select Show matching entries:

Query results and the Show matching entries option highlighted

You'll see:

  • The list displays only the logs with a 200 status.
  • In the Query builder text area, notice that a filter is automatically created with the condition (protoPayload.status=200 OR httpRequest.status=200).

Create a monitoring metric based on your filter:

  1. Click Create Metric to create a monitoring metric based on this filter.
  2. In the Metric Editor, set Metric Type as Counter.
  3. Under the Details section, set the Log metric name to 200responses. Leave all the other fields at their default.
  4. Click CREATE METRIC.
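Optionally, you can confirm from Cloud Shell that the metric now exists and inspect its filter:

gcloud logging metrics describe 200responses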

Click Check my progress to verify the objective. Create a counter metric

  1. Click Logs-based metrics and find your new metric listed under the User-defined metrics section.

  2. View metric details by clicking the three dots on the new metric line > View in Metrics Explorer.

View in Metrics Explorer option highlighted

This opens Cloud Monitoring. Wait for your Cloud Monitoring workspace to build.

When the Cloud Monitoring window opens, your workspace is ready.

  3. In Metrics Explorer:

  • Click the dropdown under Resource & Metrics.
  • Disable show only active resources & metrics.
  • Select GAE Application > Logs-based metric. You should see a list of available logs-based metrics. Select logging/user/200responses and click Apply.

Navigation path to logging/user/200responses highlighted

This metric is ready to monitor and analyze your application's behavior:

Metric explorer page

Note: Don't worry if your graph is currently empty—it will be populated as you continue with the lab.
  4. Save your chart to a dashboard so you can easily check it during the lab.

  • Click Save Chart in the upper right.

  • Select New Dashboard under Dashboard.

  • Name your Dashboard.

  • Click Save.

  5. Click View Dashboard to view the dashboard.

Task 7. Create a distribution metric

In this section, you create a distribution metric to monitor bookshelf application latency.

  1. Return to the Cloud Logs Viewer (Navigation menu > Logging > Logs Explorer). Create a filter to select GAE Application > default > All version_id, All Logs, and All Severity in the Query builder as shown below and click Run query.

  2. Click Create Metric.

  3. In the Metric Editor panel, set the following fields to the values below:

Field            Value
Metric Type      Distribution
Log metric name  latency_metric
Description      latency distribution
Field name       protoPayload.latency

  4. Click Create Metric.

Click Check my progress to verify the objective. Create a distribution metric

  1. Verify the latency metric is created in Logs-based metrics:

User-defined metrics

  2. Generate more logs. Refresh the bookshelf application multiple times and add another book. Give the metric a minute or two to catch up and accumulate the data points.

  3. Click View in Metrics Explorer by selecting the option in the vertical ellipsis menu next to the metric:

View in Metrics Explorer option highlighted

  4. In Metrics Explorer:

  • Click the dropdown under Resource & Metrics.
  • Select GAE Application > Logs-based metric > logging/user/latency_metric and click Apply.

Metrics Explorer and the selection GAE Application > Logs-based metric > logging/user/latency_metric highlighted

  5. Optional: Save this chart to your Dashboard and/or check out the Dashboard to see if the chart you previously saved shows data for 200 responses.

Task 8. View log metrics in the Cloud Monitoring Console

The Cloud Monitoring Overview window provides a monitoring resource overview.

  1. From the left menu, select Dashboards. Click your previously created Dashboard.

Dashboard displaying chart data

Cloud Monitoring displays the chart data in one of four formats: Line, Stacked Bar, Stacked Area, or Heatmap. To specify the format:

  2. Click the vertical ellipsis for one of the charts, then click Edit.

Edit Dashboard option highlighted

  3. In the dropdown in the upper right, select the Stacked Area format.

Stacked Area chart format

Try each of the four views to see which one best represents your latency metric.

  4. Experiment with the other charts. Challenge: can you edit or add one of the charts you made in Metrics Explorer?

Task 9. Audit logging

Google Cloud provides auditing of all Google Cloud resources by default. The audit logs answer the question "Who did what, when?" Let's look at audit logging, starting by creating a new Compute Engine virtual machine (VM). Launching a VM is an example of an audited privileged activity, so it generates logs.

Launch a VM to generate audit log events

  1. In the Cloud Console, select Navigation menu > Compute Engine > VM instances.

Wait for the Compute Engine service to initialize.

  2. Click Create Instance.

  3. Set the following fields to the values below; leave all others at their defaults.

Field         Value
Series        N1
Machine Type  g1-small (1 vCPU, 1.7 GB memory)
Firewall      Select Allow HTTP traffic

  4. Click Create.
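For reference, the console steps above roughly correspond to a single gcloud command (a sketch; the instance name, zone, and network tag are placeholders you would choose yourself, and the http-server tag only opens HTTP traffic if a matching firewall rule exists):

gcloud compute instances create audit-demo-vm \
  --zone=us-central1-a \
  --machine-type=g1-small \
  --tags=http-server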

Click Check my progress to verify the objective. Launch a VM to generate audit log events

Viewing audit logs in Activity Viewer

Activity Viewer, on the main Google Cloud Dashboard, provides a quick view into audit logs.

  1. Return to the Google Cloud Dashboard by clicking Navigation Menu > Cloud overview > Dashboard.

  2. Switch to the Activity tab.

Activity tab

  3. View the recent audit log entries; several entries related to creating the VM appear at the top.

Audit log entries

In the screenshot above, notice the four log entries documenting your creation of the VM and the HTTP firewall rule you associated with it.

  4. Click different rows for a few minutes to see what they tell you. Do you recognize any of the previous actions you have taken during this lab?

Viewing audit logs in Cloud Logs Viewer

In Cloud Logs Viewer, you can view the same Audit log entries as in Activity Viewer. Logs Viewer is much more versatile, allowing advanced filters and other log management functionality.

  1. From the Cloud Console, return to Cloud Logs Viewer (Navigation menu > Logging > Logs Explorer).
  2. In the Resource selector, select VM Instance > All instance_id and click Apply:

Resource selector

  3. In the Log Name selector dropdown, select activity under CLOUD AUDIT, and click Apply:

Log Name selector dropdown menu and Cloud Audit > activity highlighted

  4. Click Run Query in the top right of the Query builder and view the two audit log entries that correspond to the Create VM and Completed: Create VM entries you saw in the Activity Viewer.

  5. Expand the Query Preview to look at all audit logs for all Google Cloud services. Remove line 1 to remove the gce_instance filter, then click Run Query.

You are now viewing all activities performed in Google Cloud by any user.

  6. In any log row, click your Username (email) and click Show matching entries.

Notice in the Query Preview that this adds a new filter row to the advanced filter and limits the result set to actions performed by you.
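The same audit query can be run from Cloud Shell (a sketch; substitute your lab Username for the placeholder):

gcloud logging read 'logName:"cloudaudit.googleapis.com%2Factivity" AND protoPayload.authenticationInfo.principalEmail="<your_qwiklabs_username_email>"' --limit=5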

Task 10. Exporting logs

Cloud Logging retains logs for 30 days, after which they are deleted. To retain logs longer, you should export them to another storage system, or "sink", such as BigQuery. Cloud Logging allows you to set up automated exporting jobs so that all logs will automatically be exported. Logs may then be further analyzed with the features of your chosen sink.

Creating an export job

Set up an export job to send all audit logs to BigQuery for long-term storage and analysis.

  1. In the Cloud Logs Explorer window, add resource.type="gce_instance" on the first line of the Query builder.

  2. Remove line 3 from the Query builder and click Run Query, so that you are viewing all audit logs for your Google Cloud Project:

Cloud Logs Explorer window

  3. Click the More Actions dropdown in the top right of the Query results section. Click Create Sink and set the following fields to the values below.

Field                    Value
Sink Name                AuditLogs (then click Next)
Select Sink Service      BigQuery dataset
Select BigQuery dataset  Create new BigQuery dataset, then name the new dataset AuditLogs and click Create Dataset

  4. Click Create Sink.
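For reference, an equivalent sink can be created from Cloud Shell once the dataset exists (a sketch; the filter is illustrative, and you may also need to grant the sink's writer identity access to the dataset):

# Create the destination dataset, then the sink (illustrative filter)
bq mk --dataset $GOOGLE_CLOUD_PROJECT:AuditLogs
gcloud logging sinks create AuditLogs \
  bigquery.googleapis.com/projects/$GOOGLE_CLOUD_PROJECT/datasets/AuditLogs \
  --log-filter='logName:"cloudaudit.googleapis.com"'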

Click Check my progress to verify the objective. Exporting logs

Viewing audit logs in BigQuery

  1. Launch BigQuery. Select Navigation menu > BigQuery.
  2. Click Done to close the Welcome banner.
  3. In the left menu, click the arrow next to your Project ID to expand your Google Cloud project name to see your new AuditLogs dataset:

AuditLogs dataset in the Google Cloud project

Notice that AuditLogs has no tables under it yet; log exporting begins sending data to the sink only after the export job is created. Generate some audit log entries, which will create a table in the sink and add rows to it.

  4. Return to the VM instances window (Navigation menu > Compute Engine > VM instances).

  5. Click your Compute Engine VM instance to view its details.

  6. Click Edit at the top to make two small changes to the VM:

  • Check the checkbox for Enable connecting to serial ports.

  • Scroll down and check the checkbox to Allow HTTPS traffic.

  7. Click Save.

Go to the Activity tab on the main Google Cloud Dashboard. You should see several Audit Log entries, including one named "Set metadata on VM", another named "Create firewall rule" and others related to your VM changes. You'll also see an event named "Create Table" indicating that the BigQuery sink was created.

After a minute or so (you may need to refresh the page) you'll see Audit Log entries indicating that the BigQuery table has been updated with the new Audit Log entries you just generated by editing the VM. Look at the timestamps to recognize all of the log entries related to your VM edit action and the associated BigQuery capture.

Click Check my progress to verify the objective. Viewing audit logs in BigQuery

  1. Return to the BigQuery console (Navigation menu > BigQuery) and expand the AuditLogs dataset. You might need to refresh the page. You should see that a new cloudaudit table has been created in the dataset. Click the new table:

cloudaudit table in the AuditLogs dataset

Use BigQuery to explore the audit logs.

  2. Click the new cloudaudit table, then click QUERY > In new tab:

Query Table icon highlighted

  3. Copy the following code into the Query Editor:

SELECT *
FROM `<your-project-ID>.AuditLogs.<your_audit_log_table_name>`
LIMIT 1000

After a few seconds the query completes and you see the Audit Log entries in the bottom Results pane. There are many columns, some of which are nested.

  4. Click anywhere in the Results pane, then use the arrow keys to scroll right and left. Look around a bit; audit logs are very detailed!

Now define a narrower query to view just a summary of each audit entry.

  5. Copy the following code into the Query Editor:

SELECT
  timestamp,
  resource.type,
  protopayload_auditlog.authenticationInfo.principalEmail,
  protopayload_auditlog.methodName
FROM
  `<your-project-ID>.AuditLogs.<your_audit_log_table_name>`
WHERE
  protopayload_auditlog.authenticationInfo.principalEmail = "<your_qwiklabs_username_email>"
LIMIT 1000
  6. Replace the following parameters with the values below:

Parameter                       Value
<your-project-ID>               Project ID (found in the left panel of the lab)
<your_audit_log_table_name>     Audit log table name (found under AuditLogs in the BigQuery Resources section)
<your_qwiklabs_username_email>  Username (found in the left panel of the lab)

  7. Click Run to run the query. You should see a smaller set of columns, limited to actions you performed in Google Cloud. Your results should be very similar to the following:

Query results table

This simple query is just one example of using BigQuery to generate custom log output. You can construct any number of SQL queries to analyze your audit logs as you see fit.
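The same SQL can also be run from Cloud Shell with the bq tool (substitute your own project ID and table name as before):

bq query --use_legacy_sql=false \
'SELECT timestamp, protopayload_auditlog.methodName
FROM `<your-project-ID>.AuditLogs.<your_audit_log_table_name>`
LIMIT 10'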

Congratulations!

This concludes the hands-on lab, Fundamentals of Cloud Logging. You learned what Cloud Logging is and how to use it.

Finish your quest

This self-paced lab is part of the Cloud Logging quest. A quest is a series of related labs that form a learning path. Completing this quest earns you a badge to recognize your achievement. You can make your badge or badges public and link to them in your online resume or social media account. Enroll in this quest and get immediate completion credit. Refer to the Google Cloud Skills Boost catalog for all available quests.

Next steps / Learn more

Questions about Cloud Monitoring? See Cloud Monitoring Documentation.

See what else you can do with BigQuery.

For more information on advanced filters and the various fields that you can use within your filter criteria, see the Logging query language documentation.

Google Cloud training and certification

...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.

Manual Last Updated January 18, 2023
Lab Last Tested January 18, 2023

Copyright 2023 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.