Configuring and Viewing Audit Logs in Cloud Logging

In this lab, you will investigate Cloud Audit Logs. Cloud Audit Logs maintains two audit logs for each project and organization: Admin Activity and Data Access.
Google Cloud services write audit log entries to these logs to help you answer the questions of "who did what, where, and when" within your Google Cloud projects.
Objectives
In this lab, you will learn how to perform the following tasks:
View audit logs in the Activity page.
View and filter audit logs in Cloud Logging.
Retrieve log entries with gcloud.
Export audit logs.
Setup and requirements
For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.
Sign in to Qwiklabs using an incognito window.
Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
There is no pause feature. You can restart if needed, but you have to start at the beginning.
When ready, click Start lab.
Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.
Click Open Google Console.
Click Use another account and copy/paste credentials for this lab into the prompts.
If you use other credentials, you'll receive errors or incur charges.
Accept the terms and skip the recovery resource page.
Activate Google Cloud Shell
Google Cloud Shell is a virtual machine loaded with development tools. It offers a persistent 5 GB home directory and runs on Google Cloud.
Google Cloud Shell provides command-line access to your Google Cloud resources.
In Cloud console, on the top right toolbar, click the Open Cloud Shell button.
Click Continue.
It takes a few moments to provision and connect to the environment. When you are connected, you are already authenticated, and the project is set to your PROJECT_ID.
gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.
You can list the active account name with this command:

gcloud auth list

You can list the project ID with this command:

gcloud config list project

Example output:

[core]
project = qwiklabs-gcp-44776a13dea667a6
Note: Full documentation of gcloud is available in the gcloud CLI overview guide.
Check project permissions
Before you begin your work on Google Cloud, you need to ensure that your project has the correct permissions within Identity and Access Management (IAM).
In the Google Cloud console, on the Navigation menu (), select IAM & Admin > IAM.
Confirm that the default compute Service Account {project-number}-compute@developer.gserviceaccount.com is present and has the editor role assigned. The account prefix is the project number, which you can find on Navigation menu > Cloud overview > Dashboard.
Note: If the account is not present in IAM or does not have the `editor` role, follow the steps below to assign the required role.
In the Google Cloud console, on the Navigation menu (), click Cloud overview > Dashboard and copy the project number.
On the Navigation menu, select IAM & Admin > IAM, then click Grant access.
For New principals, type {project-number}-compute@developer.gserviceaccount.com. Replace {project-number} with your project number.
For Select a role, select Project (or Basic) > Editor.
Click Save.
Task 1. Enable data access audit logs
In this task, you enable data access audit logs.
Data Access audit logs (except for BigQuery) are disabled by default, so you must first enable them. Logging charges for the volume of log data that exceeds the free monthly logs allotment.
All logs received by Logging count toward the allotment limit, except for the audit logs that are enabled by default: Admin Activity audit logs, System Event logs, and Data Access audit logs from BigQuery only.
If you have not yet activated Cloud Shell, click Activate Cloud Shell () on the Google Cloud Console title bar. If prompted, click Continue.
At the command prompt, run this command to retrieve the current IAM policy for your project and save it as policy.json:
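Assuming the standard Cloud Shell environment variable DEVSHELL_PROJECT_ID (also used later in this lab), a command along these lines retrieves and saves the policy:

gcloud projects get-iam-policy $DEVSHELL_PROJECT_ID --format=json > policy.json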
Click the Open Editor button to view the Cloud Shell code editor.
If an error indicates that the code editor could not be loaded because third-party cookies are disabled, click Open in New Window and switch to the new tab.
In the Cloud Shell code editor, click the policy.json file to expose its contents.
Add the following text to the policy.json file to enable Data Access audit logs for all services. Add it just after the first { and before "bindings": [. (Be careful not to change anything else in the file.)
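The standard auditConfigs block that enables all three Data Access log types for all services looks like this (the trailing comma keeps the JSON valid ahead of the "bindings" key):

"auditConfigs": [
  {
    "service": "allServices",
    "auditLogConfigs": [
      { "logType": "ADMIN_READ" },
      { "logType": "DATA_READ" },
      { "logType": "DATA_WRITE" }
    ]
  }
],

After saving the file, you can apply the updated policy with:

gcloud projects set-iam-policy $DEVSHELL_PROJECT_ID policy.json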
Admin Activity logs contain log entries for API calls or other administrative actions that modify the configuration or metadata of resources. For example, the logs record when VM instances and App Engine applications are created and when permissions are changed.
To view the logs, you must have the Cloud Identity and Access Management roles Logging/Logs Viewer or Project/Viewer.
Admin Activity logs are always enabled, so there is no need to enable them. There is no charge for your Admin Activity audit logs.
Note: You can view audit log entries in the Logs Explorer in Cloud Logging and with the Cloud SDK. You can also export audit log entries to Pub/Sub, BigQuery, or Cloud Storage.
Use the Cloud Logging page
In the Google Cloud console, on the Navigation menu (), click View all products > Observability > Logging > Logs Explorer.
Copy and paste the following in the Query builder field.
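The filter targets the Admin Activity audit log, whose log name also appears in the gcloud command later in this lab; a query along these lines (with your project ID in place of YOUR_PROJECT_ID) displays all of its entries:

logName="projects/YOUR_PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity"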
Locate the log entry indicating that a Cloud Storage bucket was deleted. This entry refers to storage.googleapis.com, which calls the storage.buckets.delete method to delete a bucket. The bucket name is the same as your project ID.
Within that entry, click on the storage.googleapis.com text and select Show matching entries.
Notice that a line was added to the Query preview textbox (located where the Query builder had been) to show only storage events.
You should now see only the cloud storage entries.
Within that entry, click on the storage.buckets.delete text and select Show matching entries.
Notice another line was added to the Query preview textbox and now you can only see storage delete entries.
This technique can be used to easily locate desired events.
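After both refinements, the Query preview should contain something like the following (again with your own project ID):

logName="projects/YOUR_PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity"
protoPayload.serviceName="storage.googleapis.com"
protoPayload.methodName="storage.buckets.delete"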
In the Query results, expand the Cloud Storage delete entry and then expand the protoPayload field.
Expand the authenticationInfo field and notice that you can see the email address of the user that performed this action.
Feel free to explore other fields in the entry.
Use the Cloud SDK
Log entries can also be read using the Cloud SDK command:

gcloud logging read [FILTER]
In the Cloud Shell pane, use this command to retrieve only the audit activity for storage bucket deletion:
Note: If Cloud Shell is disconnected, then click reconnect.
gcloud logging read \
"logName=projects/$DEVSHELL_PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity \
AND protoPayload.serviceName=storage.googleapis.com \
AND protoPayload.methodName=storage.buckets.delete"
Task 4. Export the audit logs
In this task, you export audit logs. Individual audit log entries are kept for a specified length of time and are then deleted. The Cloud Logging Quota Policy explains how long log entries are retained. You cannot otherwise delete or modify audit logs or their entries.
Audit log type    Retention period
Admin Activity    400 days
Data Access       30 days
For longer retention, you can export audit log entries like any other Cloud Logging log entries and keep them for as long as you wish.
Export audit logs
When exporting logs, the current filter will be applied to what is exported.
In Logs Explorer, enter a query in the Query builder that displays all the audit logs (you can do this by deleting all lines in the filter except the first one). Your filter will look like what is shown below.
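For example (your project ID will differ):

logName="projects/YOUR_PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity"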
Click the Create sink button. For Sink name, enter AuditLogsExport. For the sink destination, select BigQuery dataset as the sink service, choose Create new BigQuery dataset, and name the dataset auditlogs_dataset. Click Create sink.
In the left pane, click Log Router. On this page, you should be able to see the AuditLogsExport sink.
To the right of the AuditLogsExport sink, click the button with three dots () and select View sink details.
This will show information about the sink that you created.
Click Cancel when done.
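The same details can also be inspected from the command line; assuming the sink name created above, this Cloud SDK command prints the sink's destination and filter:

gcloud logging sinks describe AuditLogsExport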
Note: You could also export log entries to Pub/Sub or Cloud Storage. Exporting to Pub/Sub can be useful if you want to flow the entries through an ETL process before storing them in a database (Cloud Operations > Pub/Sub > Dataflow > BigQuery/Bigtable). Exporting to Cloud Storage will batch up entries and write them into Cloud Storage objects approximately once an hour.

Note: All future logs will now be exported to BigQuery, and the BigQuery tools can be used to analyze the audit log data. The export does not include existing log entries.
In Cloud Shell, run the following commands to generate some more activity that you will view in the audit logs exported to BigQuery:
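The exact commands are not critical, as long as they create and delete resources that produce Admin Activity entries. A minimal sketch that creates and then deletes a VM and a Cloud Storage bucket (the instance name, zone, and bucket name here are illustrative assumptions):

gcloud compute instances create test-vm1 --zone=us-central1-a --machine-type=e2-medium
gcloud compute instances delete test-vm1 --zone=us-central1-a --quiet
gsutil mb gs://$DEVSHELL_PROJECT_ID-test
gsutil rb gs://$DEVSHELL_PROJECT_ID-test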
In this task, you export logs to a BigQuery dataset. You then analyze the logs using Query editor.
Note: When you export logs to a BigQuery dataset, Cloud Logging creates dated tables to hold the exported log entries. Log entries are placed in tables whose names are based on the entries' log names; for example, Admin Activity entries land in daily tables named cloudaudit_googleapis_com_activity_YYYYMMDD, which the queries below match with a table wildcard.
In the Google Cloud console, in the Navigation menu (), click BigQuery, then click Done.
In the left pane, in the Explorer section, click your project (its name starts with qwiklabs-gcp-). You should see an auditlogs_dataset dataset under it.
Verify that the BigQuery dataset has appropriate permissions to allow the export writer to store log entries. Click on the auditlogs_dataset dataset.
From the Sharing dropdown, select Permissions.
On the Dataset Permissions page, you will see the log sink's service account listed as a BigQuery Data Editor principal. If it is not already listed, you can add the service account under Add Principal and grant it the BigQuery Data Editor role.
Click the Close button to close the Share Dataset screen.
Expand the dataset to see the table with your exported logs. (Click on the expand icon to expand the dataset.)
Click on the table name and take a moment to review the schemas and details of the tables that are being used.
Click the Query button.
In Cloud Shell, run the activity-generation commands from the previous task again to generate more activity that you will view in the audit logs exported to BigQuery.
Delete the text provided in the Query editor window and paste in the query below. This query will return the users that deleted virtual machines in the last 7 days.
#standardSQL
SELECT
timestamp,
resource.labels.instance_id,
protopayload_auditlog.authenticationInfo.principalEmail,
protopayload_auditlog.resourceName,
protopayload_auditlog.methodName
FROM
`auditlogs_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
PARSE_DATE('%Y%m%d', _TABLE_SUFFIX) BETWEEN
DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) AND
CURRENT_DATE()
AND resource.type = "gce_instance"
AND operation.first IS TRUE
AND protopayload_auditlog.methodName = "v1.compute.instances.delete"
ORDER BY
timestamp,
resource.labels.instance_id
LIMIT
1000
Click the Run button. After a couple of seconds you will see each time someone deleted a virtual machine within the past 7 days. You should see two entries, which is the activity you generated in this lab. Remember, BigQuery is only showing activity since the export was created.
Delete the text in the Query editor window and paste in the query below. This query will return the users that deleted storage buckets in the last 7 days.
#standardSQL
SELECT
timestamp,
resource.labels.bucket_name,
protopayload_auditlog.authenticationInfo.principalEmail,
protopayload_auditlog.resourceName,
protopayload_auditlog.methodName
FROM
`auditlogs_dataset.cloudaudit_googleapis_com_activity_*`
WHERE
PARSE_DATE('%Y%m%d', _TABLE_SUFFIX) BETWEEN
DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY) AND
CURRENT_DATE()
AND resource.type = "gcs_bucket"
AND protopayload_auditlog.methodName = "storage.buckets.delete"
ORDER BY
timestamp,
resource.labels.bucket_name
LIMIT
1000
Click the Run button. After a couple of seconds, you will see entries showing each time someone deleted a storage bucket within the past 7 days.
Note: As you can see, the ability to analyze audit logs in BigQuery is very powerful. In this activity, you viewed just two examples of querying audit logs.
Click Check my progress to verify the objective.
Export audit logs and use BigQuery to analyze logs
Congratulations!
In this lab, you have done the following:
Viewed audit logs in the Activity page.
Viewed and filtered audit logs in Cloud Logging.
Retrieved log entries with gcloud.
Exported audit logs.
End your lab
When you have completed your lab, click End Lab. Google Cloud Skills Boost removes the resources you’ve used and cleans the account for you.
You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.
The number of stars indicates the following:
1 star = Very dissatisfied
2 stars = Dissatisfied
3 stars = Neutral
4 stars = Satisfied
5 stars = Very satisfied
You can close the dialog box if you don't want to provide feedback.
For feedback, suggestions, or corrections, please use the Support tab.
Copyright 2025 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.