Export Data from Google Earth Engine to BigQuery


Lab · 1 hour · 1 Credit · Introductory

GSP1180


Overview

Google Earth Engine and BigQuery share the goal of making large-scale data processing accessible and usable by a wider range of people and applications. Google Earth Engine is a planetary-scale platform for Earth science data and analysis that is powered by Google's cloud infrastructure. BigQuery is a serverless and cost-effective enterprise data warehouse that works across clouds and scales with your data. Google Earth Engine workflows often focus on image (raster) processing, whereas BigQuery is optimized for processing large tabular datasets (including vector data). By combining Google Earth Engine data with BigQuery data sources, you get a more complete picture of a particular phenomenon across time and space.

Exporting data from Google Earth Engine to BigQuery lets you use BigQuery's powerful analysis tools to extract insights from Google Earth Engine data and to share that data, in an accessible way, with users who prefer SQL. This lab focuses on how to export data from Google Earth Engine to BigQuery, while providing a real-world example of using Google's geospatial tools to identify flooded roads in Lancaster, England after a heavy rain event in November 2017.

In this lab, you learn how to export data extracted from satellite data in Google Earth Engine to BigQuery, combine exported data with public datasets in BigQuery using spatial queries, and create a quick visualization of the results in Looker Studio.

Overview of Earth Engine to BigQuery to Looker Studio workflow

What you'll do

  • Create a BigQuery dataset to store exported data.
  • Analyze and convert pixel-based data to vector data in Google Earth Engine.
  • Export data from Google Earth Engine to BigQuery.
  • Run a spatial query in BigQuery that combines the exported data with public datasets.
  • Create a quick visualization of the spatial query results in Looker Studio.

Setup and requirements

Before you click the Start Lab button

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.

This hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito or private browser window to run this lab. This prevents any conflicts between your personal account and the Student account, which could result in extra charges being incurred on your personal account.
  • Time to complete the lab. Remember: once you start, you cannot pause a lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab to avoid extra charges to your account.

How to start your lab and sign in to the Google Cloud console

  1. Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is the Lab Details panel with the following:

    • The Open Google Cloud console button
    • Time remaining
    • The temporary credentials that you must use for this lab
    • Other information, if needed, to step through this lab
  2. Click Open Google Cloud console (or right-click and select Open Link in Incognito Window if you are running the Chrome browser).

    The lab spins up resources, and then opens another tab that shows the Sign in page.

    Tip: Arrange the tabs in separate windows, side-by-side.

    Note: If you see the Choose an account dialog, click Use Another Account.
  3. If necessary, copy the Username below and paste it into the Sign in dialog.

    {{{user_0.username | "Username"}}}

    You can also find the Username in the Lab Details panel.

  4. Click Next.

  5. Copy the Password below and paste it into the Welcome dialog.

    {{{user_0.password | "Password"}}}

    You can also find the Password in the Lab Details panel.

  6. Click Next.

    Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.
  7. Click through the subsequent pages:

    • Accept the terms and conditions.
    • Do not add recovery options or two-factor authentication (because this is a temporary account).
    • Do not sign up for free trials.

After a few moments, the Google Cloud console opens in this tab.

Note: To view a menu with a list of Google Cloud products and services, click the Navigation menu at the top-left.

Task 1. Enable the Google Earth Engine API

  1. In the Google Cloud console, enter Google Earth Engine API in the top search bar.

  2. Click on the result for Google Earth Engine API under Marketplace.

  3. Click Enable.

Click Check my progress to verify the objective. Enable the Google Earth Engine API.

Task 2. Create a BigQuery dataset to store exported data

In BigQuery, tables must belong to a dataset, so you need to create at least one dataset to load data into BigQuery.

In this task, you create a new BigQuery dataset to store the data to be exported from Google Earth Engine in a later task.
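
The steps below use the Cloud console. For reference, the same dataset could also be created with a SQL DDL statement in the BigQuery query editor; the following is a minimal sketch, assuming the default dataset settings are acceptable for this lab:

-- Minimal sketch (assumption): create the ee_to_bq dataset with default settings.
-- The console steps below achieve the same result.
CREATE SCHEMA IF NOT EXISTS ee_to_bq;

Either way, the result is a dataset named ee_to_bq in your lab project.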

Open the BigQuery console

  1. In the Google Cloud Console, select Navigation menu > BigQuery.

The Welcome to BigQuery in the Cloud Console message box opens. This message box provides a link to the quickstart guide and the release notes.

  2. Click Done.

The BigQuery console opens.

Create the dataset

  1. In the Explorer pane, next to your project id, click on View actions (view actions icon), then select Create dataset.

Highlighted Create dataset option in the Explorer section

  2. Set Dataset ID to ee_to_bq.

Leave all other fields at the default settings.

  3. Click Create dataset.

Click Check my progress to verify the objective. Create a BigQuery dataset to store exported data.

Task 3. Identify flooded areas in Google Earth Engine

The Google Earth Engine Data Catalog contains the Copernicus Sentinel Synthetic Aperture Radar collection. This public dataset is composed of radar images that measure how surfaces scatter radar signals back to the satellite's sensor. Standing bodies of water act like mirrors for these signals, reflecting the satellite's radar pulses away rather than scattering them back to the imaging sensor. Since most natural surfaces do not have this property, you can differentiate standing bodies of water from their surroundings by looking for "dark" patches in the images (that is, areas with low backscatter values).

In this task, you write and execute a script incrementally in the Google Earth Engine Code Editor to identify flooded pixels around Lancaster, England after a heavy rain event in November 2017. After you identify flooded pixels, you convert the pixel data to vector polygons, in preparation for exporting the data to BigQuery.

Open the Google Earth Engine Code Editor

In this section, you open the Google Earth Engine Code Editor and connect your active session to the Google Cloud project created for this lab.

  1. In a new Incognito browser tab, navigate to https://code.earthengine.google.com/.

  2. Select the lab username:

  3. Click I'm authorized for an existing cloud project.

  4. For Project, select the current project:

Note: If you do not see the project ID listed, the registration system may not yet have detected the Google Cloud project. Wait three minutes, and then click Refresh under Project to see the project ID listed.
  5. Click Select.
Note: Even though the Google Cloud project is registered with Google Earth Engine, you may see a warning to register the project. You can safely ignore this warning and proceed with the lab.

Select the image collection using a defined AOI

In this section, you add code in the Google Earth Engine Code Editor to select a portion of the Sentinel dataset by defining an area of interest (AOI) around Lancaster, England and using it to filter for Sentinel-1 images that include that AOI.

  1. To select from the image collection using a defined AOI, paste the following code into the Code Editor.
// Define the AOI around Lancaster, England (WGS84 coordinates).
var aoi = ee.Geometry.Polygon(
    [[[-2.92, 54.10], [-2.92, 53.99], [-2.67, 53.99], [-2.67, 54.10]]],
    null, false);

// Select from the Sentinel-1 image collection (log scaling and VV co-polar) and filter by the defined AOI.
var collection = ee.ImageCollection('COPERNICUS/S1_GRD')
    .filterBounds(aoi)
    .filter(ee.Filter.listContains('transmitterReceiverPolarisation', 'VV'))
    .select('VV');

// Center the map around the defined AOI at zoom level 13.
Map.centerObject(aoi, 13);
  2. To execute the code, click Run in the middle menu bar.

Calculate the difference between images (before and after rain)

To identify areas where backscatter values decreased after the heavy rain, you can calculate the difference between two images: one image from before the rain event and one image from after the rain event.

In this section, you expand on the existing code in the Code Editor to filter the image collection by date to identify before and after flood images. You also add code to smooth the data to remove noise and to calculate the difference between the two images.

In this section and subsequent sections, you paste new code after the existing code from the previous section, so that you can build the full script incrementally.

  1. In the Code Editor, leave the existing code from the previous section, and start a new line (after the existing code) to add new code.

  2. To filter the image collection by date and calculate the difference between images, paste the following code into the Code Editor, and click Run.

// Define a radius variable used to smooth the data to remove noise.
var SMOOTHING_RADIUS_METERS = 100;

// Filter by date (before flood) and smooth the image.
var before = collection.filterDate('2017-11-01', '2017-11-17')
    .mosaic()
    .focalMedian(SMOOTHING_RADIUS_METERS, 'circle', 'meters');

// Filter by date (after flood) and smooth the image.
var after = collection.filterDate('2017-11-18', '2017-11-23')
    .mosaic()
    .focalMedian(SMOOTHING_RADIUS_METERS, 'circle', 'meters');

// Calculate the difference between before and after rain images.
var diffSmoothed = after.subtract(before);

// Use a threshold on the smoothed radar intensities to identify areas with standing water.
var DIFF_THRESHOLD_DB = -3;
var diffThresholded = diffSmoothed.lt(DIFF_THRESHOLD_DB);

// Display the smoothed difference image on the map.
Map.addLayer(diffSmoothed, {min: -10, max: 10}, 'Difference');

Remove persistent surface water to highlight flooded pixels

In this section, you use the Global Surface Water dataset to remove persistent surface water (such as lakes and rivers) from the final image.

  1. In the Code Editor, leave the existing code from the previous section, and start a new line (after the existing code) to add new code.

  2. To remove persistent surface water from the results, paste the following code into the Code Editor, and click Run.

// Identify and remove global surface water (lakes, rivers, etc).
var jrcData0 = ee.Image('JRC/GSW1_0/Metadata')
    .select('total_obs')
    .lte(0);

// Define a mask that removes persistent water (water present more than 50% of the time).
var waterMask = ee.Image('JRC/GSW1_0/GlobalSurfaceWater')
    .select('occurrence')
    .unmask(0)
    .max(jrcData0)
    .lt(50);

// Create the final flooded image with persistent water removed.
var floodedPixels = diffThresholded.updateMask(waterMask);

// Display the final flooded image pixels with persistent water removed.
Map.addLayer(floodedPixels, {min: -10, max: 10}, 'Difference');

Convert flooded pixel data to vector

In this section, you convert the flooded pixels to vector polygons, in preparation for exporting the results to BigQuery.

  1. In the Code Editor, leave the existing code from the previous section, and start a new line (after the existing code) to add new code.

  2. To convert the pixels to vector polygons, paste the following code into the Code Editor, and click Run.

// Convert patches of pixels to polygons (vectors).
var vectors = floodedPixels.reduceToVectors({
  geometry: aoi,
  scale: 10,
  geometryType: 'polygon',
  eightConnected: false // only connect if pixels share an edge
});

// Eliminate large features in the dataset.
var MAX_AREA = 500 * 1000; // units in m^2
vectors = vectors.map(function (f) {
  return f.set('area', f.geometry().area(10));
}).filter(ee.Filter.lt("area", MAX_AREA));

// Display the final flooded vector polygons on the map.
Map.addLayer(vectors, {color: 'blue'}, 'Flooded areas');

Task 4. Export vector data from Google Earth Engine to BigQuery

To export vector data from Google Earth Engine to BigQuery, you can use the Export.table.toBigQuery function, specifying the full path to the destination table: project ID, BigQuery dataset name, and BigQuery table name.

In this task, you export the flood areas (the converted vector polygons) to a new table named flooded_areas in the BigQuery dataset that you created in Task 2.

  1. In the Code Editor, leave the existing code from Task 3, and start a new line (after the existing code) to add new code.

  2. To create the export job, paste the following code into the Code Editor, and click Run.

// This function already includes the current project, existing BigQuery dataset name, and BigQuery table name to be created.
Export.table.toBigQuery({
  collection: vectors,
  description: 'ee2bq_export_polygons',
  table: '{{{project_0.project_id | Project ID}}}.ee_to_bq.flooded_areas'
});
  3. In the tab named Tasks, click Run next to the unsubmitted task named ee2bq_export_polygons to open the export job.

  4. Click Run to execute the export job.

  5. After the task has executed successfully, click on the ee2bq_export_polygons job under Submitted tasks to see the job details.

  6. Click Open in BigQuery to explore the exported table.

Click Check my progress to verify the objective. Export vector data from Google Earth Engine to BigQuery.
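
Optionally, you can sanity-check the export with a quick query in the BigQuery query editor. The following is a minimal sketch; it assumes the export completed and relies on the geo geometry column written by the export (the same column queried in the next task) and the area property set in Task 3:

-- Preview a few exported flood polygons.
-- "geo" holds the polygon geometry; "area" is the property set in Task 3 (units in m^2).
SELECT area, geo
FROM `ee_to_bq.flooded_areas`
LIMIT 10;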

Task 5. Run a spatial query in BigQuery to identify flooded road segments

Through the Google Cloud Public Dataset Program, you can query many publicly available datasets in BigQuery such as OpenStreetMap (OSM).

In this task, you execute SQL in BigQuery that combines the exported flooded areas with roads from the OSM planet_ways table to identify the highways that were flooded during the rain event.

  1. In BigQuery, click on the Query tab (magnifying glass icon) to open a new query window.

  2. To select all of the highways that intersect the flooded areas, paste the following query, and click Run.

SELECT
  *
FROM (
  -- query 1 - Select the polygons from the exported dataset as flood_poly.
  SELECT geo AS flood_poly
  FROM `ee_to_bq.flooded_areas`) t1
JOIN (
  -- query 2 - Select the highways in OpenStreetMap data as road_geometry.
  SELECT id, version, changeset, osm_timestamp, geometry AS road_geometry
  FROM `bigquery-public-data.geo_openstreetmap.planet_ways` planet_ways,
    planet_ways.all_tags AS all_tags
  WHERE all_tags.key = 'highway'
)
ON
  -- Intersect the flooded areas and highway polygons.
  ST_INTERSECTS(flood_poly, road_geometry)

Click Check my progress to verify the objective. Run a spatial query in BigQuery to identify flooded road segments.

Task 6. Create a quick visualization of the spatial query results in Looker Studio

  1. In the Query results window, click Explore data, and select Explore with Looker Studio.

The Welcome to Google Looker Studio message box opens.

  2. Click Get Started.

Notice that default visualizations are created for the data.

  3. To remove the default visualizations, right-click on each, and select Delete.

  4. Click Add a chart on the top menu bar.

  5. Under Google Maps, select Line map.

  6. To add the line map to the canvas, click in the center of the canvas space.

If desired, you can drag the corners to move and stretch the visualization to fill the canvas.

Notice the message stating "Chart configuration incomplete". You address this issue in the next step.

  7. For Setup > Geospatial field, click on flood_poly, and select road_geometry.

If you do not see the Setup panel, click on the line map to activate the panel.

  8. For Setup > Fields > Location, click on Invalid dimension, and select id.

  9. For Style > Line Map Layer, click on the Line color icon (pencil), and select any color such as red.

The flooded road segments are visible in red on the background street map.

Map of flooded road segments around Lancaster, England

You can finalize your visualization by exploring other style options, exporting to PDF (File > Download > PDF), or clicking Save and share to create a shareable link.

To learn more about finalizing and sharing visualizations, review the examples and tutorials in Looker Studio Help, or complete other hands-on Looker Studio labs on Cloud Skills Boost.

Congratulations!

You learned how to export data from Google Earth Engine to BigQuery, combine exported data with public datasets in BigQuery using spatial queries, and create a quick visualization of the results in Looker Studio.

Google Cloud training and certification

...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.

Manual Last Updated November 21, 2023

Lab Last Tested November 21, 2023

Copyright 2023 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.