

Data Lake Modernization on Google Cloud: Migrate Workflows

Skills: Data Lake Modernization, Cloud Dataproc, Data Management, Apache Spark
Duration: 7 hours 15 minutes. Level: Intermediate.

Welcome to Migrate Workflows, where we discuss how to migrate Spark and Hadoop tasks and workflows to Google Cloud.

Earn a badge today!

Course Info
Objectives
  • Describe options for migrating Spark to Google Cloud
  • Migrate Spark jobs directly to Dataproc (“Lift and Shift”), as illustrated in the sketch after this list
  • Optimize Spark jobs to run on Google Cloud
  • Refactor Spark jobs to use native Google Cloud services
  • Convert a Spark job to a serverless implementation using Cloud Functions
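
For readers who want a concrete picture of the “Lift and Shift” objective above: migrating a Spark job directly to Dataproc usually means submitting the existing job, unchanged, to a Dataproc cluster. The following is a minimal illustrative sketch using the google-cloud-dataproc Python client; it is not course material, and the project ID, region, cluster name, JAR path, and main class are hypothetical placeholders.

    from google.cloud import dataproc_v1

    PROJECT_ID = "my-project"          # hypothetical project ID
    REGION = "us-central1"             # hypothetical region
    CLUSTER_NAME = "migrated-cluster"  # hypothetical Dataproc cluster

    # The Job Controller client must point at the regional Dataproc endpoint.
    job_client = dataproc_v1.JobControllerClient(
        client_options={"api_endpoint": f"{REGION}-dataproc.googleapis.com:443"}
    )

    # Describe the existing Spark job exactly as it ran before migration,
    # with its JAR and data paths now stored in Cloud Storage.
    job = {
        "placement": {"cluster_name": CLUSTER_NAME},
        "spark_job": {
            "main_class": "org.example.WordCount",                  # hypothetical main class
            "jar_file_uris": ["gs://my-bucket/jobs/wordcount.jar"],  # hypothetical JAR location
            "args": ["gs://my-bucket/input/", "gs://my-bucket/output/"],
        },
    }

    # Submit the job and block until it finishes.
    operation = job_client.submit_job_as_operation(
        request={"project_id": PROJECT_ID, "region": REGION, "job": job}
    )
    result = operation.result()
    print(f"Job finished with state: {result.status.state.name}")

The same submission pattern accepts pyspark_job and hadoop_job payloads, so existing PySpark or Hadoop MapReduce workloads can typically be lifted over in the same way.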
Prerequisites
Required:
  • Completion of the Data Engineering on Google Cloud training
  • Google Cloud Certified Professional Data Engineer certification, or equivalent data engineering expertise
  • Access to Cloud Connect – Partners

Recommended:
  • Experience building data processing pipelines
  • Experience with Apache Beam and Apache Hadoop
  • Java or Python programming expertise

Organizational requirements:
  • The Cloud Partner organization must have previously implemented at least one Data Warehouse solution on any Data Warehouse platform.
Audience
Data Warehouse Deployment Engineers, Data Warehouse Consultants, Data Warehouse Architects, Technical Project Leads, Technical Project Managers, and Data/Business Analysts.
Available languages
English

The Power of Challenge Labs

Now you can fast-track your way to a skill badge without taking the entire course. If you're confident in your skills, jump straight to the challenge lab.
