Data Lake Modernization on Google Cloud: Migrate Workflows
Welcome to Migrate Workflows, where we discuss how to migrate Spark and Hadoop tasks and workflows to Google Cloud.
Course Info
Objectives
- Describe options for migrating Spark to Google Cloud
- Migrate Spark jobs directly to Dataproc ("lift and shift")
- Optimize Spark jobs to run on Google Cloud
- Refactor Spark jobs to use native Google Cloud services
- Convert a Spark job to a serverless implementation using Cloud Functions
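To give a flavor of the "lift and shift" approach listed above, an existing Spark job can often be resubmitted to a Dataproc cluster with little or no code change. A minimal sketch using the gcloud CLI — the cluster name, region, class, and jar path are illustrative placeholders, not values from the course:

```shell
# Submit an existing Spark jar to a running Dataproc cluster ("lift and shift").
# Cluster name, region, main class, and jar path are placeholders.
gcloud dataproc jobs submit spark \
  --cluster=example-cluster \
  --region=us-central1 \
  --class=org.apache.spark.examples.SparkPi \
  --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
  -- 1000
```

The job itself is unchanged; only the submission target moves from an on-premises YARN cluster to Dataproc.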
Prerequisites
Required:
- Have completed the Data Engineering on Google Cloud training
- Be a Google Cloud Certified Professional Data Engineer, or have equivalent data engineering expertise
- Have access to Cloud Connect – Partners
Recommended:
- Experience building data processing pipelines
- Experience with Apache Beam and Apache Hadoop
- Java or Python programming expertise
Organizational requirements: The Cloud Partner organization must have previously implemented at least one data warehouse solution on any data warehouse platform.
Audience
Data Warehouse Deployment Engineers, Data Warehouse Consultants, Data Warehouse Architects, Technical Project Leads, Technical Project Managers, and Data/Business Analysts.
Available languages
English
What should I do after completing this course?
After completing this course, you can explore additional content in your learning path or browse the learning catalog.
What badge can I earn?
After completing the course, you will earn a completion badge. Badges can be viewed on your profile and shared on your social networks.
Interested in taking this course with one of our on-demand partners?
Jelajahi konten Google Cloud di Coursera dan Pluralsight.
Prefer learning with an instructor?