Stephen Hwang
Member since 2021
This is an introductory microlearning course aimed at explaining what generative AI is, how it is used, and how it differs from traditional machine learning methods. It also covers Google tools that can help you develop your own generative AI applications.
This course is part 1 of a 3-course series on Serverless Data Processing with Dataflow. In this first course, we start with a refresher of what Apache Beam is and its relationship with Dataflow. Next, we talk about the Apache Beam vision and the benefits of the Beam Portability framework. The Beam Portability framework achieves the vision that a developer can use their favorite programming language with their preferred execution backend. We then show you how Dataflow allows you to separate compute and storage while saving money, and how Identity and Access Management (IAM) tools interact with your Dataflow pipelines. Lastly, we look at how to implement the right security model for your use case on Dataflow.
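To make the Beam-and-Dataflow relationship concrete, here is a minimal sketch of an Apache Beam pipeline in Python; the transforms and flags are illustrative, not taken from the course. The same code runs locally on the DirectRunner or at scale on Dataflow, with only the runner option changing, which is the portability idea in action.

```python
# Minimal Apache Beam pipeline (Python SDK), a sketch for illustration.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# PipelineOptions picks up command-line flags, e.g.
#   --runner=DirectRunner            (local execution)
#   --runner=DataflowRunner --project=... --region=... --temp_location=gs://...
options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "Create" >> beam.Create(["alpha", "beta", "gamma"])
        | "Uppercase" >> beam.Map(str.upper)
        | "Print" >> beam.Map(print)
    )
```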
Processing streaming data is becoming increasingly popular, as streaming enables businesses to get real-time metrics on business operations. This course covers how to build streaming data pipelines on Google Cloud. It describes how Pub/Sub handles incoming streaming data, how to apply aggregations and transformations to streaming data using Dataflow, and how to store processed records in BigQuery or Bigtable for analysis. Learners get hands-on experience building streaming data pipeline components on Google Cloud by using QwikLabs.
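The shape of such a pipeline can be sketched in a few lines of Beam Python: read from Pub/Sub, aggregate over fixed windows in Dataflow, and write the results to BigQuery. The topic, table, and schema below are hypothetical placeholders, not names from the course labs.

```python
# Sketch of a streaming pipeline: Pub/Sub -> windowed aggregation -> BigQuery.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")  # placeholder topic
        | "Decode" >> beam.Map(lambda b: b.decode("utf-8"))
        | "Window" >> beam.WindowInto(window.FixedWindows(60))  # 1-minute fixed windows
        | "PairWithOne" >> beam.Map(lambda msg: (msg, 1))
        | "CountPerKey" >> beam.CombinePerKey(sum)
        | "ToRow" >> beam.Map(lambda kv: {"message": kv[0], "count": kv[1]})
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:my_dataset.event_counts",          # placeholder table
            schema="message:STRING,count:INTEGER",
        )
    )
```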
Incorporating machine learning into data pipelines increases the ability to extract insights from data. This course covers ways machine learning can be included in data pipelines on Google Cloud. For little to no customization, this course covers AutoML. For more tailored machine learning capabilities, this course introduces Notebooks and BigQuery machine learning (BigQuery ML). Also, this course covers how to productionize machine learning solutions by using Vertex AI.
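As a flavor of the BigQuery ML approach, the sketch below trains a model with a single SQL statement issued from Python. The dataset, table, and column names are hypothetical, and the google-cloud-bigquery client with default credentials is assumed.

```python
# Sketch: training a BigQuery ML model from Python via a CREATE MODEL statement.
from google.cloud import bigquery

client = bigquery.Client()  # assumes application default credentials

create_model_sql = """
CREATE OR REPLACE MODEL `my_dataset.taxi_fare_model`  -- placeholder names
OPTIONS (model_type = 'linear_reg', input_label_cols = ['fare_amount']) AS
SELECT trip_distance, passenger_count, fare_amount
FROM `my_dataset.taxi_trips`
"""

client.query(create_model_sql).result()  # blocks until training completes
```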
The two key components of any data pipeline are data lakes and warehouses. This course highlights use cases for each type of storage and dives into the available data lake and warehouse solutions on Google Cloud in technical detail. It also describes the role of a data engineer and the benefits of a successful data pipeline to business operations, and examines why data engineering should be done in a cloud environment. This is the first course of the Data Engineering on Google Cloud series. After completing this course, enroll in the Building Batch Data Pipelines on Google Cloud course.
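A typical lake-to-warehouse step looks like the sketch below: loading a raw file from a Cloud Storage data lake into a BigQuery warehouse table. The bucket, dataset, and table names are hypothetical placeholders.

```python
# Sketch: load a CSV from a Cloud Storage "lake" into a BigQuery "warehouse" table.
from google.cloud import bigquery

client = bigquery.Client()  # assumes application default credentials

load_job = client.load_table_from_uri(
    "gs://my-bucket/raw/events.csv",    # placeholder lake location
    "my-project.my_dataset.events",     # placeholder warehouse table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # skip the header row
        autodetect=True,       # infer the schema from the file
    ),
)
load_job.result()  # wait for the load to finish
```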
Complete the introductory Prepare Data for ML APIs on Google Cloud skill badge to demonstrate your skills in the following: cleaning data with Dataprep by Trifacta, running data pipelines in Dataflow, creating clusters and running Apache Spark jobs in Dataproc, and calling ML APIs, including the Cloud Natural Language API, Google Cloud Speech-to-Text API, and Video Intelligence API. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services; it tests your ability to apply your knowledge in an interactive, hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.
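Calling one of these ML APIs is a short exercise. The sketch below sends text to the Cloud Natural Language API for sentiment analysis, assuming the google-cloud-language client library and default credentials; the sample text is made up.

```python
# Sketch: sentiment analysis with the Cloud Natural Language API.
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()
document = language_v1.Document(
    content="Google Cloud skill badges are a fun way to learn.",  # sample text
    type_=language_v1.Document.Type.PLAIN_TEXT,
)
sentiment = client.analyze_sentiment(document=document).document_sentiment
print(f"score={sentiment.score:.2f}, magnitude={sentiment.magnitude:.2f}")
```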
This course introduces the Google Cloud big data and machine learning products and services that support the data-to-AI lifecycle. It explores the processes, challenges, and benefits of building a big data pipeline and machine learning models with Vertex AI on Google Cloud.
Google Cloud Fundamentals: Core Infrastructure introduces important concepts and terminology for working with Google Cloud. Through videos and hands-on labs, this course presents and compares many of Google Cloud's compute and storage services, along with important resource and policy management tools.