Guneet Singh
Member since 2021
Silver League
7400 points
This course is part 1 of a 3-course series on Serverless Data Processing with Dataflow. In this first course, we start with a refresher of what Apache Beam is and its relationship with Dataflow. Next, we talk about the Apache Beam vision and the benefits of the Beam Portability framework. The Beam Portability framework achieves the vision that a developer can use their favorite programming language with their preferred execution backend. We then show you how Dataflow allows you to separate compute and storage while saving money, and how identity and access management (IAM) tools interact with your Dataflow pipelines. Lastly, we look at how to implement the right security model for your use case on Dataflow.
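As a rough illustration of the portability idea described above (a sketch, not material from the course itself), here is a minimal Apache Beam pipeline in Python; switching the runner option, for example from DirectRunner to DataflowRunner with project and region settings, is what lets the same code execute on a different backend.

```python
# Minimal word-count-style Apache Beam pipeline (illustrative sketch).
# The same code runs locally with DirectRunner or on Dataflow by changing
# the pipeline options.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # DirectRunner executes locally; pass runner="DataflowRunner" plus
    # project/region/temp_location options to run the pipeline on Dataflow.
    options = PipelineOptions(runner="DirectRunner")
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Create" >> beam.Create(["serverless data processing with dataflow"])
            | "Split" >> beam.FlatMap(str.split)
            | "PairWithOne" >> beam.Map(lambda word: (word, 1))
            | "CountPerWord" >> beam.CombinePerKey(sum)
            | "Print" >> beam.Map(print)
        )


if __name__ == "__main__":
    run()
```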
Complete the introductory Build LookML Objects in Looker skill badge to demonstrate skills in the following: building new dimensions and measures, views, and derived tables; setting measure filters and types based on requirements; updating dimensions and measures; building and refining Explores; joining views to existing Explores; and deciding which LookML objects to create based on business requirements.
Complete the intermediate Engineer Data for Predictive Modeling with BigQuery ML skill badge to demonstrate skills in the following: building data transformation pipelines to BigQuery using Dataprep by Trifacta; using Cloud Storage, Dataflow, and BigQuery to build extract, transform, and load (ETL) workflows; and building machine learning models using BigQuery ML. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and it tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a digital badge that you can share with your network.
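To give a flavor of what building a model with BigQuery ML looks like, here is a minimal sketch submitted through the BigQuery Python client; the dataset, table, and column names are hypothetical placeholders, not taken from the badge labs.

```python
# Illustrative sketch: training a BigQuery ML model from Python.
# The dataset, table, and column names below are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

create_model_sql = """
CREATE OR REPLACE MODEL `my_dataset.taxi_fare_model`
OPTIONS (model_type = 'linear_reg', input_label_cols = ['fare_amount']) AS
SELECT
  trip_distance,
  passenger_count,
  fare_amount
FROM `my_dataset.taxi_trips`
WHERE fare_amount IS NOT NULL
"""

# BigQuery ML training runs entirely inside BigQuery; the Python client
# just submits the SQL job and waits for it to finish.
client.query(create_model_sql).result()
```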
Complete the introductory Prepare Data for ML APIs on Google Cloud skill badge to demonstrate skills in the following: cleaning data with Dataprep by Trifacta, running data pipelines in Dataflow, creating clusters and running Apache Spark jobs in Dataproc, and calling ML APIs including the Cloud Natural Language API, Google Cloud Speech-to-Text API, and Video Intelligence API. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and it tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.
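As a small, hedged example of the kind of ML API call covered here, this sketch sends a short piece of text to the Cloud Natural Language API with the Python client library; the sample text is made up.

```python
# Illustrative sketch: calling the Cloud Natural Language API from Python
# to analyze the sentiment of a short piece of text.
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()
document = language_v1.Document(
    content="Dataflow made this pipeline much easier to run.",
    type_=language_v1.Document.Type.PLAIN_TEXT,
)

# analyze_sentiment returns an overall score (-1.0 to 1.0) and a magnitude.
response = client.analyze_sentiment(request={"document": document})
print(response.document_sentiment.score, response.document_sentiment.magnitude)
```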
Complete the introductory Implement Load Balancing on Compute Engine skill badge to demonstrate skills in the following: writing gcloud commands and using Cloud Shell, creating and deploying virtual machines in Compute Engine, and configuring network and HTTP load balancers. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and it tests your ability to apply your knowledge in an interactive environment. Complete this skill badge, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
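The badge itself works through gcloud commands in Cloud Shell; as a rough Python-side equivalent of the VM-creation step (a sketch only, with placeholder project, zone, and image values), the Compute Engine client library can create an instance like this:

```python
# Illustrative sketch: creating a Compute Engine VM with the Python client
# library. PROJECT_ID and ZONE are placeholders.
from google.cloud import compute_v1

PROJECT_ID = "my-project"
ZONE = "us-central1-a"

boot_disk = compute_v1.AttachedDisk(
    boot=True,
    auto_delete=True,
    initialize_params=compute_v1.AttachedDiskInitializeParams(
        source_image="projects/debian-cloud/global/images/family/debian-12",
        disk_size_gb=10,
    ),
)

instance = compute_v1.Instance(
    name="lb-backend-vm",
    machine_type=f"zones/{ZONE}/machineTypes/e2-micro",
    disks=[boot_disk],
    network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
)

# insert() returns a long-running operation; wait for it to complete.
operation = compute_v1.InstancesClient().insert(
    project=PROJECT_ID, zone=ZONE, instance_resource=instance
)
operation.result()
```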
This is the first of two quests of hands-on labs derived from the exercises in the book Data Science on Google Cloud Platform, 2nd Edition, by Valliappa Lakshmanan, published by O'Reilly Media, Inc. In this first quest, which covers material up through chapter 8, you have the opportunity to practice all aspects of ingesting, preparing, processing, querying, exploring, and visualizing data sets using Google Cloud tools and services.
Big data, machine learning, and scientific data? It sounds like the perfect match. In this advanced-level quest, you will get hands-on practice with GCP services like BigQuery, Dataproc, and TensorFlow by applying them to use cases that employ real-life scientific data sets. By getting experience with tasks like earthquake data analysis and satellite image aggregation, Scientific Data Processing will expand your skill set in big data and machine learning so you can start tackling your own problems across a spectrum of scientific disciplines.
This advanced-level quest is unique among the other catalog offerings. The labs have been curated to give IT professionals hands-on practice with topics and services that appear in the Google Cloud Certified Professional Data Engineer certification. From BigQuery to Dataprep to Cloud Composer, this quest is composed of specific labs that will put your Google Cloud data engineering knowledge to the test. Be aware that while practice with these labs will increase your skills and abilities, you will need other preparation, too. The exam is quite challenging, and external studying, experience, and/or a background in cloud data engineering is recommended. Looking for a hands-on challenge lab to demonstrate your skills and validate your knowledge? On completing this quest, enroll in and finish the additional challenge lab at the end of the Engineer Data in Google Cloud quest to receive an exclusive Google Cloud digital badge.
The two key components of any data pipeline are data lakes and warehouses. This course highlights use cases for each type of storage and dives into the available data lake and warehouse solutions on Google Cloud in technical detail. It also describes the role of a data engineer, explains the benefits of a successful data pipeline to business operations, and examines why data engineering should be done in a cloud environment. This is the first course of the Data Engineering on Google Cloud series. After completing this course, enroll in the Building Batch Data Pipelines on Google Cloud course.
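One common pattern behind the lake/warehouse split described above is loading raw files from Cloud Storage into BigQuery for analysis. The sketch below shows that pattern with the BigQuery Python client; the bucket, dataset, and table names are hypothetical.

```python
# Illustrative sketch: raw files sit in Cloud Storage (the data lake) and are
# loaded into BigQuery (the data warehouse) for analysis.
# Bucket, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # let BigQuery infer the schema from the CSV header
)

load_job = client.load_table_from_uri(
    "gs://my-data-lake/raw/orders_2024-*.csv",
    "my_dataset.orders",
    job_config=job_config,
)
load_job.result()  # wait for the load to finish
```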
In this introductory-level quest, you get hands-on practice with a range of basic Google Cloud Platform features and services. GCP Essentials is the recommended first quest for Google Cloud learners: you can start with little or no prior knowledge of the cloud and finish with practical experience that you can apply to your first GCP project. From writing Cloud Shell commands and deploying your first virtual machine to running applications on Kubernetes Engine or with load balancing, GCP Essentials is a prime introduction to the platform's basic features. Each lab includes a 1-minute video that walks you through the key concepts.
In this course, you learn how to create APIs that utilize multiple services and how to use custom code on Apigee. You also learn about fault handling and how to share logic between proxies. You learn about traffic management and caching, and you create a developer portal and publish your API to it. You learn about logging and analytics, as well as CI/CD and the different deployment models supported by Apigee. Through a combination of lectures, hands-on labs, and supplemental materials, you will learn how to design, build, secure, deploy, and manage API solutions using Google Cloud's Apigee API Platform. This is the third and final course of the Developing APIs with Google Cloud's Apigee API Platform course series.
This course introduces the Google Cloud big data and machine learning products and services that support the data-to-AI lifecycle. It explores the processes, challenges, and benefits of building a big data pipeline and machine learning models with Vertex AI on Google Cloud.
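To ground the data-to-AI lifecycle mentioned above, here is a minimal, hedged sketch of the Vertex AI Python SDK: registering a tabular dataset and configuring an AutoML training job. The project, region, Cloud Storage path, and column names are placeholders, not course material.

```python
# Illustrative sketch: the Vertex AI Python SDK side of the data-to-AI lifecycle.
# Project, region, GCS paths, and column names are placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# Register a tabular dataset stored in Cloud Storage with Vertex AI.
dataset = aiplatform.TabularDataset.create(
    display_name="customer-churn",
    gcs_source=["gs://my-bucket/churn/training_data.csv"],
)

# Configure and run an AutoML training job against that dataset.
job = aiplatform.AutoMLTabularTrainingJob(
    display_name="churn-automl",
    optimization_prediction_type="classification",
)
model = job.run(dataset=dataset, target_column="churned")
```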
In this course, you learn how to secure your APIs. You explore the security concerns you will encounter for your APIs. You learn about OAuth, the primary authorization method for REST APIs. You will learn about JSON Web Tokens (JWTs) and federated security. You also learn about securing against malicious requests, safely sending requests across a public network, and how to secure your data for users of Apigee. Through a combination of lectures, hands-on labs, and supplemental materials, you will learn how to design, build, secure, deploy, and manage API solutions using Google Cloud's Apigee API Platform. This is the second course of the Developing APIs with Google Cloud's Apigee API Platform series. After completing this course, enroll in the API Development on Google Cloud's Apigee API Platform course.
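For a concrete, non-authoritative picture of the JWT handling mentioned above: in Apigee this is normally done declaratively with a VerifyJWT policy, but the sketch below shows the equivalent idea in Python with the PyJWT library. The secret and claims are placeholders.

```python
# Illustrative sketch: creating and verifying a JSON Web Token (JWT) with PyJWT.
# In Apigee this is typically handled by a VerifyJWT policy; the secret and
# claims below are placeholders.
import jwt

SECRET = "shared-signing-secret"  # placeholder; real keys come from a secure store

# Create a token the way an authorization server might.
token = jwt.encode({"sub": "app-123", "scope": "orders.read"}, SECRET, algorithm="HS256")

# Verify the signature and read the claims; an invalid or expired token
# raises jwt.InvalidTokenError.
claims = jwt.decode(token, SECRET, algorithms=["HS256"])
print(claims["sub"], claims["scope"])
```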
In this course, you learn how to design APIs, and how to use OpenAPI specifications to document them. You learn about the API life cycle, and how the Apigee API platform helps you manage all aspects of the life cycle. You learn about how APIs can be designed using API proxies, and how APIs are packaged as API products to be used by app developers. Through a combination of lectures, hands-on labs, and supplemental materials, you will learn how to design, build, secure, deploy, and manage API solutions using Google Cloud's Apigee API Platform. This is the first course of the Developing APIs with Google Cloud's Apigee API Platform series. After completing this course, enroll in the API Security on Google Cloud's Apigee API Platform course.
Google Cloud Fundamentals: Core Infrastructure introduces important concepts and terminology for working with Google Cloud. Through videos and hands-on labs, this course presents and compares many of Google Cloud's compute and storage services, along with important resource and policy management tools.