In this course, you learn how to analyze and choose the right database for your needs so that you can develop applications effectively on Google Cloud. You explore relational and NoSQL databases, dive into Cloud SQL, AlloyDB, and Spanner, and learn how to align database strengths with your application requirements, including those of generative AI. Gain hands-on experience configuring Vector Search and migrating applications to the cloud.
This Databases course consists of a series of advanced-level labs designed to validate your proficiency in migrating and managing Google Cloud databases. Each lab presents a set of required tasks that you must complete with minimal assistance. The labs in this course have replaced the previous L300 Data Management Challenge Lab. If you have already completed the Challenge Lab as part of your L300 accreditation requirement, it will be carried over and count towards your L300 status. You must score 80% or higher on each lab to complete this course and fulfill your CEPF L300 Database requirement. For technical issues with a Challenge Lab, please raise a Buganizer ticket using this CEPF Buganizer template: go/cepfl300labsupport
Agentspace, an enterprise tool designed so employees can find specific information across document storage, email, chat, ticketing systems, and other data sources from a single search bar, brings together Google's specialized search and AI technology. The Agentspace assistant also helps with brainstorming and research, as well as tasks such as drafting document outlines and inviting colleagues to calendar events, so employees can move quickly through knowledge work and collaboration of all kinds.
The learning path offers a deep dive into Google Cloud's data processing solutions, including Dataflow, Pub/Sub, Managed Service for Apache Kafka, and BigQuery Engine for Apache Flink. You'll learn how to leverage these tools to build, deploy, and troubleshoot efficient and scalable data pipelines for both batch and streaming data processing needs.
This course introduces the concepts of AI interpretability and transparency. It explains why AI transparency matters to developers and engineers, and explores practical methods and tools that help implement interpretability and transparency in both data and AI models.
This course provides the knowledge and tools needed to identify the unique challenges MLOps teams face when deploying and managing generative AI models, and explores how Vertex AI helps AI teams streamline their MLOps processes and succeed with generative AI projects.
This course introduces the concept of responsible AI and the AI principles. You learn practical techniques for identifying fairness and bias and for mitigating bias in AI/ML practice, and explore practical methods and tools for implementing responsible AI best practices using Google Cloud products and open source tools.
Complete the intermediate Inspect Rich Documents with Gemini Multimodality and Multimodal RAG skill badge course to demonstrate skills in the following: using multimodal prompts to extract information from text and visual data and to generate video descriptions, retrieving videos and other extra information using Gemini's multimodal capabilities, building metadata for documents containing text and images, getting all relevant text chunks, and printing citations using Gemini's multimodal retrieval-augmented generation (RAG). A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.
This course explores AI-powered search technologies, tools, and applications. Learn about semantic search powered by vector embeddings, hybrid search that combines semantic and keyword approaches, and retrieval-augmented generation (RAG), which grounds AI agents to minimize AI hallucinations. Gain hands-on experience building intelligent search engines with Vertex AI Vector Search.
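As a toy illustration of the semantic search idea mentioned above, the following sketch ranks documents by cosine similarity of embedding vectors. The document snippets and vectors are made up for the example; in a real system the embeddings would come from an embedding model and be served from a vector index such as Vertex AI Vector Search.

```python
import numpy as np

# Toy stand-ins for document embeddings; real embeddings come from an embedding model.
docs = ["refund policy", "shipping times", "password reset"]
doc_vecs = np.array([[0.9, 0.1, 0.0],
                     [0.1, 0.9, 0.1],
                     [0.0, 0.2, 0.9]])
query_vec = np.array([0.05, 0.15, 0.95])  # pretend embedding of "how do I change my password?"

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

scores = [cosine(query_vec, v) for v in doc_vecs]
best = int(np.argmax(scores))
print(docs[best], round(scores[best], 3))  # highest-scoring document: "password reset"
```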
This course explores the benefits of using Vertex AI Feature Store, how to improve the accuracy of ML models, and how to find which data columns make the most useful features. It also includes content and labs on feature engineering using BigQuery ML, Keras, and TensorFlow.
This learner pack introduces the Google Cloud big data and machine learning products and services that support the data-to-AI lifecycle. It explores the processes, challenges, and benefits of building a big data pipeline and machine learning models with Vertex AI on Google Cloud. Goals: identify the purpose and value of the Google Cloud data platform; learn about batch and streaming data pipelines; and build data lakes and data warehouses. You can find all of our technical learning packs on go/techlearningpacks and industry learning packs on go/industrylearningpacks. Brought to you by the CLS Tech Specialization Team (gcc-enablement-tech@). Share your request/feedback on go/learningpacks-feedback!
As the use of artificial intelligence and machine learning continues to grow in the enterprise, so does the importance of building it responsibly. For most organizations, practicing responsible AI is easier said than done. If you are interested in how to operationalize responsible AI in your organization, this course can help. You'll learn how Google Cloud approaches responsible AI today, along with best practices and lessons learned, so you can establish a framework for building your own responsible AI approach.
Complete the Introduction to Generative AI, Introduction to Large Language Models, and Introduction to Responsible AI courses to earn a skill badge. Take the final quiz to confirm your understanding of the fundamental concepts of generative AI. A skill badge is a digital badge issued by Google Cloud in recognition of your knowledge of Google Cloud products and services. Share your skill badge by making your profile public and adding it to your social media profile.
This is an introductory microlearning course that explains what responsible AI is, why it matters, and how Google implements responsible AI in its products. It also introduces Google's 7 AI Principles.
Data pipelines typically fall into one of the extract-and-load (EL), extract-load-transform (ELT), or extract-transform-load (ETL) paradigms. This course describes which paradigm to use for batch data and when. It also covers several Google Cloud technologies for data transformation, including BigQuery, running Spark on Dataproc, pipeline graphs in Cloud Data Fusion, and serverless data processing with Dataflow. Learners get hands-on experience building data pipeline components on Google Cloud using Qwiklabs.
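For a concrete sense of the ELT pattern described above, here is a minimal sketch using the BigQuery Python client: raw data is loaded as-is into a staging table, then transformed with SQL inside BigQuery. The bucket, dataset, and table names are placeholders, not part of the course materials.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Extract-Load: load a raw CSV from Cloud Storage into a staging table.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)
client.load_table_from_uri(
    "gs://example-bucket/raw/orders.csv",    # placeholder object
    "example-project.staging.orders_raw",    # placeholder table
    job_config=job_config,
).result()  # wait for the load job to finish

# Transform: run SQL inside BigQuery to produce a cleaned table (the "T" in ELT).
transform_sql = """
CREATE OR REPLACE TABLE `example-project.analytics.orders_clean` AS
SELECT order_id,
       CAST(order_total AS NUMERIC) AS order_total,
       DATE(order_ts) AS order_date
FROM `example-project.staging.orders_raw`
WHERE order_id IS NOT NULL
"""
client.query(transform_sql).result()
```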
Complete the introductory Build a Data Mesh with Dataplex skill badge course to demonstrate your ability to build a data mesh with Dataplex that facilitates data security, governance, and discovery on Google Cloud. You practice and test your skills in tagging assets, assigning IAM roles, and assessing data quality in Dataplex. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a digital badge that you can share with your network.
Complete the introductory Prepare Data for ML APIs on Google Cloud skill badge to demonstrate skills in the following: cleaning data with Dataprep by Trifacta, running data pipelines in Dataflow, creating clusters and running Apache Spark jobs in Dataproc, and calling ML APIs including the Cloud Natural Language API, Google Cloud Speech-to-Text API, and Video Intelligence API. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.
Complete the intermediate Build a Data Warehouse with BigQuery skill badge to demonstrate skills in the following: joining data to create new tables, troubleshooting joins, appending data with unions, creating date-partitioned tables, and working with JSON, arrays, and structs in BigQuery. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.
In the last installment of the Dataflow course series, we will introduce the components of the Dataflow operational model. We will examine tools and techniques for troubleshooting and optimizing pipeline performance. We will then review testing, deployment, and reliability best practices for Dataflow pipelines. We will conclude with a review of Templates, which make it easy to scale Dataflow pipelines to organizations with hundreds of users. These lessons will help ensure that your data platform is stable and resilient to unanticipated circumstances.
MongoDB Atlas provides customers with a fully managed database-as-a-service on Google's data cloud that is unmatched in speed, scale, and security, all with AI built in. Modern database systems, including MongoDB, have been a big step forward, giving businesses a more flexible, scalable, and developer-friendly alternative to legacy relational databases. But there is an even bigger payoff with a solution such as MongoDB Atlas, a fully managed database-as-a-service (DBaaS) offering. It is an approach that gives businesses all of the advantages of a modern, scalable, highly available database, while freeing IT to focus on high-value activities.
Earn a skill badge by completing the Tag and Discover BigLake Data quest, where you use BigQuery, BigLake, and Data Catalog within Dataplex to create, tag, and discover BigLake tables. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this Skill Badge, and the final assessment challenge lab, to receive a digital badge that you can share with your network.
In this course, you learn about Cloud Spanner: how it compares to other database offerings, and when and how to use Spanner to meet relational database needs at scale. You learn how to create and manage Spanner databases using the various tools Google Cloud provides, how to optimize relational schemas for Spanner's distributed database model, how to interact with Spanner databases using the Spanner APIs, how to integrate Spanner with applications, and how to use other Google tools to manage Spanner databases and your own data.
This learning pack is designed to give you hands-on experience with Google Cloud data solutions. Goals: plan, execute, test, and monitor simple and complex enterprise database migrations to Google Cloud; choose an appropriate Google Cloud database, migrate SQL Server databases, and run Oracle databases on Google Cloud bare metal; recognize and overcome the challenges of moving data to prevent data loss, preserve data integrity, and minimize downtime; and evaluate on-premises database architectures and plan migrations to make the business case for moving databases to Google Cloud. You can find all of our technical learning packs on go/techlearningpacks and industry learning packs on go/industrylearningpacks. Brought to you by the CLS Tech Specialization Team (gcc-enablement-tech@). Share your request/feedback on go/learningpacks-feedback!
This learning pack is intended to give architects, engineers, and developers the skills required to help enterprise customers architect, plan, execute, and test database migration projects. This course covers how to move on-premises, enterprise databases like SQL Server to Google Cloud (Compute Engine and Cloud SQL) and Oracle to Google Cloud bare metal. Goals: plan, execute, test, and monitor simple and complex enterprise database migrations to Google Cloud; choose an appropriate Google Cloud database, migrate SQL Server databases, and run Oracle databases on Google Cloud bare metal; recognize and overcome the challenges of moving data to prevent data loss, preserve data integrity, and minimize downtime; and evaluate on-premises database architectures and plan migrations to make the business case for moving databases to Google Cloud. You can find all of our technical learning packs on go/techlearningpacks and industry learning packs on go/industrylearningpacks. Brought to you by …
The Google Cloud Rapid Migration & Modernization Program (RaMP) is a holistic, end-to-end migration/modernization program that helps customers & partners leverage expertise and best practices, lower risk, control costs, and simplify a customer's path to cloud success. This course will give an overview of the program and some of the tools and best practices available to support customer migrations & modernizations.
Cloud technology on its own provides only a fraction of the true value to a business. When combined with data, and lots of it, it has the power to truly unlock value and create new experiences for customers. In this course, you'll learn what data is, the historical ways companies have used it to make decisions, and why it is so critical for machine learning. This course also introduces learners to technical concepts such as structured and unstructured data, databases, data warehouses, and data lakes. It then covers the most common and fastest growing Google Cloud products around data.
This course helps learners create a study plan for the PDE (Professional Data Engineer) certification exam. Learners explore the breadth and scope of the domains covered in the exam. Learners assess their exam readiness and create their individual study plan.
This course introduces Vertex AI Studio, a tool for interacting with generative AI models and prototyping business ideas to launch into production. Through immersive use cases, engaging lessons, and hands-on labs, you explore the prompt-to-production lifecycle and learn how to use Vertex AI Studio for Gemini multimodal applications, prompt design, prompt engineering, and model tuning. The goal of the course is to help you unlock the potential of generative AI in your projects with Vertex AI Studio.
This course teaches you how to create an image captioning model using deep learning. You learn about the different components of an image captioning model, such as the encoder and decoder, and how to train and evaluate your model. By the end of this course, you will be able to create your own image captioning model and use it to generate captions for images.
This course gives you an overview of the encoder-decoder architecture, a powerful and widely used machine learning architecture for sequence-to-sequence (Seq2Seq) tasks such as machine translation, text summarization, and question answering. You learn about the main components of the encoder-decoder architecture and how to train and serve these models. In the corresponding lab walkthrough, you code a simple implementation of the encoder-decoder architecture in TensorFlow from scratch for poetry generation.
This course introduces diffusion models, a family of machine learning models that have recently shown promise in the image generation space. Diffusion models draw inspiration from physics, specifically thermodynamics. Over the last few years, diffusion models have gained attention in both research and industry, and they underpin many of the state-of-the-art image generation models and tools on Google Cloud. This course introduces the theory behind diffusion models and how to train and deploy them on Vertex AI.
Complete the introductory Get Started with Dataplex skill badge course to demonstrate skills in the following: creating Dataplex assets, creating aspect types, and applying aspects to entries in Dataplex.
The Generative AI Explorer - Vertex AI course consists of hands-on labs on how to use generative AI on Google Cloud. Through the labs, you learn how to use the models in the Vertex AI PaLM API family, including text-bison, chat-bison, and textembedding-gecko. You learn about prompt design and best practices and how to apply them to ideation, text classification, text extraction, text summarization, and more. You also learn how to tune a foundation model with Vertex AI custom training and how to deploy it to a Vertex AI endpoint.
This course introduces the Transformer architecture and the Bidirectional Encoder Representations from Transformers (BERT) model. You learn about the main components of the Transformer architecture, such as the self-attention mechanism, and how it is used to build the BERT model. You also learn about the different tasks that BERT can be used for, such as text classification, question answering, and natural language inference. This course takes approximately 45 minutes to complete.
This course introduces the attention mechanism, a powerful technique that allows neural networks to focus on specific parts of an input sequence. You learn how attention works and how it can be used to improve the performance of a variety of machine learning tasks, including machine translation, text summarization, and question answering.
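To make the mechanism concrete, here is a minimal NumPy sketch of scaled dot-product attention, the core computation behind the attention mechanism; it is an illustrative example on random data, not course material.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Return a weighted sum of values, where weights come from query-key similarity."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over keys
    return weights @ V, weights

# Toy example: 3 input positions with 4-dimensional representations.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(3, 4)) for _ in range(3))
context, weights = scaled_dot_product_attention(Q, K, V)
print(weights.round(2))  # each row sums to 1: how strongly each position attends to the others
```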
This is an introductory microlearning course that explores what large language models (LLMs) are, the use cases where they can be applied, and how you can use prompt tuning to improve LLM performance. It also covers Google tools to help you develop your own generative AI apps.
This is an introductory microlearning course that explains what generative AI is, how it is used, and how it differs from traditional machine learning methods. It also covers Google tools to help you develop your own generative AI apps.
Complete the introductory Create and Manage Cloud SQL for PostgreSQL Instances skill badge to demonstrate skills in the following: migrating, configuring, and managing Cloud SQL for PostgreSQL instances and databases.
Complete the introductory Create and Manage AlloyDB Instances skill badge to demonstrate skills in the following: performing core AlloyDB operations and tasks, migrating to AlloyDB from PostgreSQL, administering an AlloyDB database, and accelerating analytical queries using the AlloyDB Columnar Engine.
It's no secret that today data is growing rapidly and is considered the most critical asset of any organization. NetApp and Google Cloud play an instrumental role in enabling you to optimally store, protect, and govern your data. With NetApp Cloud Manager and NetApp Cloud Volumes ONTAP data storage technology, which utilize Google Cloud compute, storage, and networking infrastructure, you can easily manage storage operations and meet the requirements of any workload. In this course, you get hands-on practice using NetApp Cloud Manager and Cloud Volumes ONTAP and learn about the capabilities they deliver, such as multi-protocol data access, built-in storage efficiencies, data protection features, remote caching, and more.
Flex your Google Clout! Each week unlocks a new cloud puzzle. How fast can you find the solution? Share your score on your choice of social networks and join the conversation over in the Google Cloud Community.
In this quest you will get hands-on experience writing infrastructure as code with Terraform.
Complete the introductory Create and Manage Bigtable Instances skill badge to demonstrate skills in the following: creating instances, designing schemas, querying data, and performing administrative tasks in Bigtable including monitoring performance and configuring node autoscaling and replication.
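As a rough illustration of the operations the badge covers, the sketch below uses the Bigtable Python client to create a table, write a row, and read it back. The project, instance, table, and column names are hypothetical placeholders.

```python
from google.cloud import bigtable
from google.cloud.bigtable import column_family

# Placeholder project/instance/table names.
client = bigtable.Client(project="example-project", admin=True)
instance = client.instance("example-instance")
table = instance.table("sensor-readings")

# Create the table with one column family if it does not exist yet.
if not table.exists():
    table.create(column_families={"stats": column_family.MaxVersionsGCRule(1)})

# Write one row; the row key groups related readings so they sort together.
row = table.direct_row(b"sensor42#2024-01-01T00:00")
row.set_cell("stats", b"temp_c", b"21.5")
row.commit()

# Read the row back and print the stored value.
fetched = table.read_row(b"sensor42#2024-01-01T00:00")
print(fetched.cells["stats"][b"temp_c"][0].value)
```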
Flex your Google Clout! Each day unlocks a new cloud puzzle. Complete all five and you’ll earn the inaugural Google Cloud badge! Share your score on your choice of social networks and join the conversation over in the Google Cloud Community.
Complete the introductory Create and Manage Cloud Spanner Instances skill badge to demonstrate skills in the following: creating and interacting with Cloud Spanner instances and databases; loading Cloud Spanner databases using various techniques; backing up Cloud Spanner databases; defining schemas and understanding query plans; and deploying a Modern Web App connected to a Cloud Spanner instance.
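For a flavor of interacting with Spanner programmatically, here is a minimal sketch using the Cloud Spanner Python client that inserts a row inside a read-write transaction and then queries it with a snapshot read. The instance, database, and table names are placeholders.

```python
from google.cloud import spanner

# Placeholder instance and database IDs.
client = spanner.Client(project="example-project")
instance = client.instance("example-instance")
database = instance.database("example-db")

def insert_singer(transaction):
    # Parameterized DML executed inside a read-write transaction.
    transaction.execute_update(
        "INSERT INTO Singers (SingerId, FirstName, LastName) "
        "VALUES (@id, @first, @last)",
        params={"id": 1, "first": "Marc", "last": "Richards"},
        param_types={
            "id": spanner.param_types.INT64,
            "first": spanner.param_types.STRING,
            "last": spanner.param_types.STRING,
        },
    )

database.run_in_transaction(insert_singer)

# Strongly consistent read of the data just written.
with database.snapshot() as snapshot:
    for row in snapshot.execute_sql("SELECT SingerId, FirstName, LastName FROM Singers"):
        print(row)
```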
Earn a skill badge by completing the Explore Machine Learning Models with Explainable AI quest, where you will learn how to do the following using Explainable AI: build and deploy a model to an AI platform for serving (prediction), use the What-If Tool with an image recognition model, identify bias in mortgage data using the What-If Tool, and compare models using the What-If Tool to identify potential bias. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge quest and the final assessment challenge lab to receive a skill badge that you can share with your network.
This is the second Quest in a two-part series on Google Cloud billing and cost management essentials. This Quest is most suitable for those in a Finance and/or IT related role responsible for optimizing their organization’s cloud infrastructure. Here you'll learn several ways to control and optimize your Google Cloud costs, including setting up budgets and alerts, managing quota limits, and taking advantage of committed use discounts. In the hands-on labs, you’ll practice using various tools to control and optimize your Google Cloud costs or to influence your technology teams to apply the cost optimization best practices.
Complete the introductory Derive Insights from BigQuery Data skill badge course to demonstrate skills in the following: writing SQL queries, querying public tables, loading sample data into BigQuery, troubleshooting common syntax errors with the query validator in BigQuery, and creating reports in Looker Studio by connecting to BigQuery data. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.
Many traditional enterprises use legacy systems and applications that can't stay up-to-date with modern customer expectations. Business leaders often have to choose between maintaining their aging IT systems or investing in new products and services. "Modernize Infrastructure and Applications with Google Cloud" explores these challenges and offers solutions to overcome them by using cloud technology. Part of the Cloud Digital Leader learning path, this course aims to help individuals grow in their role and build the future of their business.
Earn a skill badge by completing the Share Data Using Google Data Cloud course, where you will gain practical experience with Google Cloud Data Sharing Partners, which have proprietary datasets that customers can use for their analytics use cases. Customers subscribe to this data, query it within their own platform, then augment it with their own datasets and use their visualization tools for their customer facing dashboards. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete the skill badge course, and final assessment challenge lab, to receive a digital badge that you can share with your network.
This quest introduces you to Vault and teaches you how to secure, store, and tightly control access to tokens, passwords, certificates, and encryption keys to protect secrets and other sensitive data.
In this course, we examine the common challenges faced by data analysts and how to solve them with the big data tools on Google Cloud. You'll pick up some SQL along the way and become very familiar with using BigQuery and Dataprep to analyze and transform your datasets. This is the first course of the From Data to Insights with Google Cloud series. After completing this course, enroll in the Creating New BigQuery Datasets and Visualizing Insights course.
The third course in this course series is Achieving Advanced Insights with BigQuery. Here we will build on your growing knowledge of SQL as we dive into advanced functions and how to break apart a complex query into manageable steps. We will cover the internal architecture of BigQuery (column-based sharded storage) and advanced SQL topics like nested and repeated fields through the use of Arrays and Structs. Lastly we will dive into optimizing your queries for performance and how you can secure your data through authorized views. After completing this course, enroll in the Applying Machine Learning to your Data with Google Cloud course.
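As a small illustration of the nested and repeated fields mentioned above, the following sketch runs a BigQuery query that unnests an ARRAY of STRUCTs; the inline sample data keeps the example self-contained.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Each order row carries a repeated field: an ARRAY of line-item STRUCTs.
sql = """
WITH orders AS (
  SELECT 1 AS order_id,
         [STRUCT('apple' AS sku, 3 AS qty),
          STRUCT('pear'  AS sku, 1 AS qty)] AS items
)
SELECT order_id, item.sku, item.qty
FROM orders, UNNEST(items) AS item
"""
for row in client.query(sql).result():
    print(row.order_id, row.sku, row.qty)
```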
This is the second course in the Data to Insights course series. Here we will cover how to ingest new external datasets into BigQuery and visualize them with Looker Studio. We will also cover intermediate SQL concepts like multi-table JOINs and UNIONs which will allow you to analyze data across multiple data sources. Note: Even if you have a background in SQL, there are BigQuery specifics (like handling query cache and table wildcards) that may be new to you. After completing this course, enroll in the Achieving Advanced Insights with BigQuery course.
Cloud Logging is a fully managed service that performs at scale. It can ingest application and system log data from thousands of VMs and, even better, analyze all that log data in real time. In this fundamental-level Quest, you learn how to store, search, analyze, monitor, and alert on log data and events from Google Cloud. The labs in the Quest give you hands-on practice using Cloud Logging to maximize your learning experience and provide insight into how you can use Cloud Logging in your own Google Cloud environment.
Earn the advanced skill badge by completing the Use Machine Learning APIs on Google Cloud course, where you learn the basic features for the following machine learning and AI technologies: Cloud Vision API, Cloud Translation API, and Cloud Natural Language API.
Cloud technology can bring great value to an organization, and combining the power of cloud technology with data has the potential to unlock even more value and create new customer experiences. “Exploring Data Transformation with Google Cloud” explores the value data can bring to an organization and ways Google Cloud can make data useful and accessible. Part of the Cloud Digital Leader learning path, this course aims to help individuals grow in their role and build the future of their business.
Earn a skill badge by completing the Develop your Google Cloud Network course, where you learn a variety of ways to deploy and monitor applications: exploring IAM roles and adding or removing project access, creating VPC networks, deploying and monitoring Compute Engine VMs, writing SQL queries, and deploying applications with Kubernetes using multiple deployment approaches. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.
Machine Learning is one of the most innovative fields in technology, and the Google Cloud Platform has been instrumental in furthering its development. With a host of APIs, Google Cloud has a tool for just about any machine learning job. In this advanced-level course, you will get hands-on practice with machine learning at scale and learn how to employ the advanced ML infrastructure available on Google Cloud.
Workspace is Google's collaborative applications platform, delivered from Google Cloud. In this introductory-level course you will get hands-on practice with Workspace’s core applications from a user perspective. Although there are many more applications and tool components to Workspace than are covered here, you will get experience with the primary apps: Gmail, Calendar, Sheets and a handful of others. Each lab can be completed in 10-15 minutes, but extra time is provided to allow self-directed free exploration of the applications.
This course demonstrates the power of integrating Google Cloud services and tools with Workspace applications - like using Node.js to build a survey bot, the Natural Language API to recognize sentiment in a Google Doc, and building a chat bot with Apps Script.
Twelve years ago, Lily started the Pet Theory chain of veterinary clinics, which has been expanding rapidly. Now, Pet Theory is experiencing some growing pains: their appointment scheduling system is not able to handle the increased load, customers aren't receiving lab results reliably through email and text, and veterinarians are spending more time with insurance companies than with their patients. Lily wants to build a cloud-based system that scales better than the legacy solution and doesn't require lots of ongoing maintenance. The team has decided to go with serverless technology. For the labs in the Google Cloud Run Serverless Quest, you will read through a fictitious business scenario in each lab and assist the characters in implementing a serverless solution. Looking for a hands on challenge lab to demonstrate your skills and validate your knowledge? On completing this quest, enroll in and finish the additional challenge lab at the end of this quest to receive an exclusive Google…
The hands-on labs in this Quest are structured to give experienced app developers hands-on practice with state-of-the-art approaches to developing applications in Google Cloud. The topics align with the Google Cloud Certified Professional Cloud Developer Certification. These labs follow the sequence of activities needed to create and deploy an app in Google Cloud from beginning to end. Be aware that while practice with these labs will increase your skills and abilities, it is recommended that you also review the exam guide and other available preparation resources.
This is the second of two Quests of hands-on labs derived from the exercises from the book Data Science on Google Cloud Platform, 2nd Edition by Valliappa Lakshmanan, published by O'Reilly Media, Inc. In this second Quest, covering chapter 9 through the end of the book, you extend the skills practiced in the first Quest, and run full-fledged machine learning jobs with state-of-the-art tools and real-world data sets, all using Google Cloud tools and services.
Obtain a competitive advantage through DevOps. DevOps is an organizational and cultural movement that aims to increase software delivery velocity, improve service reliability, and build shared ownership among software stakeholders. In this course you will learn how to use Google Cloud to improve the speed, stability, availability, and security of your software delivery capability. DevOps Research and Assessment has joined Google Cloud. How does your team measure up? Take this five question multiple-choice quiz and find out!
The Data Lake Modernization course aims to prepare you to lead a Data Lake Modernization engagement, from discovery and qualification through the technical considerations and cost modelling. The training is designed to educate on the migration journey, data lifecycle, costing, and hands-on technical execution. At the end of the training you will have a deeper understanding of the Data Lake ecosystem and of modernizing and migrating to GCP, along with hands-on experience building data ingestion, processing, and analytics pipelines on GCP.
In this second installment of the Dataflow course series, we are going to dive deeper into developing pipelines using the Beam SDK. We start with a review of Apache Beam concepts. Next, we discuss processing streaming data using windows, watermarks and triggers. We then cover options for sources and sinks in your pipelines, schemas to express your structured data, and how to do stateful transformations using State and Timer APIs. We move on to reviewing best practices that help maximize your pipeline performance. Towards the end of the course, we introduce SQL and Dataframes to represent your business logic in Beam and how to iteratively develop pipelines using Beam notebooks.
In this course, you will receive technical training for Enterprise Data Warehouses solutions using BigQuery based on the best practices developed internally by Google’s technical sales and services organizations. The course will also provide guidance and training on key technical challenges that can arise when migrating existing Enterprise Data Warehouses and ETL pipelines to Google Cloud. You will get hands-on experience with real migration tasks, such as data migration, schema optimization, and SQL Query conversion and optimization. The course will also cover key aspects of ETL pipeline migration to Dataproc as well as using Pub/Sub, Dataflow, and Cloud Data Fusion, giving you hands-on experience using all of these tools for Data Warehouse ETL pipelines.
This content is deprecated. Please see the latest version of the course here.
This course focuses on how you can bring your on-premises data lakes and workloads to Google Cloud to unlock cost savings and scale.
This course further explores SQL Server on Google Cloud.
Incorporating machine learning into data pipelines lets you extract more insight from your data. This course covers ways to include machine learning in data pipelines on Google Cloud. It covers AutoML for cases that require little to no customization, and introduces Notebooks and BigQuery machine learning (BigQuery ML) for cases that need custom machine learning capabilities. It also covers how to productionize machine learning solutions using Vertex AI.
Processing streaming data is becoming increasingly popular, as streaming enables businesses to get real-time metrics on business operations. This course covers how to build streaming data pipelines on Google Cloud. Pub/Sub is described for handling incoming streaming data. The course also covers how to apply aggregations and transformations to streaming data using Dataflow, and how to store processed records in BigQuery or Bigtable for analysis. Learners get hands-on experience building streaming data pipeline components on Google Cloud using Qwiklabs.
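A minimal Apache Beam (Python SDK) sketch of the Pub/Sub-to-Dataflow-to-BigQuery pattern described above is shown below. The topic and table names are placeholders, and a real deployment would also set the Dataflow runner, project, and region in the pipeline options.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms.window import FixedWindows

# Placeholder resource names.
TOPIC = "projects/example-project/topics/clickstream"
TABLE = "example-project:analytics.clicks_per_minute"

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=TOPIC)
        | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "FixedWindows" >> beam.WindowInto(FixedWindows(60))       # one-minute windows
        | "KeyByPage" >> beam.Map(lambda event: (event["page"], 1))
        | "CountPerPage" >> beam.CombinePerKey(sum)
        | "ToTableRow" >> beam.Map(lambda kv: {"page": kv[0], "clicks": kv[1]})
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            TABLE,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```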
The two key components of any data pipeline are data lakes and warehouses. This course highlights use cases for each type of storage and provides a technical deep dive into the data lake and warehouse solutions available on Google Cloud. It also describes the role of a data engineer, the benefits of a successful data pipeline to business operations, and why data engineering should be done in a cloud environment. This is the first course of the Data Engineering on Google Cloud series. After completing this course, enroll in the Building Batch Data Pipelines on Google Cloud course.
Google Cloud Application Programming Interfaces are the mechanism to interact with Google Cloud Services programmatically. This quest will give you hands-on practice with a variety of GCP APIs, which you will learn through working with Google’s APIs Explorer, a tool that allows you to browse APIs and run their methods interactively. By learning how to transfer data between Cloud Storage buckets, deploy Compute Engine instances, configure Dataproc clusters and much more, Exploring APIs will show you how powerful APIs are and why they are used almost exclusively by proficient GCP users. Enroll in this quest today.
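As one small example of the kind of API call the quest exercises, the sketch below copies an object between Cloud Storage buckets with the Python client library; the bucket and object names are hypothetical.

```python
from google.cloud import storage

client = storage.Client()

# Placeholder bucket and object names.
source_bucket = client.bucket("example-source-bucket")
destination_bucket = client.bucket("example-destination-bucket")

# Copy one object from the source bucket to the destination bucket.
blob = source_bucket.blob("reports/2024-01.csv")
source_bucket.copy_blob(blob, destination_bucket, "reports/2024-01.csv")

# List what now sits in the destination bucket under the same prefix.
for item in client.list_blobs("example-destination-bucket", prefix="reports/"):
    print(item.name)
```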
Complete the introductory Build LookML Objects in Looker skill badge course to demonstrate skills in the following: building new dimensions and measures, views, and derived tables; setting measure filters and types based on requirements; updating dimensions and measures; building and refining Explores; joining views to existing Explores; and deciding which LookML objects to create based on business requirements.
In this course, you will get hands-on experience applying advanced LookML concepts in Looker. You will learn how to use Liquid to customize and create dynamic dimensions and measures, create dynamic SQL derived tables and customized native derived tables, and use extends to modularize your LookML code.
Complete the introductory Prepare Data for Looker Dashboards and Reports skill badge course to demonstrate skills in the following: filtering, sorting, and pivoting data; merging results from different Looker Explores; and using functions and operators to build Looker dashboards and reports for data analysis and visualization. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.
Complete the intermediate Create ML Models with BigQuery ML skill badge course to demonstrate your skills in creating and evaluating machine learning models with BigQuery ML to make data predictions. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.
Google Cloud services don't compromise on security, and Google has developed dedicated tools to ensure security and identity across your Google Cloud projects. In this introductory course, hands-on labs teach you about Google Cloud's Identity and Access Management (IAM) service, which is used to manage user and virtual machine accounts. You get experience with network security by provisioning VPCs and VPNs, and you learn about the tools available to protect against security threats and prevent data loss.
Earn a skill badge by completing the Build a Secure Google Cloud Network course, where you learn about the various networking-related resources required to build, scale, and secure applications on Google Cloud. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.
Earn a skill badge by completing the Monitor Environments with Managed Service for Prometheus quest, where you learn Kubernetes Monitoring with Google Cloud Managed Service for Prometheus. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this Skill Badge, and the final assessment challenge lab, to receive a digital badge that you can share with your network.
This course is intended to give architects, engineers, and developers the skills required to help enterprise customers architect, plan, execute, and test database migration projects. Through a combination of presentations, demos, and hands-on labs participants move databases to Google Cloud while taking advantage of various services. This course covers how to move on-premises, enterprise databases like SQL Server to Google Cloud (Compute Engine and Cloud SQL) and Oracle to Google Cloud bare metal.
This advanced-level Quest builds on its predecessor Quest, and offers hands-on practice on the more advanced data integration features available in Cloud Data Fusion, while sharing best practices to build more robust, reusable, dynamic pipelines. Learners get to try out the data lineage feature as well to derive interesting insights into their data’s history.
Earn a skill badge by completing the Set Up a Google Cloud Network course. In these labs, you learn how to perform basic networking tasks on Google Cloud Platform: you create a custom network, add subnet firewall rules, create VMs, and test latency between the VMs as they communicate with each other. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and tests your ability to apply your knowledge in an interactive hands-on environment. Complete the skill badge course and the final assessment challenge lab to receive a digital badge that you can share with your network.
This introductory-level quest shows application developers how the Google Cloud ecosystem could help them build secure, scalable, and intelligent cloud native applications. You learn how to develop and scale applications without setting up infrastructure, run data analytics, gain insights from data, and develop with pre-trained ML APIs to leverage machine learning even if you are not a Machine Learning expert. You will also experience seamless integration between various Google services and APIs to create intelligent apps.
In this quest, you will learn about Google Cloud's IoT Core service and its integration with other services like GCS, Dataprep, Stackdriver, and Firestore. The labs in this quest use simulator code to mimic IoT devices, and what you learn here should empower you to implement the same streaming pipeline with real-world IoT devices.
In this course you will learn how to use several BigQuery ML features to improve retail use cases. Predict the demand for bike rentals in NYC with demand forecasting, and see how to use BigQuery ML for a classification task that predicts the likelihood of a website visitor making a purchase.
Earn the introductory skill badge by completing the Automate Data Capture at Scale with Document AI course. In this course, you learn how to extract, process, and capture data using Document AI.
This intermediate-level quest is unique among Qwiklabs quests. These labs have been curated to give operators hands-on practice with Anthos—a new, open application modernization platform on Google Cloud. Anthos enables you to build and manage modern hybrid applications. Tasks include: installing service mesh, collecting telemetry, and securing your microservices with service mesh policies. This quest is composed of labs targeted to teach you everything you need to know to introduce service mesh, and Anthos, into your next hybrid cloud project.
Google Cloud's four-step structured Cloud Migration Path Methodology provides a defined and repeatable path for users to follow when migrating and modernizing virtual machines. In this quest, you will get hands-on practice with Google's current solution set for VM assessment, planning, migration, and modernization. You will start by analyzing your lab environment and building assessment reports with CloudPhysics and StratoZone, then build a landing zone within Google Cloud using Terraform's infrastructure-as-code templates. Next, you will manually transform a two-tier application into a cloud-native workload running on Kubernetes, and finally you will transform a VM workload into Kubernetes with Migrate for Anthos and migrate a VM between cloud environments.
If you want to take your Google Cloud networking skills to the next level, look no further. This course is composed of labs that cover real-life use cases and it will teach you best practices for overcoming common networking bottlenecks. From getting hands-on practice with testing and improving network performance, to integrating high-throughput VPNs and networking tiers, Network Performance and Optimization is an essential course for Google Cloud developers who are looking to double down on application speed and robustness.
In this introductory-level quest, you will learn the fundamentals of developing and deploying applications on the Google Cloud Platform. You will get hands-on experience with the Google App Engine framework by launching applications written in languages like Python, Ruby, and Java (just to name a few). You will see first-hand how straightforward and powerful GCP application frameworks are, and how easily they integrate with GCP database, data-loss prevention, and security services.
This introductory-level course is unique among the other course offerings. Its labs have been curated to give IT professionals hands-on practice with the topics and services covered in the Google Cloud Certified Associate Cloud Engineer certification exam. From IAM, to networking, to Kubernetes Engine deployment, the course is composed of specific labs that will put your Google Cloud knowledge to the test. While practice with these labs will increase your skills and abilities, we recommend that you also review the exam guide and other available preparation resources.
The Google Cloud Platform provides many different frameworks and options to fit your application’s needs. In this introductory-level quest, you will get plenty of hands-on practice deploying sample applications on Google App Engine. You will also dive into other web application frameworks like Firebase, Wordpress, and Node.js and see firsthand how they can be integrated with Google Cloud.
TensorFlow is an open source software library for high performance numerical computation that's great for writing models that can train and run on platforms ranging from your laptop to a fleet of servers in the Cloud to an edge device. This quest takes you beyond the basics of using predefined models and teaches you how to build, train and deploy your own on Google Cloud.
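For orientation, here is a minimal Keras sketch of the build-and-train cycle on synthetic data; it is an illustrative example rather than one of the quest's models.

```python
import numpy as np
import tensorflow as tf

# Synthetic data: predict whether the sum of four features exceeds a threshold.
x = np.random.rand(256, 4).astype("float32")
y = (x.sum(axis=1) > 2.0).astype("float32")

# Build a tiny binary classifier with the Keras API.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=5, batch_size=32, verbose=0)

loss, accuracy = model.evaluate(x, y, verbose=0)
print(f"training-set accuracy: {accuracy:.2f}")
```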
Want to build machine learning models in minutes instead of hours, using just SQL? BigQuery ML democratizes machine learning by letting data analysts create, train, evaluate, and predict with machine learning models using existing SQL tools and skills. In this series of labs, you experiment with different model types and learn what makes a good model.
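To show what machine learning in SQL looks like in practice, here is a hedged sketch that trains and evaluates a logistic regression model with BigQuery ML through the Python client; the project, dataset, table, and column names are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Train a logistic regression model entirely in SQL (all names below are placeholders).
create_model_sql = """
CREATE OR REPLACE MODEL `example-project.demo.purchase_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['purchased']) AS
SELECT device, country, pageviews, purchased
FROM `example-project.demo.sessions`
"""
client.query(create_model_sql).result()

# Evaluate the trained model, still from SQL.
eval_sql = "SELECT * FROM ML.EVALUATE(MODEL `example-project.demo.purchase_model`)"
for row in client.query(eval_sql).result():
    print(dict(row))
```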
In this Quest, the experienced user of Google Cloud will learn how to describe and launch cloud resources with Terraform, an open source tool that codifies APIs into declarative configuration files that can be shared amongst team members, treated as code, edited, reviewed, and versioned. In these nine hands-on labs, you will work with example templates and understand how to launch a range of configurations, from simple servers, through full load-balanced applications.
This course benefits novice cloud developers looking for hands-on labs that go beyond the Google Cloud fundamentals courses. Through the labs, you gain practical experience with Cloud Storage and other key application services such as Monitoring and Cloud Functions, and you develop valuable skills that apply to any Google Cloud initiative.
Networking is a principle theme of cloud computing. It’s the underlying structure of Google Cloud, and it’s what connects all your resources and services to one another. This course will cover essential Google Cloud networking services and will give you hands-on practice with specialized tools for developing mature networks. From learning the ins-and-outs of VPCs, to creating enterprise-grade load balancers, Automate Deployment and Manage Traffic on a Google Cloud Network will give you the practical experience needed so you can start building robust networks right away.
Learn the ins and outs of Google Cloud's operations suite, an important service for generating insights into the health of your applications. It provides a wealth of information for application monitoring, log reporting, and diagnostics. These labs give you hands-on practice and teach you how to monitor virtual machines, generate logs and alerts, and create custom metrics for application data. It is recommended that students have at least earned a badge by completing the Google Cloud Essentials quest. Looking for a hands on challenge lab to demonstrate your skills and validate your knowledge? On completing this course, enroll in and finish the challenge lab at the end of the Monitor and Log with Google Cloud Operations Suite course to receive an exclusive Google Cloud digital badge.
Kubernetes is the most popular container orchestration system, and Google Kubernetes Engine was designed specifically to support managed Kubernetes deployments on Google Cloud. In this advanced-level course, you get hands-on practice configuring Docker images and containers and deploying fully fledged Kubernetes Engine applications. The course teaches you the practical skills needed to integrate container orchestration into your own workflow. Looking for a hands-on challenge lab to demonstrate your skills and validate your knowledge? After completing this course, enroll in and finish the additional challenge lab at the end of the Deploy Kubernetes Applications on Google Cloud course to receive an exclusive Google Cloud digital badge.
This fundamental-level quest is unique amongst the other quest offerings. The labs have been curated to give IT professionals hands-on practice with topics and services that appear in the Google Cloud Certified Professional Cloud Architect Certification. From IAM, to networking, to Kubernetes Engine deployment, this quest is composed of specific labs that will put your Google Cloud knowledge to the test. Be aware that while practice with these labs will increase your skills and abilities, we recommend that you also review the exam guide and other available preparation resources.
This quest offers hands-on practice with Cloud Data Fusion, a cloud-native, code-free, data integration platform. ETL Developers, Data Engineers and Analysts can greatly benefit from the pre-built transformations and connectors to build and deploy their pipelines without worrying about writing code. This Quest starts with a quickstart lab that familiarises learners with the Cloud Data Fusion UI. Learners then get to try running batch and realtime pipelines as well as using the built-in Wrangler plugin to perform some interesting transformations on data.
Organizations around the world rely on Apache Kafka to integrate existing systems in real time and build a new class of event streaming applications that unlock new business opportunities. Google and Confluent are in a partnership to deliver the best event streaming service based on Apache Kafka and to build event driven applications and big data pipelines on Google Cloud Platform. In this course, you will first learn how to deploy and create a streaming data pipeline with Apache Kafka, then try out the different functionalities of the Confluent Platform.
Containerized applications have changed the game and are here to stay. With Kubernetes, you can orchestrate containers with ease, and integration with the Google Cloud Platform is seamless. In this advanced-level quest, you will be exposed to a wide range of Kubernetes use cases and will get hands-on practice architecting solutions over the course of 8 labs. From building Slackbots with NodeJS, to deploying game servers on clusters, to running the Cloud Vision API, Kubernetes Solutions will show you first-hand how agile and powerful this container orchestration system is.
Google Cloud is committed to supporting Windows workloads in its frameworks and services. In this advanced-level quest, you will get hands-on practice running many of the popular Windows services on Google Cloud. For example, you will learn how to instantiate Microsoft SQL Server databases and use Cloud Tools for PowerShell with Google Cloud Platform frameworks.
Big data, machine learning, and artificial intelligence are among today's most popular computing topics, but they are highly specialized fields, and introductory material can be hard to come by. Fortunately, Google Cloud provides user-friendly services in these areas, and this introductory course gives learners the opportunity to get started with tools such as BigQuery, the Cloud Speech API, and Video Intelligence.
In this quest, you will get hands-on experience with LookML in Looker. You will learn how to write LookML code to create new dimensions and measures, create derived tables and join them to Explores, filter Explores, and define caching policies in LookML.
In this advanced-level quest, you will learn how to harness serious Google Cloud computing power to run big data and machine learning jobs. The hands-on labs will give you use cases, and you will be tasked with implementing big data and machine learning practices utilized by Google's very own Solutions Architecture team. From running BigQuery analytics on tens of thousands of basketball games, to training TensorFlow image classifiers, you will quickly see why Google Cloud is the go-to platform for running big data and machine learning jobs.
In this introductory level Quest you will gain practical experience on the fundamentals of sports data science using BigQuery. Start your journey by creating a soccer dataset in BigQuery by importing CSV and JSON files. Harness the power of BigQuery with sophisticated SQL analytical concepts, including using BigQuery ML to train an expected goals model on the soccer event data and evaluate the impressiveness of World Cup goals.
Complete the introductory Migrate MySQL data to Cloud SQL using Database Migration Services skill badge to demonstrate skills in the following: migrating MySQL data to Cloud SQL using different job types and connectivity options available in Database Migration Service and migrating MySQL user data when running Database Migration Service jobs. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge quest, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
This course offers hands-on practice with migrating MySQL data to Cloud SQL using Database Migration Service. You start with an introductory lab that briefly reviews how to get started with Cloud SQL for MySQL, including how to connect to Cloud SQL instances using the Cloud Console. Then, you continue with two labs focused on migrating MySQL databases to Cloud SQL using different job types and connectivity options available in Database Migration Service. The course ends with a lab on migrating MySQL user data when running Database Migration Service jobs.
Google Cloud Fundamentals: Core Infrastructure introduces important concepts and terminology for working with Google Cloud. Through videos and hands-on labs, this course presents and compares many of Google Cloud's computing and storage services, along with important resource and policy management tools.
Big data, machine learning, and scientific data? It sounds like the perfect match. In this advanced-level quest, you will get hands-on practice with GCP services like BigQuery, Dataproc, and TensorFlow by applying them to use cases that employ real-life, scientific data sets. By getting experience with tasks like earthquake data analysis and satellite image aggregation, Scientific Data Processing will expand your skill set in big data and machine learning so you can start tackling your own problems across a spectrum of scientific disciplines.
Blockchain and related technologies, such as distributed ledger and distributed apps, are becoming new value drivers and solution priorities in many industries. In this course you will gain hands-on experience with distributed ledger and the exploration of blockchain datasets in Google Cloud. It brings the research and solution work of Google's Allen Day into self-paced labs for you to run and learn directly. Since this course uses advanced SQL in BigQuery, a SQL-in-BigQuery refresher lab is at the start.
This is the first of two Quests of hands-on labs derived from the exercises from the book Data Science on Google Cloud Platform, 2nd Edition by Valliappa Lakshmanan, published by O'Reilly Media, Inc. In this first Quest, covering up through chapter 8, you are given the opportunity to practice all aspects of ingestion, preparation, processing, querying, exploring and visualizing data sets using Google Cloud tools and services.
Want to turn your marketing data into insights and build dashboards? Bring all of your data into one place for large-scale analysis and model building. Get repeatable, scalable, and valuable insights into your data by learning how to query it and using BigQuery. BigQuery is Google's fully managed, NoOps, low cost analytics database. With BigQuery you can query terabytes and terabytes of data without having any infrastructure to manage or needing a database administrator. BigQuery uses SQL and can take advantage of the pay-as-you-go model. BigQuery allows you to focus on analyzing data to find meaningful insights.
This course is part 1 of a 3-course series on Serverless Data Processing with Dataflow. In this first course, we start with a refresher of what Apache Beam is and its relationship with Dataflow. Next, we talk about the Apache Beam vision and the benefits of the Beam Portability framework. The Beam Portability framework achieves the vision that a developer can use their favorite programming language with their preferred execution backend. We then show you how Dataflow allows you to separate compute and storage while saving money, and how identity, access, and management tools interact with your Dataflow pipelines. Lastly, we look at how to implement the right security model for your use case on Dataflow.
This course introduces the Google Cloud big data and machine learning products and services that support the data-to-AI lifecycle. It explores the processes, challenges, and benefits of building a big data pipeline and machine learning models with Vertex AI on Google Cloud.
Earn a skill badge by completing the Create Conversational AI Agents with Dialogflow CX quest, where you will learn how to create a conversational virtual agent, including how to: define intents and entities, use versions and environments, create conversational branching, and use IVR features. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge quest, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
In this advanced-level quest, you will learn the ins and outs of developing GCP applications in Python. The first labs will walk you through the basics of environment setup and application data storage with Cloud Datastore. Once you have a handle on the fundamentals, you will get hands-on practice deploying Python applications on Kubernetes and App Engine (the latter is the same framework that powers Snapchat!) With specialized bonus labs that teach user authentication and backend service development, this quest will give you practical experience so you can start developing robust Python applications straight away.
In this series of labs, you will learn how to use BigQuery to analyze NCAA basketball data with SQL and build a machine learning model to predict the outcomes of NCAA March Madness basketball tournament games.
Want to learn the core SQL and visualization skills of a Data Analyst? Interested in how to write queries that scale to petabyte-size datasets? Take the BigQuery for Analyst Quest and learn how to query, ingest, optimize, visualize, and even build machine learning models in SQL inside of BigQuery.
Want to scale your data analysis efforts without managing database hardware? Learn the best practices for querying and getting insights from your data warehouse with this interactive series of BigQuery labs. BigQuery is Google's fully managed, NoOps, low cost analytics database. With BigQuery you can query terabytes and terabytes of data without having any infrastructure to manage or needing a database administrator. BigQuery uses SQL and can take advantage of the pay-as-you-go model. BigQuery allows you to focus on analyzing data to find meaningful insights.
Data Catalog is deprecated and will be discontinued on January 30, 2026. You can still complete this course if you want to. For steps to transition your Data Catalog users, workloads, and content to Dataplex Catalog, see Transition from Data Catalog to Dataplex Catalog (https://cloud.google.com/dataplex/docs/transition-to-dataplex-catalog). Data Catalog is a fully managed and scalable metadata management service that empowers organizations to quickly discover, understand, and manage all of their data. In this quest you will start small by learning how to search and tag data assets and metadata with Data Catalog. After learning how to build your own tag templates that map to BigQuery table data, you will learn how to build MySQL, PostgreSQL, and SQLServer to Data Catalog Connectors.
Earn the intermediate Engineer Data for Predictive Modeling with BigQuery ML skill badge to demonstrate skills in the following: building data transformation pipelines to BigQuery using Dataprep by Trifacta; building extract, transform, and load (ETL) workflows using Cloud Storage, Dataflow, and BigQuery; and building machine learning models using BigQuery ML. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a digital badge that you can share with your network.
This advanced-level quest is unique amongst the other catalog offerings. The labs have been curated to give IT professionals hands-on practice with topics and services that appear in the Google Cloud Certified Professional Data Engineer Certification. From BigQuery, to Dataprep, to Cloud Composer, this quest is composed of specific labs that will put your Google Cloud data engineering knowledge to the test. Be aware that while practice with these labs will increase your skills and abilities, you will need other preparation, too. The exam is quite challenging and external studying, experience, and/or background in cloud data engineering is recommended. Looking for a hands on challenge lab to demonstrate your skills and validate your knowledge? On completing this quest, enroll in and finish the additional challenge lab at the end of the Engineer Data in Google Cloud quest to receive an exclusive Google Cloud digital badge.
Looking to build or optimize your data warehouse? Learn best practices to Extract, Transform, and Load your data into Google Cloud with BigQuery. In this series of interactive labs you will create and optimize your own data warehouse using a variety of large-scale BigQuery public datasets. BigQuery is Google's fully managed, NoOps, low cost analytics database. With BigQuery you can query terabytes and terabytes of data without having any infrastructure to manage or needing a database administrator. BigQuery uses SQL and can take advantage of the pay-as-you-go model. BigQuery allows you to focus on analyzing data to find meaningful insights. Looking for a hands on challenge lab to demonstrate your skills and validate your knowledge? On completing this quest, enroll in and finish the additional challenge lab at the end of this quest to receive an exclusive Google Cloud digital badge.
Cloud SQL is a fully managed database service that stands out from its peers due to high performance, seamless integration, and impressive scalability. In this quest you will receive hands-on practice with the basics of Cloud SQL and quickly progress to advanced features, which you will apply to production frameworks and application environments. From creating instances and querying data with SQL, to building Deployment Manager scripts and connecting Cloud SQL instances with applications run on GKE containers, this quest will give you the knowledge and experience needed so you can start integrating this service right away.
Complete the introductory Implement Load Balancing on Compute Engine skill badge course to demonstrate skills in the following: writing gcloud commands and using Cloud Shell, creating and deploying virtual machines in Compute Engine, and configuring network and HTTP load balancers. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services, and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.
This most popular quest gives you your first hands-on practice with Google Cloud. Learn the basics, from spinning up VMs and configuring key infrastructure tools to working with more advanced concepts such as Stackdriver and Kubernetes.