

Katusha Fletcher

Member since 2024

Silver League

Points: 52120
Google Cloud Fundamentals: Core Infrastructure - Українська Earned Jan 2, 2025 EST
Introduction to Data Engineering on Google Cloud Earned Dec 30, 2024 EST
Work with Gemini Models in BigQuery Earned Dec 26, 2024 EST
Boost Productivity with Gemini in BigQuery Earned Dec 26, 2024 EST
Manage Data Models in Looker Earned Dec 26, 2024 EST
Applying Advanced LookML Concepts in Looker Earned Dec 26, 2024 EST
BigQuery for Data Analysts Earned Nov 13, 2024 EST
Build LookML Objects in Looker Earned Oct 23, 2024 EDT
Data Catalog Fundamentals Earned Oct 4, 2024 EDT
Create ML Models with BigQuery ML Earned Sep 24, 2024 EDT
Derive Insights from BigQuery Data Earned Sep 24, 2024 EDT
Developing Data Models with LookML Earned Sep 11, 2024 EDT
Prepare Data for Looker Dashboards and Reports Earned Sep 10, 2024 EDT
Analyzing and Visualizing Data in Looker Earned Sep 10, 2024 EDT
Introduction to Data Analytics on Google Cloud Earned Sep 7, 2024 EDT
Prepare Data for ML APIs on Google Cloud Earned Sep 7, 2024 EDT
Serverless Data Processing with Dataflow: Develop Pipelines Earned Sep 7, 2024 EDT
Engineer Data for Predictive Modeling with BigQuery ML Earned Sep 7, 2024 EDT
Build a Data Mesh with Dataplex Earned Sep 7, 2024 EDT
Build a Data Warehouse with BigQuery Earned Sep 5, 2024 EDT
Serverless Data Processing with Dataflow: Operations Earned Sep 1, 2024 EDT
Serverless Data Processing with Dataflow: Foundations Earned Aug 25, 2024 EDT
Smart Analytics, Machine Learning, and AI on Google Cloud Earned Aug 25, 2024 EDT
Building Resilient Streaming Analytics Systems on Google Cloud Earned Aug 24, 2024 EDT
Building Batch Data Pipelines on Google Cloud Earned Aug 23, 2024 EDT
Modernizing Data Lakes and Data Warehouses with Google Cloud Earned Aug 21, 2024 EDT
Preparing for your Professional Data Engineer Journey Earned Aug 20, 2024 EDT

The Google Cloud Fundamentals: Core Infrastructure course covers important concepts and terminology for working with Google Cloud. Through videos and hands-on labs, learners explore and compare the various compute and storage services that Google Cloud offers, along with key resource and policy management tools.


In this course, you learn about data engineering on Google Cloud, the roles and responsibilities of data engineers, and how those map to offerings provided by Google Cloud. You also learn about ways to address data engineering challenges.


This course demonstrates how to use AI/ML models for generative AI tasks in BigQuery. Through a practical use case involving customer relationship management, you learn the workflow of solving a business problem with Gemini models. To facilitate comprehension, the course also provides step-by-step guidance through coding solutions using both SQL queries and Python notebooks.
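
As an illustration of the kind of SQL-plus-Python workflow the course refers to, the sketch below calls a Gemini remote model from BigQuery with ML.GENERATE_TEXT through the Python client. This is not course code: the project, dataset, model, and table names are hypothetical, and a remote model created over a Vertex AI connection is assumed to already exist.

```python
# Minimal sketch: generative text in BigQuery via ML.GENERATE_TEXT.
# All project, dataset, model, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

sql = """
SELECT
  prompt,
  ml_generate_text_llm_result AS reply
FROM ML.GENERATE_TEXT(
  MODEL `my-project.crm.gemini_model`,                 -- hypothetical remote model
  (SELECT CONCAT('Summarize this customer note: ', note) AS prompt
   FROM `my-project.crm.customer_notes`),              -- hypothetical source table
  STRUCT(0.2 AS temperature, TRUE AS flatten_json_output)
)
"""

for row in client.query(sql).result():
    print(row.prompt[:60], "->", row.reply)
```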


This course explores Gemini in BigQuery, a suite of AI-driven features that assist the data-to-AI workflow. These features include data exploration and preparation, code generation and troubleshooting, and workflow discovery and visualization. Through conceptual explanations, a practical use case, and hands-on labs, the course empowers data practitioners to boost their productivity and expedite the development pipeline.


Complete the intermediate Manage Data Models in Looker skill badge to demonstrate skills in the following: maintaining LookML project health; utilizing SQL Runner for data validation; employing LookML best practices; optimizing queries and reports for performance; and implementing persistent derived tables and caching policies. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a digital badge that you can share with your network.


In this course, you will get hands-on experience applying advanced LookML concepts in Looker. You will learn how to use Liquid to customize and create dynamic dimensions and measures, create dynamic SQL derived tables and customized native derived tables, and use extends to modularize your LookML code.


This course is designed for data analysts who want to learn about using BigQuery for their data analysis needs. Through a combination of videos, labs, and demos, we cover various topics that discuss how to ingest, transform, and query your data in BigQuery to derive insights that can help in business decision making.
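
A minimal sketch of the ingest-then-query pattern described above, using the BigQuery Python client. The bucket, dataset, table, and column names are hypothetical placeholders, not course assets.

```python
# Hedged sketch: load a CSV from Cloud Storage into BigQuery, then query it.
from google.cloud import bigquery

client = bigquery.Client()

# Ingest: load a CSV file into a table, letting BigQuery detect the schema.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)
load_job = client.load_table_from_uri(
    "gs://my-bucket/sales/2024-orders.csv",   # hypothetical file
    "my-project.analytics.orders",            # hypothetical table
    job_config=job_config,
)
load_job.result()  # wait for the load job to finish

# Query: aggregate the loaded data to derive a simple insight.
sql = """
SELECT customer_id, SUM(amount) AS total_spent
FROM `my-project.analytics.orders`
GROUP BY customer_id
ORDER BY total_spent DESC
LIMIT 10
"""
for row in client.query(sql).result():
    print(row.customer_id, row.total_spent)
```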


Complete the introductory Build LookML Objects in Looker skill badge to demonstrate skills in the following: building new dimensions and measures, views, and derived tables; setting measure filters and types based on requirements; updating dimensions and measures; building and refining Explores; joining views to existing Explores; and deciding which LookML objects to create based on business requirements.


Data Catalog is deprecated and will be discontinued on January 30, 2026. You can still complete this course if you want to. For steps to transition your Data Catalog users, workloads, and content to Dataplex Catalog, see Transition from Data Catalog to Dataplex Catalog (https://cloud.google.com/dataplex/docs/transition-to-dataplex-catalog). Data Catalog is a fully managed and scalable metadata management service that empowers organizations to quickly discover, understand, and manage all of their data. In this quest you will start small by learning how to search and tag data assets and metadata with Data Catalog. After learning how to build your own tag templates that map to BigQuery table data, you will learn how to build connectors from MySQL, PostgreSQL, and SQL Server to Data Catalog.


Complete the intermediate Create ML Models with BigQuery ML skill badge to demonstrate skills in creating and evaluating machine learning models with BigQuery ML to make data predictions.
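
A hedged sketch of the create-and-evaluate workflow this badge covers, issued through the BigQuery Python client. The dataset, model, and table names are placeholders, and the training table is assumed to contain a `purchased` label column.

```python
# Hedged sketch: train and evaluate a logistic regression model in BigQuery ML.
from google.cloud import bigquery

client = bigquery.Client()

create_model_sql = """
CREATE OR REPLACE MODEL `my-project.ecommerce.purchase_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['purchased']) AS
SELECT country, pageviews, time_on_site, purchased
FROM `my-project.ecommerce.web_sessions`
"""
client.query(create_model_sql).result()  # training runs as a query job

evaluate_sql = """
SELECT roc_auc, precision, recall
FROM ML.EVALUATE(MODEL `my-project.ecommerce.purchase_model`)
"""
for row in client.query(evaluate_sql).result():
    print(row.roc_auc, row.precision, row.recall)
```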


Complete the introductory Derive Insights from BigQuery Data skill badge to demonstrate skills in the following: writing SQL queries, querying public tables, loading sample data into BigQuery, troubleshooting common syntax errors with the query validator in BigQuery, and creating reports in Looker Studio by connecting to BigQuery data. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
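
For illustration, here is a minimal query against a BigQuery public table from Python. The public dataset used (bigquery-public-data.usa_names.usa_1910_2013) is a commonly used sample, not one named by the badge itself.

```python
# Minimal sketch: query a BigQuery public dataset with the Python client.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT name, SUM(number) AS total
FROM `bigquery-public-data.usa_names.usa_1910_2013`
WHERE gender = 'F'
GROUP BY name
ORDER BY total DESC
LIMIT 5
"""
for row in client.query(sql).result():
    print(f"{row.name}: {row.total}")
```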


This course empowers you to develop scalable, performant LookML (Looker Modeling Language) models that provide your business users with the standardized, ready-to-use data that they need to answer their questions. Upon completing this course, you will be able to start building and maintaining LookML models to curate and manage data in your organization’s Looker instance.


Complete the introductory Prepare Data for Looker Dashboards and Reports skill badge to demonstrate skills in the following: filtering, sorting, and pivoting data; merging results from different Looker Explores; and using functions and operators to build Looker dashboards and reports for data analysis and visualization. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.


In this course, you learn how to do the kind of data exploration and analysis in Looker that would formerly be done primarily by SQL developers or analysts. Upon completion of this course, you will be able to leverage Looker's modern analytics platform to find and explore relevant content in your organization’s Looker instance, ask questions of your data, create new metrics as needed, and build and share visualizations and dashboards to facilitate data-driven decision making.


In this beginner-level course, you will learn about the Data Analytics workflow on Google Cloud and the tools you can use to explore, analyze, and visualize data and share your findings with stakeholders. Using a case study along with hands-on labs, lectures, and quizzes/demos, the course will demonstrate how to go from raw datasets to clean data to impactful visualizations and dashboards. Whether you already work with data and want to learn how to be successful on Google Cloud, or you’re looking to progress in your career, this course will help you get started. Almost anyone who performs or uses data analysis in their work can benefit from this course.


Complete the introductory Prepare Data for ML APIs on Google Cloud skill badge to demonstrate skills in the following: cleaning data with Dataprep by Trifacta, running data pipelines in Dataflow, creating clusters and running Apache Spark jobs in Dataproc, and calling ML APIs, including the Cloud Natural Language API, Google Cloud Speech-to-Text API, and Video Intelligence API. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
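
As a small illustration of calling one of these ML APIs, the sketch below sends a short text to the Cloud Natural Language API for sentiment analysis. The input text is made up, and application default credentials are assumed.

```python
# Hedged sketch: sentiment analysis with the Cloud Natural Language API.
from google.cloud import language_v1

client = language_v1.LanguageServiceClient()

document = language_v1.Document(
    content="The new dashboard is fast and easy to use.",  # made-up sample text
    type_=language_v1.Document.Type.PLAIN_TEXT,
)
response = client.analyze_sentiment(request={"document": document})
sentiment = response.document_sentiment
print(f"score={sentiment.score:.2f}, magnitude={sentiment.magnitude:.2f}")
```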


In this second installment of the Dataflow course series, we are going to be diving deeper into developing pipelines using the Beam SDK. We start with a review of Apache Beam concepts. Next, we discuss processing streaming data using windows, watermarks, and triggers. We then cover options for sources and sinks in your pipelines, schemas to express your structured data, and how to do stateful transformations using the State and Timer APIs. We move on to reviewing best practices that help maximize your pipeline performance. Towards the end of the course, we introduce SQL and DataFrames for representing your business logic in Beam, and show how to iteratively develop pipelines using Beam notebooks.
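
To make the windowing idea concrete, here is a minimal Beam Python sketch that assigns made-up event timestamps and counts elements in fixed one-minute windows on the local DirectRunner; it is an illustration, not course code.

```python
# Hedged sketch: fixed-window counting with the Apache Beam Python SDK.
import apache_beam as beam
from apache_beam.transforms import window

# Hypothetical events as (key, event-time-in-seconds) pairs.
events = [
    ("checkout", 10), ("checkout", 25),   # fall in the first minute
    ("checkout", 70), ("checkout", 95),   # fall in the second minute
]

with beam.Pipeline() as p:  # DirectRunner by default
    (
        p
        | "Create" >> beam.Create(events)
        | "AddTimestamps" >> beam.Map(
            lambda kv: window.TimestampedValue((kv[0], 1), kv[1]))
        | "FixedWindows" >> beam.WindowInto(window.FixedWindows(60))
        | "CountPerKey" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)  # prints one count per key per window
    )
```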


Complete the intermediate Engineer Data for Predictive Modeling with BigQuery ML skill badge to demonstrate skills in the following: building data transformation pipelines to BigQuery using Dataprep by Trifacta; using Cloud Storage, Dataflow, and BigQuery to build extract, transform, and load (ETL) workflows; and building machine learning models using BigQuery ML. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete the skill badge course, and final assessment challenge lab, to receive a digital badge that you can share with your network.


Complete the introductory Build a Data Mesh with Dataplex skill badge to demonstrate skills in building a data mesh with Dataplex to facilitate data security, governance, and discovery on Google Cloud. You practice and test your skills in tagging assets, assigning IAM roles, and assessing data quality in Dataplex. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.


Complete the intermediate Build a Data Warehouse with BigQuery skill badge to demonstrate skills in the following: joining data to create new tables, troubleshooting joins, appending data with unions, creating date-partitioned tables, and working with JSON, arrays, and structs in BigQuery. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete the skill badge course, and final assessment challenge lab, to receive a digital badge that you can share with your network.
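
The sketch below illustrates two of the listed patterns, creating a date-partitioned table and unnesting an ARRAY of STRUCTs, under the assumption of hypothetical staging and warehouse tables.

```python
# Hedged sketch: date partitioning and nested data in BigQuery.
# All project, dataset, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client()

# Create a date-partitioned table from existing staging data.
client.query("""
CREATE OR REPLACE TABLE `my-project.warehouse.orders_partitioned`
PARTITION BY DATE(order_ts) AS
SELECT * FROM `my-project.staging.orders`
""").result()

# Query nested data: each order row holds an ARRAY<STRUCT> of line items.
sql = """
SELECT o.order_id, item.sku, item.quantity
FROM `my-project.warehouse.orders_partitioned` AS o,
     UNNEST(o.line_items) AS item
WHERE DATE(o.order_ts) = '2024-09-01'
"""
for row in client.query(sql).result():
    print(row.order_id, row.sku, row.quantity)
```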


In the last installment of the Dataflow course series, we will introduce the components of the Dataflow operational model. We will examine tools and techniques for troubleshooting and optimizing pipeline performance. We will then review testing, deployment, and reliability best practices for Dataflow pipelines. We will conclude with a review of Templates, which make it easy to scale Dataflow pipelines to organizations with hundreds of users. These lessons will help ensure that your data platform is stable and resilient to unanticipated circumstances.


This course is part 1 of a 3-course series on Serverless Data Processing with Dataflow. In this first course, we start with a refresher of what Apache Beam is and its relationship with Dataflow. Next, we talk about the Apache Beam vision and the benefits of the Beam Portability framework. The Beam Portability framework achieves the vision that a developer can use their favorite programming language with their preferred execution backend. We then show you how Dataflow allows you to separate compute and storage while saving money, and how identity, access, and management tools interact with your Dataflow pipelines. Lastly, we look at how to implement the right security model for your use case on Dataflow.


Incorporating machine learning into data pipelines increases the ability to extract insights from data. This course covers ways machine learning can be included in data pipelines on Google Cloud. For use cases that need little to no customization, this course covers AutoML. For more tailored machine learning capabilities, this course introduces Notebooks and BigQuery machine learning (BigQuery ML). Also, this course covers how to productionize machine learning solutions by using Vertex AI.
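
As a rough illustration of serving predictions from an already trained BigQuery ML model, the sketch below uses ML.PREDICT over a hypothetical model and input table.

```python
# Hedged sketch: batch prediction with ML.PREDICT in BigQuery ML.
# Model and table names are hypothetical; the model is assumed to exist.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT customer_id, predicted_purchased
FROM ML.PREDICT(
  MODEL `my-project.ecommerce.purchase_model`,
  (SELECT customer_id, country, pageviews, time_on_site
   FROM `my-project.ecommerce.new_sessions`)
)
"""
for row in client.query(sql).result():
    print(row.customer_id, row.predicted_purchased)
```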


Processing streaming data is becoming increasingly popular as streaming enables businesses to get real-time metrics on business operations. This course covers how to build streaming data pipelines on Google Cloud. It describes how Pub/Sub handles incoming streaming data, how to apply aggregations and transformations to streaming data using Dataflow, and how to store processed records in BigQuery or Bigtable for analysis. Learners get hands-on experience building streaming data pipeline components on Google Cloud by using Qwiklabs.
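
A rough streaming sketch in the Beam Python SDK of the Pub/Sub-to-BigQuery flow described above. The subscription, destination table, and message schema are hypothetical, and running it on Dataflow would additionally require the usual pipeline options (project, region, temp location).

```python
# Hedged sketch: read JSON messages from Pub/Sub and append rows to BigQuery.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True  # unbounded source

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/orders-sub")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteBQ" >> beam.io.WriteToBigQuery(
            "my-project:analytics.orders_stream",   # hypothetical table
            schema="order_id:STRING,amount:FLOAT,event_ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```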


Data pipelines typically fall under one of three paradigms: extract and load (EL); extract, load, and transform (ELT); or extract, transform, and load (ETL). This course describes which paradigm to use, and when, for batch data. Furthermore, this course covers several technologies on Google Cloud for data transformation, including BigQuery, executing Spark on Dataproc, pipeline graphs in Cloud Data Fusion, and serverless data processing with Dataflow. Learners get hands-on experience building data pipeline components on Google Cloud using Qwiklabs.
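
To contrast EL and ELT concretely, the hedged sketch below lands raw files in BigQuery unchanged and then runs the transformation as SQL inside the warehouse. All bucket and table names are hypothetical.

```python
# Hedged ELT sketch: "EL" loads raw JSON as-is, "T" runs as SQL in BigQuery.
from google.cloud import bigquery

client = bigquery.Client()

# EL: load raw newline-delimited JSON files into a staging table.
client.load_table_from_uri(
    "gs://my-bucket/raw/events-*.json",        # hypothetical raw files
    "my-project.staging.raw_events",           # hypothetical staging table
    job_config=bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,
    ),
).result()

# T: transform inside BigQuery into a clean, analysis-ready table.
client.query("""
CREATE OR REPLACE TABLE `my-project.analytics.daily_events` AS
SELECT DATE(event_ts) AS event_date, event_type, COUNT(*) AS events
FROM `my-project.staging.raw_events`
GROUP BY event_date, event_type
""").result()
```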


The two key components of any data pipeline are data lakes and warehouses. This course highlights use cases for each type of storage and dives into the available data lake and warehouse solutions on Google Cloud in technical detail. It also describes the role of a data engineer and the benefits of a successful data pipeline to business operations, and examines why data engineering should be done in a cloud environment. This is the first course of the Data Engineering on Google Cloud series. After completing this course, enroll in the Building Batch Data Pipelines on Google Cloud course.


This course helps learners create a study plan for the PDE (Professional Data Engineer) certification exam. Learners explore the breadth and scope of the domains covered in the exam, assess their exam readiness, and create an individual study plan.
