Thomas Rose
Member since 2021
Gold League
16465 points
This skill badge course helps you unlock the power of data visualization and business intelligence reporting with Looker, and gain hands-on experience through labs.
Complete the intermediate Build a Data Warehouse with BigQuery skill badge course to demonstrate skills in the following: joining data to create new tables, troubleshooting joins, appending data with unions, creating date-partitioned tables, and working with JSON, arrays, and structs in BigQuery. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services; it is earned by completing an assessment in an interactive hands-on environment that tests your ability to apply your knowledge. Complete this skill badge course, and the final assessment challenge lab, to receive a digital badge that you can share with your network.
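As a rough illustration of the kind of hands-on work this badge covers, the sketch below uses the google-cloud-bigquery Python client to create a date-partitioned table with an ARRAY of STRUCTs and then query it with UNNEST. The dataset and table names are hypothetical, not taken from the course labs.

```python
# Illustrative only: dataset/table names ("mydataset.orders_partitioned") are
# hypothetical and not part of the course materials.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

# Create a date-partitioned table with an ARRAY<STRUCT> column.
ddl = """
CREATE TABLE IF NOT EXISTS mydataset.orders_partitioned (
  order_id STRING,
  order_date DATE,
  items ARRAY<STRUCT<sku STRING, qty INT64>>
)
PARTITION BY order_date
"""
client.query(ddl).result()  # wait for the DDL job to finish

# Query one partition, unnesting the repeated struct field into rows.
sql = """
SELECT o.order_id, i.sku, i.qty
FROM mydataset.orders_partitioned AS o, UNNEST(o.items) AS i
WHERE o.order_date = '2024-01-01'
"""
for row in client.query(sql).result():
    print(row.order_id, row.sku, row.qty)
```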
Complete the introductory Derive Insights from BigQuery Data skill badge course to demonstrate skills in the following: writing SQL queries, querying public tables, loading sample data into BigQuery, troubleshooting common syntax errors with the query validator in BigQuery, and creating reports in Looker Studio by connecting to BigQuery data. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services; it is earned by completing an assessment in an interactive hands-on environment that tests your ability to apply your knowledge. Complete this skill badge course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
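For a flavor of the introductory material, the snippet below (not taken from the course) queries a BigQuery public table with the google-cloud-bigquery Python client; a Looker Studio report could connect to the same data.

```python
# A minimal sketch: run SQL against a public BigQuery dataset.
from google.cloud import bigquery

client = bigquery.Client()

sql = """
SELECT name, SUM(number) AS total
FROM `bigquery-public-data.usa_names.usa_1910_2013`
WHERE state = 'TX'
GROUP BY name
ORDER BY total DESC
LIMIT 5
"""
for row in client.query(sql).result():
    print(f"{row.name}: {row.total}")
```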
Data pipelines typically fall under one of the Extract and Load (EL), Extract, Load and Transform (ELT) or Extract, Transform and Load (ETL) paradigms. This course describes which paradigm should be used and when for batch data. Furthermore, this course covers several technologies on Google Cloud for data transformation including BigQuery, executing Spark on Dataproc, pipeline graphs in Cloud Data Fusion and serverless data processing with Dataflow. Learners get hands-on experience building data pipeline components on Google Cloud using Qwiklabs.
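As a hedged illustration of the batch ETL pattern described here, the sketch below builds a tiny Apache Beam pipeline of the kind that runs on Dataflow. The Cloud Storage paths are placeholders and the code is not taken from the labs.

```python
# A minimal, hypothetical Apache Beam batch pipeline (Extract -> Transform -> Load).
# Run locally with the DirectRunner or on Google Cloud with the DataflowRunner.
import apache_beam as beam

def parse_csv_line(line):
    """Transform step: split a CSV row into a (key, value) pair."""
    fields = line.split(",")
    return fields[0], float(fields[1])

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Extract" >> beam.io.ReadFromText("gs://my-bucket/sales.csv")  # placeholder path
        | "Transform" >> beam.Map(parse_csv_line)
        | "SumPerKey" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda k, v: f"{k},{v}")
        | "Load" >> beam.io.WriteToText("gs://my-bucket/output/sales_totals")  # placeholder path
    )
```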
The two key components of any data pipeline are data lakes and warehouses. This course highlights use-cases for each type of storage and dives into the available data lake and warehouse solutions on Google Cloud in technical detail. Also, this course describes the role of a data engineer, the benefits of a successful data pipeline to business operations, and examines why data engineering should be done in a cloud environment. This is the first course of the Data Engineering on Google Cloud series. After completing this course, enroll in the Building Batch Data Pipelines on Google Cloud course.
In this course, we see what the common challenges faced by data analysts are and how to solve them with the big data tools on Google Cloud. You’ll pick up some SQL along the way and become very familiar with using BigQuery and Dataprep to analyze and transform your datasets. This is the first course of the From Data to Insights with Google Cloud series. After completing this course, enroll in the Creating New BigQuery Datasets and Visualizing Insights course.
This course helps learners create a study plan for the PDE (Professional Data Engineer) certification exam. Learners explore the breadth and scope of the domains covered in the exam. Learners assess their exam readiness and create their individual study plan.
Welcome to Design in BigQuery, where we map Enterprise Data Warehouse concepts and components to BigQuery and Google data services with a focus on schema design.
This course discusses the key elements of Google's Data Warehouse solution portfolio and strategy.
Earn a skill badge by passing the final quiz; in doing so, you'll demonstrate your understanding of foundational concepts in generative AI. A skill badge is a digital badge issued by Google Cloud in recognition of your knowledge of Google Cloud products and services. Share your skill badge by making your profile public and adding it to your social media profile.
This course will help ML Engineers, Developers, and Data Scientists implement Large Language Models for Generative AI use cases with Vertex AI. The first two modules of this course contain links to videos and prerequisite course materials that will build your knowledge foundation in Generative AI. Please do not skip these modules. The advanced modules in this course assume you have completed these earlier modules.
As organizations increasingly adopt AI and machine learning, building these technologies responsibly becomes ever more important. For many organizations, however, putting Responsible AI into practice is not easy. If you are interested in learning how to operationalize Responsible AI in your own organization, this course is for you. It introduces how Google Cloud practices Responsible AI today, along with the best practices and lessons learned, so that you can use them as a framework for building your own Responsible AI approach.
Want to turn your marketing data into insights and build dashboards? Bring all of your data into one place for large-scale analysis and model building. Get repeatable, scalable, and valuable insights into your data by learning how to query it with BigQuery. BigQuery is Google's fully managed, NoOps, low-cost analytics database. With BigQuery you can query terabytes and terabytes of data without having any infrastructure to manage or needing a database administrator. BigQuery uses SQL and can take advantage of the pay-as-you-go model. BigQuery lets you focus on analyzing data to find meaningful insights.
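To make the pay-as-you-go point concrete, the sketch below (an illustration, not course material) uses a BigQuery dry run to estimate how many bytes a query would scan before running it; the project and table names are hypothetical.

```python
# Estimate query cost with a dry run: nothing is executed, but BigQuery reports
# how many bytes the query would process, which is what on-demand pricing bills.
from google.cloud import bigquery

client = bigquery.Client()
job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

sql = "SELECT channel, SUM(spend) FROM `my_project.marketing.ad_spend` GROUP BY channel"  # hypothetical table
job = client.query(sql, job_config=job_config)

print(f"This query would scan about {job.total_bytes_processed / 1e9:.2f} GB")
```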
The Explore Generative AI with Vertex AI course brings together a set of labs that guide you through applying generative AI on Google Cloud. In these labs, you learn how to use the Vertex AI PaLM API family of models, including text-bison, chat-bison, and textembedding-gecko. You also learn about prompt design and best practices, and how to use generative AI for tasks such as ideation, text classification, text extraction, and text summarization. Finally, you learn how to tune a foundation model with Vertex AI custom training and deploy it to a Vertex AI endpoint.
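For orientation, the sketch below shows roughly how the labs call the text-bison model through the Vertex AI Python SDK; the project ID is a placeholder, the exact lab code will differ, and the PaLM model family has since been superseded by Gemini on Vertex AI.

```python
# Hedged sketch: generate text with the PaLM text-bison model via the Vertex AI SDK.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="my-project", location="us-central1")  # hypothetical project

model = TextGenerationModel.from_pretrained("text-bison")
response = model.predict(
    "Suggest three names for a bakery that specializes in sourdough.",
    temperature=0.7,
    max_output_tokens=128,
)
print(response.text)
```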
This course introduces Vertex AI Studio, a tool for interacting with generative AI models, prototyping business ideas, and launching them to production. Through immersive use cases, engaging lessons, and hands-on labs, you explore the full prompt-to-product lifecycle and learn how to use Vertex AI Studio for multimodal Gemini applications, prompt design, prompt engineering, and model tuning. The goal of this course is to help you unlock the potential of generative AI in your own projects with Vertex AI Studio.
This course teaches you how to create an image captioning model by using deep learning. You learn about the different components of an image captioning model, such as the encoder and decoder, and how to train and evaluate your model. By the end of this course, you will be able to create your own image captioning models and use them to generate captions for images.
This course gives you a synopsis of the encoder-decoder architecture, a powerful and prevalent machine learning architecture for sequence-to-sequence tasks such as machine translation, text summarization, and question answering. You learn about the main components of the encoder-decoder architecture and how to train and serve these models. In the corresponding lab walkthrough, you code a simple implementation of the encoder-decoder architecture in TensorFlow from scratch for poetry generation.
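The sketch below is a bare-bones Keras encoder-decoder in the spirit of that lab walkthrough, not the lab's actual code; the vocabulary size and layer dimensions are arbitrary placeholders.

```python
# Minimal encoder-decoder sketch in TensorFlow/Keras.
import tensorflow as tf

VOCAB_SIZE, EMBED_DIM, HIDDEN_DIM = 5000, 128, 256  # placeholder values

# Encoder: embed the source sequence and compress it into a final hidden state.
encoder_inputs = tf.keras.Input(shape=(None,), dtype="int32")
enc_emb = tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM)(encoder_inputs)
_, enc_state = tf.keras.layers.GRU(HIDDEN_DIM, return_state=True)(enc_emb)

# Decoder: generate the target sequence, conditioned on the encoder state.
decoder_inputs = tf.keras.Input(shape=(None,), dtype="int32")
dec_emb = tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM)(decoder_inputs)
dec_out = tf.keras.layers.GRU(HIDDEN_DIM, return_sequences=True)(dec_emb, initial_state=enc_state)
logits = tf.keras.layers.Dense(VOCAB_SIZE)(dec_out)

model = tf.keras.Model([encoder_inputs, decoder_inputs], logits)
model.compile(optimizer="adam",
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
model.summary()
```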
This course introduces you to diffusion models, a family of machine learning models that have recently shown great promise in image generation. Diffusion models draw inspiration from physics, specifically thermodynamics. Within the last few years, diffusion models have become a popular research topic and gained traction across the industry; many state-of-the-art image generation models and tools on Google Cloud are built on them. This course introduces you to the theory behind diffusion models and how to train and deploy them on Vertex AI.
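As a taste of the theory, the numpy sketch below (not course code) implements the standard forward noising process of a diffusion model with a linear beta schedule; training a denoising model then learns to invert these steps.

```python
# Forward (noising) process: x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * noise
import numpy as np

T = 1000                                   # number of diffusion steps
betas = np.linspace(1e-4, 0.02, T)         # linear noise schedule
alpha_bars = np.cumprod(1.0 - betas)       # cumulative product of (1 - beta_t)

def noise_image(x0, t, rng=np.random.default_rng()):
    """Sample x_t from q(x_t | x_0) for an image array x0 at step t."""
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps

x0 = np.zeros((32, 32))              # placeholder "image"
print(noise_image(x0, t=500).std())  # roughly sqrt(1 - alpha_bar_500)
```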
Complete the Introduction to Generative AI, Introduction to Large Language Models, and Introduction to Responsible AI courses to earn a skill badge. By passing the final quiz, you demonstrate your understanding of foundational concepts in generative AI. A skill badge is a digital badge issued by Google Cloud in recognition of your knowledge of Google Cloud products and services. Share your skill badge by making your profile public and adding it to your social media profile.
This is an introductory-level microlearning course aimed at explaining what responsible AI is, why it is important, and how Google implements responsible AI in its products. It also introduces Google's 7 AI Principles.
This course introduces you to the Transformer architecture and the Bidirectional Encoder Representations from Transformers (BERT) model. You learn about the main components of the Transformer architecture, such as the self-attention mechanism, and how it is used to build the BERT model. You also learn about the different tasks that BERT can be used for, such as text classification, question answering, and natural language inference. This course is estimated to take approximately 45 minutes to complete.
This course introduces you to the attention mechanism, a powerful technique that allows neural networks to focus on specific parts of an input sequence. You learn how attention works and how it can be used to improve the performance of a variety of machine learning tasks, including machine translation, text summarization, and question answering.
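The numpy sketch below (an illustration, not course material) shows the core computation behind attention, scaled dot-product attention: each query scores all keys, the scores are softmaxed, and the result weights the values.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d), K: (n_k, d), V: (n_k, d_v) -> output of shape (n_q, d_v)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                       # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)      # softmax over the keys
    return weights @ V                                  # weighted sum of the values

rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(4, 8)), rng.normal(size=(6, 8)), rng.normal(size=(6, 16))
print(scaled_dot_product_attention(Q, K, V).shape)      # (4, 16)
```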
This is an introductory-level microlearning course that explores what large language models (LLMs) are, the use cases where they can be utilized, and how you can use prompt tuning to enhance LLM performance. It also covers the Google tools that can help you develop your own Gen AI apps.
This is an introductory-level microlearning course aimed at explaining what generative AI is, how it is used, and how it differs from traditional machine learning methods. It also covers the Google tools that can help you develop your own generative AI apps.
Complete the introductory Build LookML Objects in Looker skill badge to demonstrate skills in the following: building new dimensions and measures, views, and derived tables; setting measure filters and types based on requirements; updating dimensions and measures; building and refining Explores; joining views to existing Explores; and deciding which LookML objects to create based on business requirements.
Complete the intermediate Manage Data Models in Looker skill badge to demonstrate skills in the following: maintaining LookML project health; utilizing SQL runner for data validation; employing LookML best practices; optimizing queries and reports for performance; and implementing persistent derived tables and caching policies. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a digital badge that you can share with your network.
This course introduces the Google Cloud big data and machine learning products and services that support the data-to-AI lifecycle. It explores the processes, challenges, and benefits of building a big data pipeline and machine learning models with Vertex AI on Google Cloud.
Explore the Looker platform and examine the extended capabilities offered through BigQuery.
In this course, you shadow a series of client meetings led by a Looker Professional Services Consultant.
By the end of this course, you should feel confident employing technical concepts to fulfill business requirements and be familiar with common complex design patterns.
In this course you will discover additional tools for your toolbox for working with complex deployments, building robust solutions, and delivering even more value.
Develop technical skills beyond LookML, along with basic administration, for optimizing Looker instances.
This course reviews the processes for creating table calculations, pivots, and visualizations.
This course is designed for Looker users who want to create their own ad hoc reports. It assumes experience with everything covered in our Get Started with Looker course (logging in, finding Looks and dashboards, adjusting filters, and sending data).
In this course you will discover Liquid, the templating language invented by Shopify and explore how it can be used in Looker to create dynamic links, content, formatting, and more.
This hands-on course covers the main uses of extends, the three primary LookML objects that extends are used on, and some advanced usage of extends.
This course is designed to teach you about roles, permission sets, and model sets, which are used together to manage what users can do and what they can see in Looker.
This course aims to introduce you to the basic concepts of Git: what it is and how it's used in Looker. You will also develop an in-depth knowledge of caching on the Looker platform, including why caches are used and how they work.
This course provides an introduction to databases and summarizes the differences among the main database technologies. It will also introduce you to Looker and how Looker scales as a modern data platform. In the lessons, you will build and maintain standard Looker data models and establish the foundation necessary to learn Looker's more advanced features.
This course provides an iterative approach to plan, build, launch, and grow a modern, scalable, mature analytics ecosystem and data culture in an organization that consistently achieves established business outcomes. Users will also learn how to design and build a useful, easy-to-use dashboard in Looker. It assumes experience with everything covered in our Getting Started with Looker and Building Reports in Looker courses.
In this course, we'll show you how organizations are aligning their BI strategy to most effectively achieve business outcomes with Looker. We'll follow four iterative steps, Plan, Build, Launch, and Grow, and provide resources that you can take into your own services delivery to build Looker with the goal of achieving business outcomes.
By the end of this course, you should be able to articulate Looker's value propositions and what makes it different from other analytics tools in the market. You should also be able to explain how Looker works, and explain the standard components of successful service delivery.
This course empowers you to develop scalable, performant LookML (Looker Modeling Language) models that provide your business users with the standardized, ready-to-use data that they need to answer their questions. Upon completing this course, you will be able to start building and maintaining LookML models to curate and manage data in your organization’s Looker instance.
In this course, you learn how to do the kind of data exploration and analysis in Looker that would formerly be done primarily by SQL developers or analysts. Upon completion of this course, you will be able to leverage Looker's modern analytics platform to find and explore relevant content in your organization’s Looker instance, ask questions of your data, create new metrics as needed, and build and share visualizations and dashboards to facilitate data-driven decision making.
In this course, you will get hands-on experience applying advanced LookML concepts in Looker. You will learn how to use Liquid to customize and create dynamic dimensions and measures, create dynamic SQL derived tables and customized native derived tables, and use extends to modularize your LookML code.
Complete the introductory Prepare Data for Looker Dashboards and Reports skill badge course to demonstrate skills in the following: filtering, sorting, and pivoting data; merging results from different Looker Explores; and using functions and operators to build Looker dashboards and reports for data analysis and visualization. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services; it is earned by completing an assessment in an interactive hands-on environment that tests your ability to apply your knowledge. Complete this skill badge course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
In this quest, you will get hands-on experience with LookML in Looker. You will learn how to write LookML code to create new dimensions and measures, create derived tables and join them to Explores, filter Explores, and define caching policies in LookML.