
Emeka Agomoh

Member since 2024

Bronze League

1350 points
Encoder-Decoder Architecture, earned May 25, 2024 EDT
Attention Mechanism, earned May 25, 2024 EDT

This course gives you a synopsis of the encoder-decoder architecture, a powerful and prevalent machine learning architecture for sequence-to-sequence tasks such as machine translation, text summarization, and question answering. You learn about the main components of the encoder-decoder architecture and how to train and serve these models. In the corresponding lab walkthrough, you'll code a simple implementation of the encoder-decoder architecture in TensorFlow from scratch, applied to poetry generation.

Learn more
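As a rough illustration of the kind of model the course covers, here is a minimal TensorFlow/Keras sketch of a GRU-based encoder-decoder. The vocabulary size, embedding size, and hidden size are illustrative assumptions, not values from the course lab.

```python
import tensorflow as tf
from tensorflow.keras import layers

vocab_size = 5000   # assumed vocabulary size
embed_dim = 128     # assumed embedding size
hidden_dim = 256    # assumed GRU state size

# Encoder: embeds the source tokens and compresses the sequence
# into a single final hidden state.
encoder_inputs = layers.Input(shape=(None,), dtype="int32")
enc_emb = layers.Embedding(vocab_size, embed_dim)(encoder_inputs)
_, enc_state = layers.GRU(hidden_dim, return_state=True)(enc_emb)

# Decoder: starts from the encoder's final state and predicts the
# target sequence one token at a time (teacher forcing during training).
decoder_inputs = layers.Input(shape=(None,), dtype="int32")
dec_emb = layers.Embedding(vocab_size, embed_dim)(decoder_inputs)
dec_out = layers.GRU(hidden_dim, return_sequences=True)(dec_emb, initial_state=enc_state)
logits = layers.Dense(vocab_size)(dec_out)

model = tf.keras.Model([encoder_inputs, decoder_inputs], logits)
model.compile(
    optimizer="adam",
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
model.summary()
```

The decoder here conditions on the source sentence only through the single encoder state, which is exactly the bottleneck that the attention mechanism (covered by the next course) is designed to relieve.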

This course introduces the attention mechanism, a powerful technique that allows neural networks to focus on specific parts of an input sequence. You will learn how attention works and how it can improve the performance of a variety of machine learning tasks, including machine translation, text summarization, and question answering. The course takes approximately 45 minutes to complete.

Learn more
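As a rough illustration, here is a minimal sketch of scaled dot-product attention, one common way to implement the mechanism: each query scores every input position, and a softmax over the scores decides how much each value contributes to the output. The shapes and toy tensors below are assumptions for demonstration only.

```python
import tensorflow as tf

def scaled_dot_product_attention(query, key, value):
    # query: (..., q_len, d); key, value: (..., k_len, d)
    d = tf.cast(tf.shape(key)[-1], tf.float32)
    scores = tf.matmul(query, key, transpose_b=True) / tf.sqrt(d)  # (..., q_len, k_len)
    weights = tf.nn.softmax(scores, axis=-1)   # attention distribution over input positions
    return tf.matmul(weights, value), weights  # weighted sum of values, plus the weights

# Toy example: one decoder query attending over four encoder positions.
q = tf.random.normal([1, 1, 8])
k = tf.random.normal([1, 4, 8])
v = tf.random.normal([1, 4, 8])
context, attn = scaled_dot_product_attention(q, k, v)
print(context.shape, attn.shape)  # (1, 1, 8) (1, 1, 4)
```

In an attention-augmented encoder-decoder, the decoder computes such a context vector at every step, so it can draw on all encoder positions rather than a single compressed state.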