About
The Transformer is a neural network architecture that has become the dominant approach in NLP. It was proposed in the 2017 paper “Attention Is All You Need” and is designed for sequence-to-sequence tasks, handling long-range dependencies far more effectively than recurrent models.
In this session we will learn about transformers: their architecture, the self-attention mechanism, and their use in the vision and language domains. We will also code a transformer with Azure and PyTorch.
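As a preview of the self-attention mechanism covered in the session, here is a minimal sketch of scaled dot-product attention, the core operation inside a transformer. This is an illustrative NumPy version (the session itself uses PyTorch); the function name and toy dimensions are our own choices, not from any particular library.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # similarity of each query to each key, scaled by sqrt(d_k)
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # softmax over the key axis -> attention weights (each row sums to 1)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # output: weighted average of the value vectors
    return weights @ V, weights

# toy self-attention: 3 tokens, embedding size 4, with Q = K = V = X
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 4))
out, w = scaled_dot_product_attention(X, X, X)
print(out.shape)          # (3, 4): one output vector per token
print(w.sum(axis=-1))     # each token's weights sum to 1
```

In a full transformer, Q, K, and V are separate learned linear projections of the input, and several such attention "heads" run in parallel.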