Stanford CME295 Transformers & LLMs | Autumn 2025 | Lecture 1 - Transformer

Stanford Online October 17, 2025

Video Description

For more information about Stanford’s graduate programs, visit: https://online.stanford.edu/graduate-education

September 26, 2025

This lecture covers:
• Background on NLP and tasks
• Tokenization
• Embeddings
• Word2vec, RNN, LSTM
• Attention mechanism
• Transformer architecture

To follow along with the course schedule and syllabus, visit: https://cme295.stanford.edu/syllabus/

Chapters:
00:00:00 Introduction
00:03:54 Class logistics
00:09:40 NLP overview
00:22:57 Tokenization
00:30:28 Word representation
00:53:23 Recurrent neural networks
01:06:47 Self-attention mechanism
01:13:53 Transformer architecture
01:29:53 Detailed example

Afshine Amidi is an Adjunct Lecturer at Stanford University. Shervine Amidi is an Adjunct Lecturer at Stanford University.

View the course playlist: https://www.youtube.com/playlist?list=PLoROMvodv4rOCXd21gf0CF4xr35yINeOy