The Curriculum
From foundations to frontier. Each topic is a complete interactive deep-dive with visuals, code, quizzes, and community.
Foundations
Core building blocks of modern ML

Word2Vec
Word Embeddings & Skip-gram · Beginner · ~35 min · 5 tabs · 6 chapters · 15 quizzes · Available
See how a neural network learns word meanings by predicting context. Watch embeddings self-organize into semantic clusters in real time.
- Understand one-hot encoding and why dense vectors are better
- Build a skip-gram model from scratch in PyTorch
- Visualize how word embeddings self-organize into semantic clusters
- Perform vector arithmetic like king − man + woman ≈ queen
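The skip-gram objective above can be sketched in a few lines. This toy version uses NumPy with a full softmax for clarity (the lesson itself builds the PyTorch version); the ten-word corpus, dimensions, and learning rate are all hypothetical:

```python
import numpy as np

# Hypothetical toy corpus; context window of 1 word on each side.
corpus = "the king rules the land the queen rules the land".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8  # vocab size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(0, 0.1, (V, D))   # center-word embeddings (the vectors we keep)
W_out = rng.normal(0, 0.1, (D, V))  # output weights for predicting context

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# (center, context) training pairs from the window
pairs = [(idx[corpus[i]], idx[corpus[j]])
         for i in range(len(corpus))
         for j in (i - 1, i + 1) if 0 <= j < len(corpus)]

lr = 0.1
for epoch in range(200):
    for c, o in pairs:
        h = W_in[c]                  # "hidden layer" is just the center embedding
        p = softmax(h @ W_out)       # predicted distribution over context words
        grad = p.copy()
        grad[o] -= 1.0               # cross-entropy gradient: p - one_hot(context)
        W_out -= lr * np.outer(h, grad)
        W_in[c] -= lr * (W_out @ grad)
```

After training, `softmax(W_in[idx["king"]] @ W_out)` concentrates its mass on the words that actually neighbor "king" in the corpus; real systems replace the full softmax with negative sampling so the inner loop doesn't scale with vocabulary size.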

RNN
Recurrent Neural Networks · Beginner · ~40 min · Coming Soon
Watch how networks process sequences one step at a time, carrying hidden state forward through time.
- Understand how hidden state carries information across time steps
- Visualize vanishing and exploding gradients in deep sequences
- Build an RNN language model that generates text character by character
Prereqs: word2vec
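The "one step at a time" loop above is the whole trick: a single cell reused at every timestep, with the hidden state as the only memory. A minimal Elman-style cell in NumPy (sizes are hypothetical; the lesson builds the character-level PyTorch version):

```python
import numpy as np

rng = np.random.default_rng(0)
D_in, D_h = 4, 8  # input size, hidden size (hypothetical)
W_xh = rng.normal(0, 0.1, (D_h, D_in))  # input-to-hidden weights
W_hh = rng.normal(0, 0.1, (D_h, D_h))   # hidden-to-hidden (recurrent) weights
b_h = np.zeros(D_h)

def step(x, h):
    # New hidden state mixes the current input with the previous state.
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

# Process a sequence of 5 inputs one step at a time.
xs = rng.normal(size=(5, D_in))
h = np.zeros(D_h)  # initial hidden state
states = []
for x in xs:
    h = step(x, h)  # the same h is carried forward through time
    states.append(h)
```

Backpropagating through this loop multiplies gradients by `W_hh` once per timestep, which is exactly why they vanish or explode over long sequences.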
Architecture
The architectures powering modern AI

Transformers
Attention Is All You Need · Intermediate · ~50 min · Coming Soon
Dive into self-attention, multi-head attention, and positional encoding — the backbone of modern LLMs.
- Understand self-attention and why it replaced recurrence
- Visualize how multi-head attention attends to different relationships
- Walk through positional encoding and layer normalization
- Trace a token through every layer of a Transformer encoder
Prereqs: word2vec, rnn
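Self-attention itself is compact enough to sketch directly. A single-head scaled dot-product attention in NumPy (sizes hypothetical; the lesson covers the multi-head version and positional encoding):

```python
import numpy as np

rng = np.random.default_rng(0)
T, D = 4, 16  # sequence length, model dimension (hypothetical)
X = rng.normal(size=(T, D))  # one embedding per token

# Learned projections to queries, keys, and values.
W_q, W_k, W_v = (rng.normal(0, 0.1, (D, D)) for _ in range(3))
Q, K, V = X @ W_q, X @ W_k, X @ W_v

# Every token scores every other token in parallel -- no recurrence.
scores = Q @ K.T / np.sqrt(D)  # scale keeps softmax out of saturation
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax

out = weights @ V  # each output is a weighted mix of all value vectors
```

Each row of `weights` sums to 1 and says how much that token attends to every position; multi-head attention simply runs several smaller copies of this in parallel and concatenates the results.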
Diffusion
Diffusion Models & Image Generation · Intermediate · ~45 min · Coming Soon
Step through the denoising process that turns pure noise into images, one timestep at a time.
- Understand the forward process: how images become noise
- Visualize the reverse denoising process step by step
- Explore how U-Net predicts noise at each timestep
- See how classifier-free guidance steers image generation
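The forward process has a convenient closed form: you can jump straight to any noise level t without iterating, via x_t = sqrt(ᾱ_t)·x_0 + sqrt(1 − ᾱ_t)·ε. A NumPy sketch with a hypothetical linear beta schedule (a real model pairs this with the learned U-Net reverse process):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000
betas = np.linspace(1e-4, 0.02, T)    # hypothetical linear noise schedule
alpha_bar = np.cumprod(1.0 - betas)   # cumulative signal-retention factor

def q_sample(x0, t, eps):
    """Sample x_t from q(x_t | x_0) in one shot, no loop over timesteps."""
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

x0 = rng.normal(size=(8, 8))       # stand-in for an image
eps = rng.normal(size=x0.shape)    # the noise the U-Net learns to predict
x_early = q_sample(x0, 10, eps)    # early step: still mostly signal
x_late = q_sample(x0, T - 1, eps)  # final step: almost pure noise
```

Since ᾱ_T is tiny by the last step, `x_late` is nearly indistinguishable from `eps`, which is what lets generation start from pure Gaussian noise and denoise backwards.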
Join the Community
Ask questions, share insights, and discuss across all topics with fellow learners.
Browse Discussions