

Interactive deep-dives where every equation has a visual twin. Watch models train, drag tensors, step through gradients — then read the annotated code.
No prerequisites beyond curiosity and basic Python.
Not lectures. Not textbooks. Hands-on interaction with a real model.

Watch models train in real time. Scrub through thousands of training steps. See patterns emerge as the algorithm discovers structure in data.
Explore
Drag vectors. Adjust parameters. Step through forward passes and backpropagation with real numbers from a real model.
Explore
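Stepping through a forward pass and backpropagation by hand looks roughly like this — a minimal sketch for a single neuron with squared-error loss, where the numbers are illustrative rather than taken from any of the site's actual models:

```python
# One neuron, one training example: z = w*x + b, squared-error loss.
w, b = 0.5, 0.1
x, y = 2.0, 1.0

# Forward pass with concrete numbers.
z = w * x + b            # prediction
loss = (z - y) ** 2      # how far off we are

# Backward pass: apply the chain rule step by step.
dloss_dz = 2 * (z - y)   # gradient through the square
dloss_dw = dloss_dz * x  # gradient through w*x
dloss_db = dloss_dz      # gradient through +b

# One gradient-descent step nudges w and b against their gradients.
lr = 0.1
w -= lr * dloss_dw
b -= lr * dloss_db
```

After this single update, recomputing `w * x + b` lands much closer to the target `y` — the same mechanic the interactive gradient view lets you click through.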
Every algorithm comes with annotated Python + PyTorch source. Hover any line for an explanation. Cross-references link code to the visual it produces.
Explore
Quiz questions for every topic, across multiple difficulty tiers. Earn XP. Track your progress. One chance per question — no guessing.
Explore

Every topic is a multi-layered experience, not a single page.

Every topic starts with a visual narrative — build intuition before equations.
Try it yourself
Not a replacement for everything — but something none of them do.
See the algorithm run
You control the training loop
Real model, real data
Actual PyTorch models on real data
Step through gradients
Click each gradient, see the force
Read production code
Annotated Python you can run
Cost
100% free, forever
Whatever your background, there's a starting point.

“I want to understand ML but math scares me”
Start with The Story tab — every topic begins with a visual narrative. See each concept before the equation appears. No math prerequisites.
Browse Topics
“I know the theory but it doesn't click”
Jump to Inside the Model — inspect actual weight matrices, step through real gradients, and watch training unfold step by step.
Browse Topics
“I use PyTorch daily but want deeper intuition”
Go straight to The Code tab — annotated Python + PyTorch source with cross-references to the visuals it produces.
Browse Topics

From foundations to frontier. Each topic is a complete interactive deep-dive.

Word Embeddings & Skip-gram
See how a neural network learns word meanings by predicting context. Watch embeddings self-organize into semantic clusters in real time.
Start Learning
Attention Is All You Need
Dive into self-attention, multi-head attention, and positional encoding — the backbone of modern LLMs.
Coming Soon

Stuck on a concept?
Every topic includes an AI assistant — ChatGPT or Claude, pre-loaded with your lesson context.
100% free. No signup required to learn.
Sign in with GitHub to save quiz progress and join discussions.
Built with Next.js, D3.js, PyTorch, and Firestore