The test sentence (neutral, fresh) Let’s take a new sentence, so readers see generality: “Emma thanked David politely.” We will follow one token’s journey (say, “thanked”) through the Transformer. Step 0 — Tokenization (splitting the input) For simplicity (as in the blogs), we treat words as tokens: [Emma] [thanked] [David]…
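The word-level tokenization this teaser describes can be sketched in a few lines of Python. This is the blog's deliberate simplification, not what production models do (real Transformers use subword tokenizers such as BPE or WordPiece); the `tokenize` and `build_vocab` helpers here are toy illustrations:

```python
# Toy word-level tokenizer, matching the blog's simplification.
# Real models use subword schemes (BPE, WordPiece) instead.
def tokenize(sentence):
    # Strip trailing punctuation and split on whitespace.
    return [w.strip(".,!?") for w in sentence.split()]

def build_vocab(tokens):
    # Assign each unique token an integer ID in order of appearance.
    vocab = {}
    for t in tokens:
        vocab.setdefault(t, len(vocab))
    return vocab

tokens = tokenize("Emma thanked David politely.")
vocab = build_vocab(tokens)
ids = [vocab[t] for t in tokens]
print(tokens)  # ['Emma', 'thanked', 'David', 'politely']
print(ids)     # [0, 1, 2, 3]
```

These integer IDs are what the embedding layer in Blog 1's "name tags and profiles" step looks up.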
Month: December 2025
Attention at a Networking Event — Blog 5
No Cheating, Making Choices, and Saying the Next Word (Masking + Output Probabilities) This is the finale of our networking event. By now, every guest has a profile, a seat number, and a clear sense of who in the room matters to them. The room is alive with understanding. But understanding alone is not enough. A language model must do one very specific thing: Say the next word….
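The "no cheating" rule from this finale can be sketched as a causal mask applied before the softmax: position i may only attend to positions up to i, so later tokens get exactly zero weight. The scores below are made-up toy numbers; `masked_softmax` is an illustrative helper, not a library function:

```python
import math

def softmax(xs):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def masked_softmax(scores, position):
    # Causal mask: positions after `position` get -inf,
    # so softmax assigns them exactly zero probability.
    masked = [s if j <= position else float("-inf") for j, s in enumerate(scores)]
    return softmax(masked)

scores = [2.0, 1.0, 0.5]                      # attention scores over 3 guests
weights = masked_softmax(scores, position=1)  # token 1 cannot see token 2
print(weights)                                # third weight is exactly 0.0
```

The same softmax, applied to the model's final scores over the whole vocabulary, is what turns "understanding" into a probability for each candidate next word.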
Attention at a Networking Event — Blog 4
Multiple Spotlights and Private Thinking (Multi-Head Attention + Feed-Forward Network) In Blog 3, the mixer finally came alive. Guests looked around, decided who mattered, and blended what they learned into richer, context-aware selves. Beautiful. But imagine a photographer trying to capture this event with one single spotlight. Some faces would…
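The "private thinking" half of this post, the position-wise feed-forward network, can be sketched with tiny made-up weights: expand, apply ReLU, project back, independently for each token. Multi-head attention, the "multiple spotlights" half, simply runs several attention computations in parallel and concatenates their outputs. The weights below are arbitrary toy values chosen for a clean result:

```python
def feed_forward(x, w1, b1, w2, b2):
    # Position-wise FFN: expand, ReLU, project back.
    # Each weight matrix is given as a list of columns (one per output unit).
    hidden = [max(0.0, sum(xi * wij for xi, wij in zip(x, col)) + b)
              for col, b in zip(w1, b1)]
    return [sum(hi * wij for hi, wij in zip(hidden, col)) + b
            for col, b in zip(w2, b2)]

# One token's vector passing through the FFN with toy weights:
out = feed_forward([1.0, -2.0],
                   w1=[[1.0, 0.0], [0.0, 1.0]], b1=[0.0, 0.0],
                   w2=[[2.0, 0.0], [0.0, 2.0]], b2=[0.0, 0.0])
print(out)  # [2.0, 0.0] — the ReLU zeroed out the negative channel
```

The key point the party metaphor makes: this step is each guest thinking alone; no mixing between tokens happens here.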
Attention at a Networking Event — Blog 3
When the Mixer Finally Comes Alive (Self-Attention: Q, K, V) Until now, our networking event has been adorable but slightly awkward. Everyone is standing politely with their profiles (embeddings) and seat numbers (positional encodings), yet nobody has spoken to anyone. It’s like watching five well-dressed introverts circulating air. But language…
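The moment the mixer "comes alive" is scaled dot-product attention, and it fits in one small function: score the query against every key, softmax the scores into weights, and blend the values. The two-dimensional vectors below are toy inputs for illustration:

```python
import math

def attention(query, keys, values):
    # Scores: how well the query matches each key (dot product),
    # scaled by sqrt(d) to keep the softmax well-behaved.
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    # Softmax over scores -> attention weights.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    weights = [e / sum(exps) for e in exps]
    # Output: weighted blend of the values.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# One guest (query) listening to two others (keys/values).
# The query matches the first key better, so the first value dominates:
out = attention([1.0, 0.0],
                keys=[[1.0, 0.0], [0.0, 1.0]],
                values=[[10.0, 0.0], [0.0, 10.0]])
print(out)
```

In the party metaphor: the query is what a guest is curious about, keys are what others advertise, and values are what they actually share once chosen.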
Attention at a Networking Event — Blog 2
Seat Numbers at the Mixer (Positional Information) In Blog 1, we got our guests into the room and gave them name tags (token IDs) plus mini personality profiles (embeddings). Everyone is officially “representable in numbers.” Nice. But there’s one awkward rule in the Transformer’s party hall: It has no built-in…
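The "seat numbers" this post hands out can be sketched with the sinusoidal scheme from the original Transformer paper: even dimensions use sine, odd dimensions use cosine, at geometrically spaced frequencies, so every position gets a unique, smoothly varying vector. A minimal sketch, not an optimized implementation:

```python
import math

def positional_encoding(position, d_model):
    # Sinusoidal "seat number" vector: even dims sin, odd dims cos,
    # with frequencies falling geometrically across dimensions.
    pe = []
    for i in range(d_model):
        angle = position / (10000 ** (2 * (i // 2) / d_model))
        pe.append(math.sin(angle) if i % 2 == 0 else math.cos(angle))
    return pe

seat0 = positional_encoding(0, 4)
seat1 = positional_encoding(1, 4)
print(seat0)  # [0.0, 1.0, 0.0, 1.0] — position 0 is all sin(0)/cos(0)
print(seat1)  # a distinct vector, so the model can tell seats apart
```

Each seat-number vector is added to the guest's embedding, which is how an order-blind room learns who sat where.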
Attention at a Networking Event — A 5-Part SimplifAIng Mini-Series
Transformers are usually taught like someone dumped tokenization + embeddings + Q/K/V + softmax + ReLU + multi-head into a blender, hit “Turbo,” and called it intuition. If you’ve ever nodded politely while your mind quietly left your body—welcome… In this mini-series, we’ll learn the same mechanics using one friendly…
When Your AI Assistant Becomes a Spy: Lessons from the GeminiJack Incident
Imagine this. Your company has just rolled out a shiny new AI assistant that can read your emails, documents, and calendar so it can answer questions faster. You type: “Show me the latest approved budget numbers for Q4.” The assistant responds with a neat, tidy summary. You skim it, nod, move…