Understanding Attention: A Code-First Journey Through Transformers

Build attention mechanisms from scratch in PyTorch. We'll start with raw tensors and progressively build to multi-head attention, explaining every reshape, transpose, and dimension along the way.
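The core operation the post builds up to is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A minimal sketch of that single-head case, written here in NumPy rather than PyTorch for self-containment (the function name and shapes are illustrative, not from the post):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Single-head attention: softmax(QK^T / sqrt(d_k)) V.

    q, k, v: arrays of shape (seq_len, d_k). Returns the attended
    output and the attention weight matrix.
    """
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)               # (seq_q, seq_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # subtract max for stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v, weights

# Toy example: 4 positions, d_k = 8
rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))
k = rng.normal(size=(4, 8))
v = rng.normal(size=(4, 8))
out, w = scaled_dot_product_attention(q, k, v)
```

Each row of the weight matrix sums to 1, so every output position is a convex combination of the value vectors; multi-head attention repeats this with multiple learned projections.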

The 10% You Should Never Automate

Everyone's asking what AI can do. The better question is what you shouldn't let it do. Frameworks for deciding what to automate and what to protect.

When Should You Build an AI Agent? A Practical Decision Framework

A practical framework for deciding when AI agents make sense for your use case, and when simpler approaches like prompt engineering or RAG work better.

The Making of the Mahatma - M.K. Gandhi's Journey of Truth

Explore the formative influences that shaped Mahatma Gandhi's values. From childhood lessons in honesty to the legendary tales of Harishchandra and Shravana, discover the stories and experiences that forged his commitment to truth and service.

Ai Weiwei – Art, Activism, and the Rebel Spirit

Explore Ai Weiwei's provocative art and fearless activism. From childhood exile to confronting state power, discover how China's most famous dissident artist transforms personal struggle into powerful cultural commentary through iconic works.

Mistral 7B on consumer hardware

Run Mistral 7B locally on Mac with Ollama for fast seed data generation. Learn CLI setup, prompt formatting, and downstream parsing to generate thousands of samples on consumer hardware.

Finding the right words

Understand how LLMs choose words during generation. Learn temperature, top-k, and top-p sampling strategies to balance coherence, diversity, and task-appropriateness in generated text.
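The three strategies the post covers can be combined in one sampling function: temperature rescales the logits, top-k keeps only the k most probable tokens, and top-p (nucleus) keeps the smallest set whose cumulative probability reaches p. A hedged NumPy sketch, not code from the post (the function name and defaults are illustrative):

```python
import numpy as np

def sample_next_token(logits, temperature=1.0, top_k=None, top_p=None, rng=None):
    """Sample one token id from logits with temperature, top-k, and top-p."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=float) / temperature  # sharpen or flatten
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                                    # softmax
    if top_k is not None:
        cutoff = np.sort(probs)[-top_k]                     # k-th largest prob
        probs = np.where(probs >= cutoff, probs, 0.0)
        probs /= probs.sum()
    if top_p is not None:
        order = np.argsort(probs)[::-1]                     # descending by prob
        csum = np.cumsum(probs[order])
        # smallest prefix whose cumulative mass reaches top_p
        keep = order[: np.searchsorted(csum, top_p) + 1]
        nucleus = np.zeros_like(probs)
        nucleus[keep] = probs[keep]
        probs = nucleus / nucleus.sum()
    return int(rng.choice(len(probs), p=probs))

# Greedy decoding falls out as the top_k=1 special case
token = sample_next_token([0.1, 2.0, 0.3], top_k=1)
```

Lower temperatures and smaller k/p favor coherence; higher values favor diversity, which is the trade-off the post walks through.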

Paper Review - Embers of Autoregression

Critical review of LLM limitations in low-probability situations. Explores why AI practitioners should understand autoregressive training pressures before deploying LLMs for tasks requiring precise reasoning or uncommon patterns.
