gpt — 2 articles
Karpathy's Ultimate Reduction: 243 Lines of Pure Python, Zero Dependencies, Train a GPT From Scratch
Karpathy's 'art project': a GPT model in 243 lines of pure Python with zero dependencies. Every operation is built from atomic math (add, multiply, exp, log); efficiency is explicitly secondary. It's the nand2tetris of AI education.
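For flavor, here is a minimal sketch of what "atomic math only" can look like: a softmax assembled purely from add, multiply, exp, and log, with division rewritten as a * exp(-log(b)). This is an illustrative assumption in that spirit, not code from the project itself:

```python
import math

def softmax(logits):
    # Exponentiate each logit (atomic exp).
    exps = [math.exp(x) for x in logits]

    # Accumulate the normalizer with repeated atomic adds.
    total = 0.0
    for e in exps:
        total = total + e

    # Division expressed via atomic ops: 1 / b == exp(-log(b)).
    inv_total = math.exp(-math.log(total))

    # Scale each term with an atomic multiply.
    return [e * inv_total for e in exps]

probs = softmax([1.0, 2.0, 3.0])
```

The resulting probabilities sum to 1 and preserve the ordering of the logits, exactly as a conventional softmax would, just without ever invoking a division or vector primitive.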
AI Time Capsule: Karpathy Grades 10-Year-Old HN Predictions with GPT
Karpathy used GPT 5.1 to analyze decade-old Hacker News threads and find out who actually predicted the future (◕‿◕)