Intelligence isn’t computed; it’s compressed. Starting from a single Gaussian blur kernel, we trace how the fundamental operation of context-gathering produces a surplus of information that must be discarded to become useful. Through demonstrations of convolution, pooling, and attention mechanisms, we show that “understanding” is precisely the art of forgetting the right things. Context windows, whether measured in pixels, neurons, or tokens, always overflow, and architectures for intelligence are, at their core, compression schemes for neighborhood relationships.
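The gather-then-forget pipeline described above can be sketched in a few lines. This is an illustrative example, not the document's own code: a 1-D Gaussian kernel mixes each sample with its neighbors (context-gathering), and max pooling then keeps one value per window and discards the rest (compression). The kernel size, sigma, and pool width are arbitrary choices for the demonstration.

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """Normalized 1-D Gaussian: each output mixes `size` neighboring samples."""
    x = np.arange(size) - size // 2
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def convolve(signal, kernel):
    """Context-gathering: every sample becomes a weighted neighborhood average."""
    return np.convolve(signal, kernel, mode="valid")

def max_pool(signal, width=4):
    """Compression: keep the strongest value per window, forget the rest."""
    trimmed = signal[: len(signal) // width * width]
    return trimmed.reshape(-1, width).max(axis=1)

# A noisy sine wave stands in for any high-dimensional input.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.3 * rng.standard_normal(256)

blurred = convolve(signal, gaussian_kernel())  # 256 samples -> 252
pooled = max_pool(blurred)                     # 252 samples -> 63
print(len(signal), len(blurred), len(pooled))  # prints: 256 252 63
```

Each stage widens what a value "knows about" while shrinking how many values survive, which is the surplus-then-destruction cycle the abstract refers to.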