Prediction is foundational to theories of perceptual processing and learning, in both neuroscience and AI. However, it remains unclear whether prediction occurs routinely during naturalistic perception, and what level of abstraction the brain predicts. Here, we address both questions by analysing 7T fMRI recordings of humans viewing 73,000 natural images. We use deep generative models to quantify spatial predictability at multiple levels of abstraction, and relate these to retinotopically precise responses across V1-V4, while rigorously controlling for local image features. This reveals that, even during natural scene viewing, responses throughout visual cortex are modulated by spatial predictability, with more predictable inputs evoking weaker responses. In central vision, we observe a hierarchy of predictions that parallels the feature-encoding gradient: V1 is most sensitive to low-level unpredictability, with later areas progressively sensitive to higher-level unpredictability — diverging from recent proposals, in both neuroscience and AI, that prediction operates primarily at higher levels of abstraction. At higher eccentricities, prediction effects are amplified but even V1 is tuned to high-level predictability, consistent with these prior accounts. Together, these results suggest that the visual system implements distinct prediction regimes across the visual field, thereby reconciling conflicting accounts of what visual cortex predicts.