Growing evidence indicates that visual search is influenced not only by perceptual but also by semantic information. Among such semantic influences, categorical effects have been extensively studied. By contrast, the influence of contextual associations, that is, how objects commonly co-occur within specific scene contexts, remains less understood. Here, we orthogonally manipulated contextual and categorical relationships between target and distractor objects in a visual search task, while controlling for perceptual target-distractor similarity. By applying linear regression analyses, we modelled behavioral responses and event-related potentials (ERPs) from contextual, categorical, and perceptual target-distractor similarity. We found that contextual associations guide attention independently during early processing (modulating the N1pc ERP component), subsequently interact with categorical information (modulating the N2pc), and ultimately shape behavioral search performance jointly with perceptual and categorical similarity. These findings demonstrate that contextual object associations influence visual search across multiple processing stages, thereby supporting attentional guidance in everyday visual tasks.
It’s About Time: The Temporal and Modal Dynamics of Copilot Usage
arXiv:2512.11879v1. Abstract: We analyze 37.5 million deidentified conversations with Microsoft's Copilot between January and September 2025. Unlike prior analyses of AI usage,