arXiv:2601.17523v1 Announce Type: new
Abstract: Sleep is thought to support memory consolidation and the recovery of an optimal energetic regime by reorganizing synaptic connectivity, yet how plasticity across hierarchical brain circuits contributes to abstraction and energy efficiency remains unclear. Here we study a spiking multi-layer network alternating between wake-like and deep-sleep-like states, with state-dependent dendritic integration and synaptic plasticity in a biologically inspired thalamo-cortical framework. During wakefulness, the model learns from a few perceived examples, while during deep sleep it undergoes spontaneous replay driven by slow oscillations. Enabling plasticity not only in intra-layer connections but also in inter-layer pathways proves critical for memory consolidation and energetic downshift. Compared to restricted plasticity, full inter-layer plasticity yields higher post-sleep visual classification accuracy and promotes the emergence of sharper class-specific associations. Furthermore, we introduce a biophysically grounded estimator of metabolic power that expresses network energy consumption in ATP units, partitioned into baseline, synaptic maintenance, action potential, and transmission costs. We find that inter-layer plasticity during sleep leads to a larger reduction in firing rates, synaptic strength, and synaptic activity, and hence a substantially larger decrease in power consumption. This work suggests promising elements to be integrated into neuromorphic, energy-efficient AI learning systems, supported by brain-state-specific apical mechanisms.
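As a rough illustration of how such an ATP-based power estimator can be organized, the Python/NumPy sketch below partitions total energy into the four components named in the abstract (baseline, synaptic maintenance, action potential, and transmission costs). All constants, function names, and the fan-out-based transmission rule here are illustrative assumptions, not the paper's calibrated values or method.

    import numpy as np

    # Hypothetical per-event ATP costs; placeholder magnitudes only,
    # NOT the paper's biophysically calibrated constants.
    ATP_BASELINE_PER_NEURON_PER_S = 3.4e8      # resting-state upkeep per neuron
    ATP_MAINTENANCE_PER_UNIT_WEIGHT_PER_S = 2.0e5  # synaptic upkeep, weight-scaled
    ATP_PER_SPIKE = 2.4e9                      # action potential cost per spike
    ATP_PER_TRANSMISSION = 1.6e5               # cost per presynaptic spike per synapse

    def metabolic_power_atp(rates_hz, weights, duration_s=1.0):
        """Estimate network energy use in ATP units over `duration_s`.

        rates_hz : 1D array of per-neuron firing rates (Hz).
        weights  : 2D array, weights[i, j] = synapse from neuron j to neuron i.
        Returns a dict with the four cost components and their total.
        """
        n_neurons = rates_hz.size
        # Baseline: fixed cost per neuron, independent of activity.
        baseline = n_neurons * ATP_BASELINE_PER_NEURON_PER_S * duration_s
        # Maintenance: scales with total synaptic strength
        # (stronger synapses are assumed costlier to maintain).
        maintenance = (ATP_MAINTENANCE_PER_UNIT_WEIGHT_PER_S
                       * np.abs(weights).sum() * duration_s)
        # Action potentials: one fixed cost per emitted spike.
        spikes = rates_hz * duration_s
        action_potential = ATP_PER_SPIKE * spikes.sum()
        # Transmission: each presynaptic spike drives its outgoing synapses.
        fan_out = (np.abs(weights) > 0).sum(axis=0)  # synapses per presynaptic neuron
        transmission = ATP_PER_TRANSMISSION * (spikes * fan_out).sum()
        total = baseline + maintenance + action_potential + transmission
        return {"baseline": baseline, "maintenance": maintenance,
                "action_potential": action_potential,
                "transmission": transmission, "total": total}

    # Example usage: 100 neurons, random rates and sparse-ish weights.
    rng = np.random.default_rng(0)
    rates = rng.uniform(0.5, 10.0, size=100)            # Hz
    W = rng.normal(0.0, 0.1, size=(100, 100))
    W[rng.uniform(size=W.shape) > 0.2] = 0.0            # ~20% connectivity
    print(metabolic_power_atp(rates, W))

Under this decomposition, a post-sleep drop in firing rates lowers the action potential and transmission terms, while weakened synapses lower the maintenance term, which is one way the abstract's reported power reduction could be accounted for.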
FIT: Defying Catastrophic Forgetting in Continual LLM Unlearning
arXiv:2601.21682v1 Announce Type: cross
Abstract: Large language models (LLMs) demonstrate impressive capabilities across diverse tasks but raise concerns about privacy, copyright, and harmful materials. Existing