arXiv:2602.22911v5 Announce Type: replace-cross
Abstract: Low-Rank Adaptation (LoRA) dominates parameter-efficient fine-tuning (PEFT). However, it faces a “linear ceiling”: increasing the rank yields diminishing returns in expressive capacity due to intrinsic linear constraints. We introduce CeRA (Capacity-enhanced Rank Adaptation), a weight-level parallel adapter that injects SiLU gating and dropout to induce non-linear capacity expansion. We demonstrate a fundamental relationship between adapter expressivity and task complexity. On basic arithmetic (GSM8K), CeRA matches standard linear baselines, but on the more complex MATH dataset it achieves markedly higher parameter efficiency on downstream reasoning, measured by exact match. CeRA at rank 64 (pass@1 16.36%) outperforms both a high-rank LoRA at rank 512 (15.72%) and the state-of-the-art linear variant, DoRA, at rank 64 (14.44%), achieving higher exact-match accuracy with only 1/8 of the parameter budget. Empirical spectral analysis shows that CeRA activates the lower-variance tail of the singular value spectrum, preventing the rank collapse observed in linear methods and providing the representation capacity required for complex logical reasoning.
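The abstract does not specify where exactly the gating and dropout sit in CeRA's parallel path. A minimal NumPy sketch, assuming a LoRA-style down/up projection with SiLU applied between the two projections and dropout on the gated activations (the class name `CeRALayer` and all hyperparameter defaults are illustrative, not from the paper):

```python
import numpy as np

def silu(x):
    # SiLU / swish activation: x * sigmoid(x)
    return x / (1.0 + np.exp(-x))

class CeRALayer:
    """Hypothetical CeRA-style adapter: frozen base weight W plus a
    low-rank parallel path with SiLU gating and dropout (assumed form)."""

    def __init__(self, d_in, d_out, rank=64, alpha=64, p_drop=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, 0.02, (d_out, d_in))  # frozen pretrained weight
        self.A = rng.normal(0.0, 0.02, (rank, d_in))   # trainable down-projection
        self.B = np.zeros((d_out, rank))               # trainable up-projection, zero-init
        self.scale = alpha / rank                      # LoRA-style scaling
        self.p_drop = p_drop
        self._rng = rng

    def forward(self, x, train=False):
        h = self.A @ x
        h = silu(h)  # non-linearity on the low-rank path (assumed placement)
        if train and self.p_drop > 0:
            # inverted dropout on the gated activations
            mask = (self._rng.random(h.shape) > self.p_drop) / (1.0 - self.p_drop)
            h = h * mask
        # parallel adapter: base output plus scaled non-linear low-rank update
        return self.W @ x + self.scale * (self.B @ h)
```

With `B` zero-initialized, the adapter path contributes nothing at the start of fine-tuning, so the layer initially reproduces the frozen base mapping; the SiLU between `A` and `B` is what breaks the purely linear `BA` structure the abstract attributes the "linear ceiling" to.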
Measuring and reducing surgical staff stress in a realistic operating room setting using EDA monitoring and smart hearing protection
Background: Stress is a critical factor in the operating room (OR) and affects both the performance and well-being of surgical staff. Measuring and mitigating this stress

