FeNN-DMA: A RISC-V SoC for SNN acceleration

arXiv:2511.00732v1 Announce Type: cross
Abstract: Spiking Neural Networks (SNNs) are a promising, energy-efficient alternative to standard Artificial Neural Networks (ANNs) and are particularly well-suited to...

A Proof of Learning Rate Transfer under $\mu$P

arXiv:2511.01734v1 Announce Type: cross
Abstract: We provide the first proof of learning rate transfer with width in a linear multi-layer perceptron (MLP) parametrized with $\mu$P, a neural network parameterization designed to "maximize" feature learning in the infinite-width limit. We show that under $\mu$P, the optimal learning rate converges to a *non-zero* constant as width goes to infinity, providing a theoretical explanation for learning rate transfer. In contrast, we show that this property fails to hold under alternative parametrizations such as Standard Parametrization (SP) and Neural Tangent Parametrization (NTP). We provide intuitive proofs and support the theoretical findings with extensive empirical results.
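The intuition behind learning rate transfer can be seen in a toy numpy sketch (not from the paper): under the two-layer mean-field form, which coincides with $\mu$P for two layers, a fixed base learning rate produces an output change that stays O(1) as width grows, whereas under SP the same learning rate produces a change that blows up with width, so the optimal SP learning rate must shrink toward zero. The function name, the specific scalings, and the choice of a linear network are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def one_step_output_change(n, eta, param="mup", d=8, seed=0):
    """One SGD step on a 2-layer linear net f(x); return |f_after - f_before|.

    "mup" uses the mean-field form f = (1/n) * v.(W x) with O(1)-scale init
    and an effective step size eta*n (equivalent to muP for two layers);
    "sp" uses f = v.(W x) with v ~ N(0, 1/n), W ~ N(0, 1/d), and step size
    eta. These are assumed textbook scalings, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(d)
    y = 10.0                              # fixed regression target
    if param == "mup":
        W = rng.standard_normal((n, d))   # O(1) entries
        v = rng.standard_normal(n)
        scale, lr = 1.0 / n, eta * n      # 1/n readout, lr scaled up by n
    else:  # standard parametrization
        W = rng.standard_normal((n, d)) / np.sqrt(d)
        v = rng.standard_normal(n) / np.sqrt(n)
        scale, lr = 1.0, eta
    h = W @ x
    f = scale * (v @ h)
    g = f - y                             # dL/df for L = (f - y)^2 / 2
    v_new = v - lr * g * scale * h        # exact gradients of L w.r.t. v, W
    W_new = W - lr * g * scale * np.outer(v, x)
    f_new = scale * (v_new @ (W_new @ x))
    return abs(f_new - f)

if __name__ == "__main__":
    for param in ("mup", "sp"):
        # Under "mup" the change is roughly width-independent;
        # under "sp" it grows roughly linearly with width n.
        changes = [one_step_output_change(n, 0.1, param) for n in (256, 1024, 4096)]
        print(param, [round(c, 1) for c in changes])
```

Because the one-step output change is what bounds the usable learning rate, its width-independence under $\mu$P is the mechanism by which the optimal learning rate can converge to a non-zero constant, as the paper proves for linear MLPs.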


Copyright 2025 dijee Intelligence Ltd. dijee Intelligence Ltd. is a private limited company registered in England and Wales at Media House, Sopers Road, Cuffley, Hertfordshire, EN6 4RY, UK. Registration number 16808844.