arXiv:2512.20668v1 Announce Type: cross
Abstract: Backpropagation remains the de facto algorithm for training neural networks.
With the exponential growth of recent architectures, its computational
cost has become a burden. The recent PEPITA and forward-only frameworks
have proposed promising alternatives, but they fail to scale beyond a
handful of hidden layers, which limits their use.
In this paper, we first analyze theoretically the main limitations of
these approaches. This analysis allows us to design a forward-only
algorithm that is equivalent to backpropagation under linearity and
orthogonality assumptions. By relaxing the linearity assumption, we
then introduce FOTON (Forward-Only Training of Orthogonal Networks),
which bridges the gap with backpropagation. Experimental results show
that it outperforms PEPITA and enables training neural networks of any
depth without a backward pass.
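For intuition only (the abstract states the equivalence but not its derivation, and the feedback choice below is an illustrative assumption): in a purely linear network trained with a PEPITA-style modulated input $x - F e$, picking $F = (W_L \cdots W_1)^\top$ makes the layer-wise activation differences coincide with the backpropagated errors whenever each $W_j$ has orthonormal rows:

$$\Delta h_i \;=\; W_i \cdots W_1\, F\, e \;=\; W_i \cdots W_1\, W_1^\top \cdots W_L^\top e \;=\; W_{i+1}^\top \cdots W_L^\top e \;=\; \delta_i^{\mathrm{BP}},$$

since each inner product $W_j W_j^\top$ telescopes to the identity under orthogonality.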
Moreover, its performance on convolutional networks opens up avenues for application to more
advanced architectures. The code is open-sourced at https://github.com/p0lcAi/FOTON.
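Below is a minimal NumPy sketch of a forward-only, two-forward-pass (PEPITA-style) update with orthogonally initialised weights, in the spirit of the abstract. The layer sizes, learning rate, ReLU activations, and the use of the transposed forward weights as the feedback projection are illustrative assumptions, not taken from the paper or the linked repository.

import numpy as np

rng = np.random.default_rng(0)

def orthogonal(rows, cols):
    # Orthogonal initialisation via QR: rows (or columns) are orthonormal.
    a = rng.standard_normal((max(rows, cols), min(rows, cols)))
    q, _ = np.linalg.qr(a)
    return q.T[:rows, :cols] if rows < cols else q[:rows, :cols]

sizes = [784, 256, 256, 10]                      # assumed toy MLP
Ws = [orthogonal(m, n) for n, m in zip(sizes[:-1], sizes[1:])]

def forward(x, Ws):
    acts = [x]
    for i, W in enumerate(Ws):
        h = W @ acts[-1]
        if i < len(Ws) - 1:
            h = np.maximum(h, 0.0)               # ReLU on hidden layers
        acts.append(h)
    return acts

def forward_only_step(x, y, Ws, lr=1e-3):
    acts = forward(x, Ws)                        # 1st (clean) forward pass
    err = acts[-1] - y                           # output error
    # Feedback projection of the error onto the input through the transposed
    # forward path (illustrative choice; PEPITA itself uses a fixed random matrix).
    F = Ws[0].T
    for W in Ws[1:]:
        F = F @ W.T
    acts_mod = forward(x - F @ err, Ws)          # 2nd (modulated) forward pass
    # Local, layer-wise updates from the difference between the two passes.
    for i, W in enumerate(Ws):
        delta = acts[i + 1] - acts_mod[i + 1]
        Ws[i] = W - lr * np.outer(delta, acts_mod[i])
    return float(0.5 * err @ err)

x = rng.standard_normal(sizes[0])                # random toy input
y = np.eye(sizes[-1])[3]                         # one-hot target
print(forward_only_step(x, y, Ws))

In the fully linear, orthogonal case this update reduces to backpropagation, as sketched above; how FOTON actually handles non-linearities is described in the paper.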
FEM-Bench: A Structured Scientific Reasoning Benchmark for Evaluating Code-Generating LLMs
arXiv:2512.20732v1 Announce Type: cross Abstract: As LLMs advance their reasoning capabilities about the physical world, the absence of rigorous benchmarks for evaluating their ability to


