Watts-per-Intelligence Part II: Algorithmic Catalysis

arXiv:2604.20897v1 Announce Type: cross
Abstract: We develop a thermodynamic theory of algorithmic catalysis within the watts-per-intelligence framework, identifying reusable computational structures that reduce irreversible operations.

arXiv:2604.16710v1 Announce Type: cross
Abstract: Linear-threshold networks (LTNs) capture the mesoscale behavior of interacting populations of neurons and are of particular interest to control theorists due to their dynamical richness and relative ease of analysis. The aim of this paper is to advance the study of global asymptotic stability in LTNs with asymmetric neural interactions and heterogeneous dissipation under the structural Lyapunov diagonal stability (LDS) condition. To this end, we introduce a one-parameter family of LTNs that preserves the LDS condition and has a parameter-independent equilibrium set. In the fast limit, this family converges to a projected dynamical system (PDS), while in the slow limit, it converges to a discontinuous hard-selector system (HSS). Under LDS, we prove that the fast PDS limit is globally exponentially stable and that the HSS limit is globally asymptotically stable. This alignment suggests that the limiting systems capture essential mechanisms governing stability across the entire LTN family. Together with numerical evidence, these findings indicate that resolving stability at the fast and slow endpoints provides a promising and structurally grounded path toward establishing global stability for LTNs with biologically plausible recurrence and diagonal dissipation.
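The abstract does not reproduce the paper's equations, but LTNs are commonly written as x' = -D x + [W x + b]_+, with diagonal D (heterogeneous dissipation) and an asymmetric recurrent matrix W; the LDS condition asks for a diagonal P ≻ 0 with P A + AᵀP ≺ 0 for the active-region Jacobian A = -D + W. A minimal sketch under those standard assumptions (the specific one-parameter family, PDS, and HSS limits from the paper are not implemented here):

```python
# Hedged sketch: forward-Euler simulation of a standard linear-threshold
# network x' = -D x + max(0, W x + b), plus a check of Lyapunov diagonal
# stability (LDS) for the active-region Jacobian A = -D + W. The paper's
# one-parameter LTN family and its fast/slow limits are NOT reproduced here.
import numpy as np

def simulate_ltn(W, D, b, x0, dt=0.01, steps=5000):
    """Integrate x' = -D x + [W x + b]_+ by forward Euler."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-D @ x + np.maximum(0.0, W @ x + b))
    return x

# Example: a 2-neuron LTN with asymmetric W and heterogeneous dissipation.
W = np.array([[0.2, -0.5],
              [0.4,  0.1]])
D = np.diag([1.0, 1.5])      # diagonal dissipation rates
b = np.array([1.0, 0.5])

# LDS check in the fully active region: with A = -D + W, find a diagonal
# P > 0 making P A + A^T P negative definite. Here the identity works.
A = -D + W
P = np.eye(2)
M = P @ A + A.T @ P
assert np.max(np.linalg.eigvalsh(M)) < 0  # LDS holds for this example

# Trajectories from different initial conditions settle to an equilibrium,
# where the drift -D x + [W x + b]_+ vanishes.
x_star = simulate_ltn(W, D, b, x0=[3.0, -2.0])
residual = -D @ x_star + np.maximum(0.0, W @ x_star + b)
print(np.linalg.norm(residual))
```

This only illustrates the asymptotic-stability behavior the abstract discusses on a toy instance; establishing it globally across the whole family is exactly what the paper's fast (PDS) and slow (HSS) limit analysis targets.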


Copyright 2025 dijee Intelligence Ltd. dijee Intelligence Ltd. is a private limited company registered in England and Wales (registration number 16808844) at Media House, Sopers Road, Cuffley, Hertfordshire, EN6 4RY, UK.