
Saturday, June 14, 2025

# aibot: noise balance and stationary distribution of stochastic gradient descent.


<< The stochastic gradient descent (SGD) algorithm is the algorithm used to train neural networks. However, it remains poorly understood how SGD navigates the highly nonlinear and degenerate loss landscape of a neural network. >>

<< In this work, (AA) show that the minibatch noise of SGD regularizes the solution towards a noise-balanced solution whenever the loss function contains a rescaling parameter symmetry. Because the difference between a simple diffusion process and SGD dynamics is most significant when symmetries are present, (their) theory implies that the loss function symmetries constitute an essential probe of how SGD works. (They) then apply this result to derive the stationary distribution of stochastic gradient flow for a diagonal linear network with arbitrary depth and width. >>
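
A minimal numpy sketch (not from the paper) of the rescaling symmetry in its simplest instance: for a depth-2 diagonal linear model f(x) = u·v·x, the loss is unchanged under (u, v) → (λu, v/λ), gradient flow conserves u² − v², and the minibatch noise is what pushes SGD toward the balanced solution |u| ≈ |v|. The data, learning rate, and batch size below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (arbitrary choices throughout).
n, w_true = 1000, 1.5
x = rng.normal(size=n)
y = w_true * x + 0.1 * rng.normal(size=n)

# Depth-2 diagonal linear model f(x) = u * v * x.
# The loss L = mean((u*v*x - y)^2) / 2 is invariant under the
# rescaling symmetry (u, v) -> (lam * u, v / lam).
u, v = 2.0, 0.1              # deliberately unbalanced start
lr, batch = 0.05, 8

for step in range(20000):
    idx = rng.integers(0, n, size=batch)
    err = u * v * x[idx] - y[idx]     # minibatch residuals
    gu = np.mean(err * v * x[idx])    # dL/du on the minibatch
    gv = np.mean(err * u * x[idx])    # dL/dv on the minibatch
    u, v = u - lr * gu, v - lr * gv

# Gradient flow conserves u^2 - v^2 exactly; minibatch noise
# instead drives the two layers toward balance, |u| ~ |v|.
print(f"u*v = {u*v:+.3f}   u^2 - v^2 = {u*u - v*v:+.3f}")
```

Since full-batch gradient flow conserves u² − v² exactly, any systematic shrinkage of that quantity in the run above is attributable to the minibatch noise, which is the balance effect the abstract describes.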

<< The stationary distribution exhibits complicated nonlinear phenomena such as phase transitions, broken ergodicity, and fluctuation inversion. These phenomena are shown to exist uniquely in deep networks, implying a fundamental difference between deep and shallow models. >>

Liu Ziyin, Hongchao Li, Masahito Ueda. Noise balance and stationary distribution of stochastic gradient descent. Phys. Rev. E 111, 065303. Jun 6, 2025.

Also: ai (artificial intell) (bot), network, noise, disorder & fluctuations, in https://www.inkgmr.net/kwrds.html 

Keywords: ai, artificial intelligence, noise, stochasticity, networks, neural networks, deep learning, stochastic gradient descent (SGD), transitions, phase transitions, broken ergodicity, fluctuation inversion

Thursday, June 12, 2025

# gst: unstable fixed points in chaotic networks

<< Understanding the high-dimensional chaotic dynamics occurring in complex biological systems such as recurrent neural networks or ecosystems remains a conceptual challenge. For low-dimensional dynamics, fixed points provide the geometric scaffold of the dynamics. However, in high-dimensional systems, even the location of fixed points is unknown. >>

Here, AA << analytically determine the number and distribution of fixed points for a canonical model of a recurrent neural network that exhibits high-dimensional chaos. This distribution reveals that fixed points and dynamics are confined to separate shells in state space. Furthermore, the distribution enables (AA) to determine the eigenvalue spectra of the Jacobian at the fixed points, showing that each fixed point has a low-dimensional unstable manifold. >>
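
As a hedged illustration of this kind of computation (assuming the canonical model is the standard rate network ẋ = −x + J tanh(x) with i.i.d. Gaussian couplings of standard deviation g/√N, chaotic for g > 1; N, g, and the initial guess below are arbitrary), one can Newton-search for a fixed point and count unstable directions from the Jacobian spectrum:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed canonical model: dx/dt = -x + J @ tanh(x), couplings J_ij
# i.i.d. Gaussian with std g/sqrt(N); the dynamics are chaotic for g > 1.
N, g = 200, 2.0
J = rng.normal(scale=g / np.sqrt(N), size=(N, N))

def F(x):
    """Velocity field; its zeros are the fixed points."""
    return -x + J @ np.tanh(x)

def jac(x):
    """Jacobian dF/dx = -I + J @ diag(1 - tanh(x)^2)."""
    return -np.eye(N) + J * (1.0 - np.tanh(x) ** 2)

# Newton search from a random initial guess (may need restarts).
x = rng.normal(scale=0.5, size=N)
for _ in range(100):
    x += np.linalg.solve(jac(x), -F(x))
    if np.linalg.norm(F(x)) < 1e-10:
        break

eigs = np.linalg.eigvals(jac(x))
print(f"residual |F(x)| = {np.linalg.norm(F(x)):.1e}; "
      f"unstable directions: {(eigs.real > 0).sum()} of {N}")
```

Newton iterations from a random start are not guaranteed to converge, so in practice one restarts from many initial conditions; at a converged fixed point the count of eigenvalues with positive real part is expected to be small compared with N, consistent with the low-dimensional unstable manifold described above.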

<< Despite the radial separation of fixed points and dynamics, (they) find that the principal components of fixed points and dynamics align and that nearby fixed points act as partially attracting landmarks for the dynamics. >>

AA's results << provide a detailed characterization of the fixed point geometry and its interplay with the dynamics, thereby paving the way towards a geometric understanding of high-dimensional chaos through their skeleton of unstable fixed points. >>

Jakob Stubenrauch, Christian Keup, et al. Fixed point geometry in chaotic neural networks. Phys. Rev. Research 7, 023203. May 29, 2025.

Also: chaos, disorder & fluctuations, instability, transition, network, brain, in https://www.inkgmr.net/kwrds.html 

Keywords: gst, chaos, networks, neural networks, ecosystems, fixed points, unstable fixed points.