
Wednesday, January 28, 2026

# gst: from chimera states to spike avalanches and quasicriticality; the role of superdiffusive coupling.

<< The partial synchronization states of collective activity, as well as the realization of spike avalanches in systems of interacting neurons, are extremely important distinguishing features of neocortical circuits that have multiple empirical validations. However, at this stage, there is a limited number of studies highlighting their potential interrelationship at the level of nonlinear mathematical models. >>

<< In this study, (AA) investigate the development of chimera states and the emergence of spike avalanches in superdiffusive neural networks, as well as analyze the system's approach to quasicriticality. >>
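To make the ingredients concrete, the sketch below (not the authors' model) simulates a ring of Kuramoto-Sakaguchi phase oscillators with a power-law coupling kernel as a generic stand-in for superdiffusive (long-range) coupling; the network size, kernel exponent, phase lag, integration settings, and the seeding of the initial phases are all illustrative assumptions. For suitable parameters such rings develop chimera states, visible as coexisting high (coherent) and low (incoherent) values of a local order parameter along the ring.

```python
import numpy as np

# Minimal sketch: a ring of Kuramoto-Sakaguchi phase oscillators with a
# power-law (long-range) coupling kernel, used here as a generic stand-in
# for superdiffusive coupling. This is NOT the authors' model; all
# parameters and the initial condition are illustrative choices only.

N     = 256            # oscillators on the ring
kexp  = 1.4            # kernel exponent: smaller = longer-range coupling
lag   = 1.45           # phase lag close to pi/2 (classic chimera regime)
K     = 1.0            # coupling strength
dt, T = 0.05, 4000     # time step and number of integration steps

idx  = np.arange(N)
dist = np.abs(idx[:, None] - idx[None, :])
dist = np.minimum(dist, N - dist)                        # ring distance
G    = np.where(dist > 0, 1.0 / np.maximum(dist, 1) ** kexp, 0.0)
G   /= G.sum(axis=1, keepdims=True)                      # row-normalized kernel

# Spatially modulated random phases help seed a chimera (heuristic).
rng   = np.random.default_rng(0)
x     = 2 * np.pi * (idx / N - 0.5)
theta = 6 * (rng.random(N) - 0.5) * np.exp(-0.76 * x ** 2)

for _ in range(T):
    diff  = theta[None, :] - theta[:, None] - lag        # theta_j - theta_i - lag
    theta = theta + dt * K * np.sum(G * np.sin(diff), axis=1)

# Local order parameter R_i over a window of neighbours: coexisting high
# (coherent) and low (incoherent) values along the ring signal a chimera.
w = 12
R = np.array([np.abs(np.mean(np.exp(1j * theta[(i + np.arange(-w, w + 1)) % N])))
              for i in range(N)])
print(f"local order parameter along the ring: min {R.min():.2f}, max {R.max():.2f}")
```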

<< The analysis of the available ideas suggests that partial synchronization states, spike avalanches, and quasicritical neuronal dynamics are all directly implicated in core cognitive functions such as information processing, attention, and memory. Given this fundamental role, the results presented in this (AA) work could have significant implications for both theoretical neuroscience and applied machine learning, particularly in the development of reservoir computing systems. >>

I. Fateev, A. Polezhaev. From chimera states to spike avalanches and quasicriticality: The role of superdiffusive coupling. Phys. Rev. E 113, 014215. Jan 20, 2026.

Also: network, brain, neuro, behav, chimera, random, walk, walking, in https://www.inkgmr.net/kwrds.html 

Keywords: gst, networks, neuronal network models, chimera, random, walk, walking, avalanches, neuronal avalanches, collective behaviors, criticality.

Tuesday, September 9, 2025

# brain: self-organized learning emerges from coherent coupling of critical neurons.

<< Deep artificial neural networks have surpassed human-level performance across a diverse array of complex learning tasks, establishing themselves as indispensable tools in both social applications and scientific research. >>

<< Despite these advances, the underlying mechanisms of training in artificial neural networks remain elusive. >>

<< Here, (AA) propose that artificial neural networks function as adaptive, self-organizing information processing systems in which training is mediated by the coherent coupling of strongly activated, task-specific critical neurons. >>

<< (AA) demonstrate that such neuronal coupling gives rise to Hebbian-like neural correlation graphs, which undergo a dynamic, second-order connectivity phase transition during the initial stages of training. Concurrently, the connection weights among critical neurons are consistently reinforced while being simultaneously redistributed in a stochastic manner. >>
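One rough way to picture such a correlation graph, sketched below under assumptions that are not taken from the paper (a tiny tanh network on a toy dataset, hidden units linked whenever the Pearson correlation of their activations exceeds an arbitrary threshold), is to track the size of the graph's largest connected component over training; a rapid growth of that giant component would be a qualitative analogue of the connectivity transition described here.

```python
import numpy as np

# Rough sketch (assumed construction, not the authors' procedure): train a
# tiny one-hidden-layer network on a toy task, link hidden units whose
# activations are strongly correlated over the dataset, and track the size
# of the largest connected component of that "neural correlation graph".

rng = np.random.default_rng(0)

# Toy data: two noisy concentric rings in 2D (binary classification).
n = 400
radius = np.r_[rng.normal(1.0, 0.1, n // 2), rng.normal(2.0, 0.1, n // 2)]
angle  = 2 * np.pi * rng.random(n)
X = np.c_[radius * np.cos(angle), radius * np.sin(angle)]
y = np.r_[np.zeros(n // 2), np.ones(n // 2)]

H = 64                                           # hidden units
W1 = rng.normal(0, 1.0, (2, H));              b1 = np.zeros(H)
W2 = rng.normal(0, 1.0 / np.sqrt(H), (H, 1)); b2 = np.zeros(1)
lr, thr = 0.5, 0.6                               # learning rate, correlation threshold

def giant_component(adj):
    """Size of the largest connected component of a boolean adjacency matrix."""
    m = adj.shape[0]; seen = np.zeros(m, bool); best = 0
    for s in range(m):
        if seen[s]:
            continue
        stack, size = [s], 0
        seen[s] = True
        while stack:
            v = stack.pop(); size += 1
            for u in np.where(adj[v] & ~seen)[0]:
                seen[u] = True; stack.append(u)
        best = max(best, size)
    return best

for epoch in range(401):
    A1 = np.tanh(X @ W1 + b1)                    # hidden activations, shape (n, H)
    z2 = (A1 @ W2 + b2).ravel()
    p  = 1.0 / (1.0 + np.exp(-z2))               # sigmoid output probabilities
    d2 = (p - y)[:, None] / n                    # cross-entropy gradient at output
    gW2, gb2 = A1.T @ d2, d2.sum(0)
    d1 = (d2 @ W2.T) * (1.0 - A1 ** 2)
    gW1, gb1 = X.T @ d1, d1.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

    if epoch % 100 == 0:
        # Correlation graph: edge between hidden units i, j when the Pearson
        # correlation of their activations exceeds the threshold.
        C    = np.corrcoef(A1.T)
        adj  = (np.abs(C) > thr) & ~np.eye(H, dtype=bool)
        loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
        print(f"epoch {epoch:3d}  loss {loss:.3f}  giant component {giant_component(adj)}/{H}")
```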

<< As a result, a precise balance of neuronal contributions is established, inducing a local concentration within the random loss landscape which provides a theoretical explanation for generalization capacity. >>

<< (AA) further identify a later convergence phase transition, characterized by a phase boundary in hyperparameter space and driven by the nonequilibrium probability flux through weight space. The critical computational graphs resulting from coherent coupling also decode the predictive rules learned by artificial neural networks, drawing analogies to avalanche-like dynamics observed in biological neural circuits. >>

<< (AA) findings suggest that the coherent coupling of critical neurons and the ensuing local concentration within the loss landscapes may represent universal learning mechanisms shared by both artificial and biological neural computation. >>

Chuanbo Liu, Jin Wang. Self-organized learning emerges from coherent coupling of critical neurons. arXiv: 2509.00107v1 [cond-mat.dis-nn]. Aug 28, 2025.

Also: brain, neuro, network, random, transition, ai (artificial intell) (bot), in https://www.inkgmr.net/kwrds.html 

Keywords: gst, brain, neurons, networks, randomness, transitions, ai (artificial intell) (bot), learning mechanisms, self-organized learning, artificial neural networks, deep learning, neuronal coupling, criticality, stochasticity, avalanche-like dynamics.

Wednesday, October 30, 2024

# life: ghostly psyche revisited; ghosts, zombies, gris-gris, and so on ...

<< Most people imagine philosophers as rational thinkers who spend their time developing abstract logical theories and strongly reject superstitious beliefs. But several 20th-century philosophers actively investigated spooky topics such as clairvoyance, telepathy – even ghosts.
Many of these philosophers, including Henri Bergson and William James, were interested in what was called “psychical research”. This was the academic study of paranormal phenomena including telepathy, telekinesis and other-worldly spirits. These thinkers attended seances and were attempting to develop theories about ghosts, life after death and the powers exhibited by mediums in trances. >>

AA << recent archival research has been looking at how these topics shaped 20th-century philosophy. >>

Matyas Moravec. Many important 20th-century philosophers investigated ghosts – here’s how they explained them. Oct 24, 2024. 

Also: ethno, gris-gris, neuro, zombie, perception, psychedelic, delirium, in https://www.inkgmr.net/kwrds.html 

Keywords: ethno, gris-gris, neuro, zombie, perception, psychedelic, delirium 

FonT: even G.A. Romero's zombie filmography - which, like other "spooky" themes, I have always marginalized - still seems relevant.


Tuesday, July 9, 2024

# gst: discontinuous transition to chaos in a canonical random neural network.


AA << study a paradigmatic random recurrent neural network introduced by Sompolinsky, Crisanti, and Sommers (SCS). In the infinite size limit, this system exhibits a direct transition from a homogeneous rest state to chaotic behavior, with the Lyapunov exponent gradually increasing from zero. (AA) generalize the SCS model considering odd saturating nonlinear transfer functions, beyond the usual choice φ(x) = tanh(x). A discontinuous transition to chaos occurs whenever the slope of φ at 0 is a local minimum [i.e., for φ'''(0) > 0]. Chaos appears out of the blue, by an attractor-repeller fold. Accordingly, the Lyapunov exponent stays away from zero at the birth of chaos. >>
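As a minimal numerical companion to this description, the sketch below integrates the SCS dynamics dx_i/dt = -x_i + Σ_j J_ij φ(x_j), with Gaussian couplings of variance g²/N, and estimates the largest Lyapunov exponent by the standard tangent-vector (Benettin) method. It compares φ = tanh with φ(x) = tanh(x + x³), an illustrative odd saturating function whose slope at 0 is a local minimum (φ'''(0) > 0); this particular function, the network size, and the integration settings are assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch: estimate the largest Lyapunov exponent of the SCS network
#     dx_i/dt = -x_i + sum_j J_ij * phi(x_j),   J_ij ~ N(0, g^2 / N),
# with the standard tangent-vector (Benettin) method. phi(x) = tanh(x + x^3)
# is only an illustrative odd saturating function with phi'''(0) > 0; it is
# not necessarily the family studied in the paper.

rng = np.random.default_rng(1)

def lyapunov_scs(g, phi, dphi, N=200, dt=0.05, t_trans=50.0, t_meas=200.0):
    """Coarse Euler estimate of the largest Lyapunov exponent."""
    J = rng.normal(0.0, g / np.sqrt(N), (N, N))
    x = rng.normal(0.0, 1.0, N)
    v = rng.normal(0.0, 1.0, N); v /= np.linalg.norm(v)
    n_trans, n_meas = int(t_trans / dt), int(t_meas / dt)
    log_sum = 0.0
    for step in range(n_trans + n_meas):
        x_dot = -x + J @ phi(x)                  # state dynamics
        v_dot = -v + J @ (dphi(x) * v)           # tangent (linearized) dynamics
        x = x + dt * x_dot
        v = v + dt * v_dot
        nv = np.linalg.norm(v)
        v /= nv                                  # renormalize every step
        if step >= n_trans:
            log_sum += np.log(nv)
    return log_sum / (n_meas * dt)

phi_t, dphi_t = np.tanh, lambda x: 1.0 - np.tanh(x) ** 2
phi_c  = lambda x: np.tanh(x + x ** 3)
dphi_c = lambda x: (1.0 + 3.0 * x ** 2) * (1.0 - np.tanh(x + x ** 3) ** 2)

for g in (0.9, 1.0, 1.1, 1.3):
    lam_t = lyapunov_scs(g, phi_t, dphi_t)       # tanh: continuous route to chaos
    lam_c = lyapunov_scs(g, phi_c, dphi_c)       # phi'''(0) > 0 case
    print(f"g = {g:.1f}   lambda_max[tanh] = {lam_t:+.3f}   lambda_max[tanh(x+x^3)] = {lam_c:+.3f}")
```

With this coarse Euler integration the numbers are only indicative; shrinking dt and lengthening the measurement window would sharpen the estimates.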

In Figure 7, << the pink square is located at the doubly degenerate point (g, ε) = (1, 1/3). >>

Diego Pazó. Discontinuous transition to chaos in a canonical random neural network. Phys. Rev. E 110, 014201. July 1, 2024.

Also: chaos, random, network, transition, neuro, in https://www.inkgmr.net/kwrds.html 

Keywords: gst, chaos, random, network, transition, neuro