Highlights
• During the normal waking state, the brain is in a constant state of internal exploration through the formation and dissolution of resting-state functional networks.
• In large-scale computer models of the brain, the best fit to observed data is obtained when the networks sit at the 'edge of instability'.
• Such a position is a distinct advantage for the efficiency and speed of network mobilization for perception and action.
• We pose theoretical and empirical questions to better link resting-state networks to cognitive architectures.

Resting-state networks (RSNs), which have become a main focus in neuroimaging research, are best simulated by large-scale cortical models in which networks teeter on the edge of instability. In this state, the functional networks occupy a stable low-firing state while being continuously pulled toward multiple other configurations. Small extrinsic perturbations can shape task-related network dynamics, whereas perturbations from intrinsic noise generate excursions that reflect the range of available functional networks. This is particularly advantageous for the efficiency and speed of network mobilization. The resting state thus reflects the dynamical capabilities of the brain, emphasizing the vital interplay of time and space. In this article, we propose a new theoretical framework for RSNs that can serve as fertile ground for empirical testing.

Glossary
Cognitive subtraction paradigm: an experimental design in which the subject is asked to execute a perceptual, motor, or cognitive task relative to a low-level baseline task (e.g., visual fixation or 'rest'). The brain activation measured during task execution is considered to be the neural correlate of that specific function.
Attractor network models: brain dynamics can be modeled by attractor networks, comprising networks of neurons that form a dynamical system which, in general, tends to settle into stationary states, fixed points called 'attractors', typically characterized by a stable pattern of firing activity. External perturbations, or even intrinsic noise arising from finite-size effects, can destabilize an attractor and thereby induce transitions between different stable attractors. The dynamics of the network can be detailed by coupling the dynamical equations describing each neuron with the synaptic variables associated with their mutual coupling.
Bifurcation analysis: one of the basic tools for analyzing dynamical systems. A bifurcation is a qualitative change in the asymptotic behavior of the system (its attractors) under parameter variation.
Chaos: behavior of a dynamical system that is so sensitive to initial conditions that extremely small differences in starting state yield widely diverging outcomes; the evolution of the system is therefore effectively unpredictable, even though it is purely deterministic.
Criticality: at the brink of a bifurcation, the system displays certain characteristic dynamic features, most of which are related to enhanced fluctuations.
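The attractor-and-noise picture in the glossary can be sketched numerically with a minimal one-dimensional system that has two stable fixed points; intrinsic noise occasionally destabilizes the currently occupied attractor and drives a transition to the other. The potential, noise level, and integration settings below are illustrative assumptions, not values taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Double-well potential V(x) = x**4/4 - x**2/2 has two stable fixed
# points (attractors) at x = -1 and x = +1, separated by an unstable
# fixed point at x = 0. Noise can kick the state across the barrier,
# producing transitions between attractors.
def drift(x):
    return x - x**3          # -dV/dx

dt, n_steps, sigma = 0.01, 200_000, 0.5   # illustrative settings
x = np.empty(n_steps)
x[0] = -1.0                  # start in the left attractor
for t in range(1, n_steps):
    x[t] = (x[t-1] + drift(x[t-1]) * dt
            + sigma * np.sqrt(dt) * rng.standard_normal())

# Count noise-induced transitions between the two basins,
# ignoring samples near the barrier to avoid double-counting jitter
signs = np.sign(x[np.abs(x) > 0.5])
transitions = int(np.sum(signs[1:] != signs[:-1]))
print(f"noise-induced attractor transitions: {transitions}")
```

The state spends long stretches near one attractor and makes occasional noise-driven excursions to the other, a toy analogue of the destabilization-and-transition behavior the glossary entry describes.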
Diffusion spectrum imaging (DSI): an MRI technique similar to DTI, but with the added capability of resolving multiple directions of diffusion in each voxel of white matter. This enables multiple groups of fibers at each location, including intersecting fiber pathways, to be mapped.
Diffusion tensor imaging (DTI): an MRI technique that takes advantage of the restricted diffusion of water through myelinated nerve fibers in the brain to infer the anatomical connectivity between brain areas.
Functional connectivity: the statistical relation between activity in two or more neural sources. This usually refers to the temporal correlation between sources, but has been extended to include correlations across trials or across experimental subjects. Functional connectivity methods include estimation of correlation coefficients and coherence. These estimates cannot be used to infer the direction of the relation between sources.
Mean-field approximation: involves replacing the temporally averaged discharge rate of a cell with an equivalent momentary activity of a neural population (ensemble average), which corresponds to an assumption of ergodicity. Under this approximation, each cell assembly is characterized by its population activity rate.
Neuronal avalanches: bursts of elevated population activity, correlated in space and time, that are distinguished by a particular statistical signature: activity clusters of size s occur with probability P(s) ∝ s^(−α), that is, a power law with exponent α ≈ 1.5.
Noise: in neurodynamical systems, noise is produced mainly by the probabilistic spiking times of neurons and usually plays an important and advantageous role in brain function. Spiking noise is a significant factor in a network with a finite (i.e., limited) number of neurons, and can be described as introducing statistical fluctuations into the finite-size system.
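The functional connectivity entry above amounts to computing temporal correlations between regional signals. A minimal sketch with three synthetic "regional" time series (all names and parameters here are illustrative assumptions): two regions share a common driving signal and so show high functional connectivity, while a third is independent. Note that the resulting matrix is symmetric, matching the glossary's point that such estimates carry no directional information.

```python
import numpy as np

rng = np.random.default_rng(1)

# Three toy regional time series: A and B share a common drive,
# C is independent noise (illustrative, not data from the article)
n_t = 5000
common = rng.standard_normal(n_t)
a = common + 0.5 * rng.standard_normal(n_t)
b = common + 0.5 * rng.standard_normal(n_t)
c = rng.standard_normal(n_t)

# Functional connectivity as the matrix of temporal correlations
fc = np.corrcoef(np.vstack([a, b, c]))   # 3x3, symmetric
print(np.round(fc, 2))
```

Here fc[0, 1] is close to 0.8 (the shared drive), fc[0, 2] and fc[1, 2] hover near zero, and fc equals its transpose, so nothing can be said about which region drives which.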
Spiking network models: a network of spiking neurons constitutes a high-dimensional dynamical system in which individual neurons (usually expressed by integrate-and-fire or Hodgkin–Huxley models) interact with each other through different types of dynamical synapse (e.g., AMPA, NMDA, or GABA).
Wilson–Cowan model: a mean-field-like rate model expressing the coupling between two populations, one excitatory and one inhibitory. In general, the Wilson–Cowan model is tuned such that the population rates of the pools oscillate. It is one of the simplest neural oscillators.

Corrigendum (Trends in Neurosciences, January 6, 2018) to: Resting Brains Never Rest: Computational Insights into Potential Cognitive Architectures, Deco et al., Trends in Neurosciences, 268–274, 2013. Due to an oversight in the preparation of this Opinion article, the following acknowledgment was inadvertently omitted: 'Gustavo Deco was funded by the European Research Council (Advanced Grant DYSTRUCTURE No. 295129).'
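The Wilson–Cowan glossary entry describes an excitatory-inhibitory rate pair tuned to oscillate. A minimal sketch of such an E-I oscillator follows; the tanh gain, coupling weights, and time constants are illustrative choices (not parameters from the article), and the rates are deviations from a baseline, so they may go negative. With these values the only fixed point (the origin) is an unstable spiral while the saturating gain keeps trajectories bounded, so the pair settles onto a limit cycle.

```python
import numpy as np

# Wilson-Cowan-style E-I oscillator (illustrative parameters):
# slow inhibition chasing fast excitation produces a limit cycle.
w_ee, w_ei, w_ie = 2.0, 2.0, 2.0   # E->E, I->E, E->I couplings (assumed)
tau_i = 2.0                        # inhibition slower than excitation
dt, n_steps = 0.01, 50_000

E = np.empty(n_steps)
I = np.empty(n_steps)
E[0], I[0] = 0.1, 0.0
for t in range(1, n_steps):
    dE = -E[t-1] + np.tanh(w_ee * E[t-1] - w_ei * I[t-1])
    dI = (-I[t-1] + np.tanh(w_ie * E[t-1])) / tau_i
    E[t] = E[t-1] + dt * dE
    I[t] = I[t-1] + dt * dI

# In the oscillatory regime the late-time E trace keeps crossing zero
late = E[n_steps // 2:]
crossings = int(np.sum(late[:-1] * late[1:] < 0))
print(f"late-time std: {late.std():.3f}, zero crossings: {crossings}")
```

Slowing the inhibitory population relative to the excitatory one (tau_i > 1) is what lets activity overshoot before inhibition catches up, which is the basic mechanism behind E-I oscillators of this family.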