Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
Research article

The broad edge of synchronization: Griffiths effects and collective phenomena in brain networks

Victor Buendía
Max Planck Institute for Biological Cybernetics, Tübingen, Germany
Department of Computer Science, University of Tübingen, Tübingen, Germany

Pablo Villegas
IMT Institute for Advanced Studies, Piazza San Ponziano 6, 55100 Lucca, Italy

Raffaella Burioni
Dipartimento di Matematica, Fisica e Informatica, Università di Parma, via G.P. Usberti 7/A, 43124 Parma, Italy
INFN, Gruppo Collegato di Parma, via G.P. Usberti 7/A, 43124 Parma, Italy

Miguel A. Muñoz
Departamento de Electromagnetismo y Física de la Materia e Instituto Carlos I de Física Teórica y Computacional, Universidad de Granada, E-18071 Granada, Spain
[email protected]

    Abstract

    Many of the amazing functional capabilities of the brain are collective properties stemming from the interactions of large sets of individual neurons. In particular, the most salient collective phenomena in brain activity are oscillations, which require the synchronous activation of many neurons. Here, we analyse parsimonious dynamical models of neural synchronization running on top of synthetic networks that capture essential aspects of the actual brain anatomical connectivity such as a hierarchical-modular and core-periphery structure. These models reveal the emergence of complex collective states with intermediate and flexible levels of synchronization, halfway in the synchronous–asynchronous spectrum. These states are best described as broad Griffiths-like phases, i.e. an extension of standard critical points that emerge in structurally heterogeneous systems. We analyse different routes (bifurcations) to synchronization and stress the relevance of ‘hybrid-type transitions’ to generate rich dynamical patterns. Overall, our results illustrate the complex interplay between structure and dynamics, underlining key aspects leading to rich collective states needed to sustain brain functionality.

    This article is part of the theme issue ‘Emergent phenomena in complex physical and socio-technical systems: from cells to societies’.

    1. Collective effects in spontaneous brain activity

    The human brain is the most complex object we are aware of. It is composed of as many as 10^11 neurons and 10^15 synapses forming an extraordinarily intricate network [1–3]. Electrochemical signals travelling through such a network permit neurons to communicate with each other, allowing them to coordinate their behaviour and generate a vast diversity of possible emergent or collective behaviours, which are believed to be at the basis of information transmission, processing and storage [4]. Even if individual neurons such as the ‘Jennifer Aniston’ one—responding with high specificity to images of the famous actress—do exist [5], these are more the exception than the rule: most of the remarkable abilities of the brain are delocalized, i.e. they emerge out of collective phenomena stemming from the interactions of large sets of neurons [6–9].

    Probably, the most salient collective phenomena in brain activity are oscillations, as already detected in pioneering electroencephalogram recordings almost one century ago [10]. Neural oscillations at a given (mesoscopic) brain region require the synchronous firing or activation of many individual neurons within it, allowing signals to propagate robustly to, e.g. distant network areas [10,11]. Indeed, neural synchronization has been shown to be crucial in memory, attention, vision and high-level cognitive functions, and abnormalities in the level of synchrony have been associated with pathologies such as Parkinson's disease (excess of synchrony) and autism (deficit) [12,13]. In what follows, we briefly review four relevant concepts useful to shed light onto whole-brain connectivity and dynamics.

    (A) Resting-state functional networks. Brain networks are never silent; even in resting conditions their ongoing activity consumes as much as 20% of our total energy budget, suggesting crucial functional roles [14]. Seminal work developed in the last two decades guided by novel neuroimaging technologies elucidated that spontaneous activity in resting conditions at different locations (as measured, e.g. by functional magnetic resonance imaging (fMRI) techniques) does not occur in a random uncorrelated way. Even if the resulting activity time series at diverse mesoscopic regions are non-stationary (see below), most of the first studies focused on the analysis of long-time averaged pairwise correlations. In this way, regions of correlated and anticorrelated activity were robustly identified, defining the so-called ‘resting-state networks’ (RSN) [15–20], which are believed to represent the basic functional architecture of the brain [2,3,21,22]. These networks of functional correlations show a rich multi-scale organization with moduli of correlated activity appearing within other moduli at larger scales in a nested hierarchical-modular way [23–26].

    The focus of research in resting-state functional connectivity has recently shifted from static (i.e. long-time-averaged correlations) to dynamical aspects (in the sense that correlations are measured independently in a series of concatenated time windows), thus allowing for a characterization of the temporal organization of functional connectivity or chronnectomics [27–30]. The emerging picture is that the global patterns of spatial correlations shift in time between a discrete set of basic functional (metastable) states [31,32], thus creating a rich dynamical repertoire which is believed to constitute a template to sustain diverse functions.

    (B) Structural networks. Resting-state functional correlations are by now well understood to be strongly constrained by the underlying network of structural (or anatomical) connectivity (as determined from diffusion tensor imaging (DTI) measurements combined with algorithms for the identification of white-matter fibre tracts connecting the units of a given mesoscopic parcellation [3]). Important architectural features of the resulting structural networks (or structural connectomes) at the mesoscale are that: (i) they are organized in a hierarchical-modular way [2,3,33–35] and, (ii) they have a core-periphery organization with connector hubs [36–38]. Actually, areas with a strong mutual structural connectivity also show a high level of long-time functional connectivity [39,40]. Reciprocally, functionally connected regions are likely to be directly wired together [25,41], revealing that structural networks constitute a scaffold to support and constrain the emergence of functional correlations. However, there is not a one-to-one equivalence between structure and function and, as mentioned above, the focus of recent research is on time-dependent aspects of functional connectivity.

    (C) Segregation-and-integration balance. An influential idea, proposed by Tononi and coworkers, is that high-level cognitive tasks performed by the brain require segregated processing for each type of (e.g. sensory) input, which then needs to be integrated to allow for a unified representation, advanced cognitive processing, and response [42]. Thus, it was conjectured that an optimal balance between segregation and integration is crucial for optimal brain function and that deviations from such a balance are associated with neurological disorders [42]. Furthermore, such an optimal and flexible balance requires high diversity and variability of underlying synchronization or correlation patterns of neural activity to be sustained [43,44]. Thus, it is important to shed further light on what are the (i) structural and (ii) dynamical aspects favouring an optimal segregation-integration balance, i.e. flexible and variable levels of synchronization.

    (i) From the structural side, it is well established that a hierarchical-modular network organization is particularly well suited to accommodate diverse levels of segregated/integrated activity across many scales: local moduli support segregation but the transient correlation or synchronization of some of them allows for integration of information at progressively larger scales in a hierarchical way [45–48]. Similarly, the presence of central connector hubs and the resulting core-periphery structure has been reported to be crucial for the emergence of a central integrative core controlling the segregation-integration balance [49,50].

    (ii) From the dynamical side, ambitious computational models for the ‘whole-brain dynamics’ have been constructed in recent years [51,52]. These models aim to describe the overall patterns of brain activity in different states (resting, awake, anaesthetized, etc.) in a parsimonious way. They consist of an empirically-determined structural connectome network, where each single node is described by a dynamical unit (e.g. a neural-field model such as the Wilson–Cowan one [53,54] or a simpler effective model of neural synchronization such as a Stuart–Landau oscillator [51,55]) assumed to capture key aspects of neural activity at a brain mesoscopic region, and the different units are linked together following the network architecture [55–58]. These analyses revealed that the best agreement between model-generated correlations and empirically-measured ones is obtained if the dynamics at individual mesoscopic nodes operates close to a (Hopf) bifurcation point [51,55]. This means that high levels of overall dynamical variability, as those reported in the actual resting-state functional networks, are best reproduced if each mesoscopic unit is at the edge of becoming oscillatory.

    (D) Criticality and Griffiths phases. This last result resembles a much-discussed and suggestive conjecture proposing that the large variability of brain activity at vastly different scales might stem from an underlying dynamical process operating at a critical point, i.e. at the edge between order and disorder, with its concomitant spatio-temporal scale-invariance [8,52,59–61]. The so-called ‘criticality hypothesis’ sparked much interest, as critical behaviour has been shown to entail many potential functional advantages such as maximal dynamical range, exquisite sensitivity to signals, vast information processing and storage capabilities, etc. [8,62].1 Importantly, it has been proposed that the concept of criticality—which ultimately requires parameter fine-tuning or some self-organization mechanism to be achieved [8,64]—can be made more generic or robust by invoking Griffiths phases [65–67]. These are extended regimes—sharing many of the remarkable features of critical points—that appear in physical systems with some inherent structural heterogeneity (such as actual brain networks) [67–69]. For instance, as illustrated in figure 1a, a Griffiths-like phase emerges in a dynamical model of phase synchronization running on top of a hierarchical-modular network, i.e. a human connectome network, and it is characterized by a broad set of parameter values with intermediate, variable, and transient levels of frustrated synchronization [69,71,73]. This is in contrast to the standard picture observed for the same dynamical model running on more homogeneous random networks where large variability is encountered only in a small neighbourhood around the critical point (figure 1a). Thus, our group conjectured that Griffiths-like phases could lie at the basis of the remarkable variability of complex synchronization patterns in actual brain dynamics, thus facilitating a flexible balance between segregation and integration [69,71,73].

    Figure 1.

    Figure 1. Structural and dynamical ingredients for dynamical richness. (a) Mean value (continuous line) and standard deviation (shaded region) of the overall synchronization order parameter, R, as measured for the standard Kuramoto model on top of a random Erdös–Rényi (ER) network and on a human-connectome structural network [70] (size N=998 in both cases [71]). In the second case, there is a broad Griffiths-like region of ‘frustrated synchronization’, where the level of synchronization is very variable, as opposed to the standard ER case in which large levels of variability are only observed in the neighbourhood of the critical point (plot adapted from [71]). (b) Schematic phase diagram of the model described by equation (2.1) (with a constant frequency, ω) on infinitely-large random networks. The transition between the synchronous and asynchronous phases can occur either through a Hopf bifurcation (red line), a SNIC bifurcation (green line), or—as a hybrid-type synchronization—through an intermediate region of bistability (blue region) delimited by three codimension-2 bifurcations: a Bogdanov–Takens, a saddle-node-loop, and a cusp bifurcation, as well as by lines of homoclinic and saddle-node bifurcations (see [72] and refs. therein). (c) Raster plots illustrating crossings of each network unit through a given phase value (2π) when the dynamics operate at the hybrid-type synchronization transition in ER, hierarchical-modular random (HM-R), and hierarchical-modular core-periphery (HM-CP) networks; much-richer patterns emerge in the latter cases. (d) Visual representation of the three types of considered networks. Each dot represents a different node, while the colour code and spatialization highlight the network hierarchical modularity (HM-R and HM-CP), or its absence (ER). (Online version in colour.)

    Summing up: even if our knowledge on the nature of both structural and functional brain networks has been remarkably enhanced in recent years, there are still key general questions that remain unanswered. In particular, we ask here: what are the crucial aspects of (i) the structural architecture of brain networks and, (ii) the dynamics at individual mesoscopic nodes that determine together the largest variability of synchronization patterns, thus allowing for a flexible balance between segregation and integration? What is the role played by different types of criticality, i.e. diverse bifurcation types at the individual mesoscopic units? Are Griffiths-like phases relevant to describe brain activity?

    2. Modelling the interplay between structure and dynamics

    To shed further light on the previous questions, here we perform exploratory research in which we analyse diverse types of synthetic structural networks capturing key aspects of brain connectivity, combined with parsimonious dynamical models for neural synchronization [74–76] that, notwithstanding their simplicity, can give rise to a rich collective phenomenology, including a large variety of bifurcation types (not only Hopf ones) at the mesoscopic level [72].

    Structural-network models. We consider a sort of network of networks, in which each mesoscopic unit is actually represented by a basal network module including a large cluster of densely-connected nodes. These basal networks are linked together following three different types of unweighted and undirected architectures with a fixed total number of nodes: (i) random Erdös-Rényi (ER) networks [77], (ii) hierarchical-modular random (HM-R) networks without hubs [67,78], and (iii) hierarchical networks with a core-periphery (HM-CP) structure involving central connector hubs (so that the degree distribution is scale free) [78] (see Methods). In particular, HM-R networks are already known to exhibit Griffiths-like collective phases of frustrated synchronization [69,71,73], while less heterogeneous ER networks do not exhibit such phases, and HM-CP networks have not been so far explored from this perspective. Thus, our main focus here is on analysing whether the core-periphery structure fosters or hinders the emergence of broad Griffiths-like phases.

    Dynamical model. The standard minimal model for collective synchronization is the one proposed by Kuramoto [74,79]: a large set of N individual oscillators, characterised by an intrinsic random frequency and coupled together following some connectivity pattern (e.g. a fully-connected network or a random network architecture). Typically, as the coupling strength increases, the collective state experiences a Hopf bifurcation representing the transition from an asynchronous to a synchronous/oscillatory state (i.e. the type of bifurcation considered in the above-mentioned whole-brain computational models [51,55]). A more general phase model is

    φ̇_j = ω_j + a sin φ_j + J Σ_{i=1}^{N} W_{ij} sin(φ_i − φ_j) + σ η_j(t),    (2.1)
    where each individual unit, j, is characterized by its phase φ_j and intrinsic (random) frequency ω_j, W_ij is the connectivity matrix, J is the overall coupling strength, and η_j(t) is a delta-correlated Gaussian white noise [80]. The term a sin φ_j accounts for the excitable nature of individual units (for a < ω_j the units oscillate, while for a > ω_j the units remain trapped in an ‘excitable state’ which can be perturbed to become transiently oscillatory by noise or external inputs [81]). In the limiting case a = σ = 0, equation (2.1) reduces to the standard Kuramoto model.
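For concreteness, equation (2.1) can be integrated numerically with a simple Euler–Maruyama scheme. The following Python sketch is an illustration under our own parameter choices (function name and defaults are ours), not the code used to produce the reported results:

```python
import numpy as np

def simulate_phases(W, omega, J=1.0, a=0.0, sigma=0.2,
                    dt=0.01, n_steps=1000, seed=None):
    """Euler-Maruyama integration of the noisy phase model, equation (2.1).

    W     : (N, N) symmetric connectivity matrix W_ij
    omega : (N,) intrinsic frequencies omega_j
    Returns the (n_steps, N) array of (unwrapped) phases.
    """
    rng = np.random.default_rng(seed)
    N = len(omega)
    phi = rng.uniform(0.0, 2.0 * np.pi, size=N)
    history = np.empty((n_steps, N))
    for t in range(n_steps):
        # coupling term for unit j: sum_i W_ij sin(phi_i - phi_j)
        coupling = (W * np.sin(phi[:, None] - phi[None, :])).sum(axis=0)
        drift = omega + a * np.sin(phi) + J * coupling
        # delta-correlated Gaussian noise enters with amplitude sigma*sqrt(dt)
        phi = phi + dt * drift + sigma * np.sqrt(dt) * rng.standard_normal(N)
        history[t] = phi
    return history
```

With a = σ = 0 this reduces to the standard Kuramoto model; for identical frequencies and strong enough J on a densely connected network, the phases lock into a synchronous state.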

    Equation (2.1) has been profusely studied in the literature for both homogeneous frequencies and for specific frequency distributions [72,80,82–84]. Remarkably, as sketched in figure 1b, the emerging dynamics is described by a very rich phase diagram, including synchronous and asynchronous phases, as well as different types of bifurcation lines (or critical points), separating them (the diagram in figure 1b has been theoretically obtained for homogeneous frequencies on a fully connected network [72], but a qualitatively identical one arises when considering a heterogeneous frequency distribution with a and/or σ fixed to 0 [72,80,82–84]).

    In this very general phase diagram, there exist two main ways to enter the synchronous phase in figure 1b: Hopf bifurcations (the fingerprint of ‘type-II’ synchronization) and saddle-node-on-invariant-circle (SNIC) bifurcations (the fingerprint of ‘type-I’ synchronization), which are qualitatively very different from each other and represent the two main types of synchronization transitions (see e.g. [81]). However, as illustrated in figure 1b, in between the bifurcation lines of these two standard classes there is also a triangular-shaped region of bistability—with coexisting states of either low or high activity—delimited by three codimension-2 bifurcation points (namely, a Bogdanov–Takens, a saddle-node-loop and a cusp; for details see [72,82] and refs. therein). A detailed discussion of this admittedly complicated phase diagram is beyond the scope of the present brief paper; let us just stress the one aspect important for our purposes here: our group has recently uncovered that when entering the synchronous phase through such a region of bistability—i.e. through a so-called hybrid-type synchronization transition—a much richer phenomenology emerges, including scale-free avalanches (a typical signature of criticality in neural systems [8,59]) [72]. This result inspired us to explore the effects that such a hybrid-type transition occurring at the basal moduli could have on the overall macroscopic dynamics of complex brain networks. Does it enhance the overall dynamical richness as compared to, e.g. the standard Hopf (type-II) route to synchronization (as usually employed in the above-mentioned whole-brain modelling approaches)?

    Thus, in what follows, we computationally analyse equation (2.1) on top of either ER, HM-R or HM-CP networks of mesoscopic regions as described above, and we choose parameters so as to enter the synchronous regime at each node through either a type-I, a type-II or a hybrid-type transition. As a first observation, for the sake of visual illustration, let us mention that at the ‘hybrid-type’ transition one can generate very rich dynamical patterns, with partial and transient synchronization between basal moduli, on hierarchical networks (especially on HM-CP ones), which are not observed on simpler ER networks (figure 1c). On the other hand, much simpler patterns appear near Hopf or SNIC bifurcations (not shown here, but qualitatively similar to those for the ER case above; see [72]). The results in this example correspond to the case of homogeneous frequencies (as in [72]); nevertheless, given that frequency heterogeneity is an important ingredient in brain dynamics, in what follows we present much more detailed results for the case of a Gaussian bimodal frequency distribution, g(ω) = (1/2)[N(ω_0, Δ²) + N(−ω_0, Δ²)], fixing a = 0. The reason why we use this model is that its phase diagram—which qualitatively coincides with that sketched in figure 1b (but as a function of its own free parameters, Δ and ω_0)—is exactly known [84], which greatly helps in fixing parameter values to locate the different bifurcation types.
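Sampling from this bimodal distribution amounts to choosing one of the two symmetric Gaussian peaks with equal probability for each unit; a minimal sketch (function name is ours):

```python
import numpy as np

def bimodal_frequencies(N, omega0, delta, seed=None):
    """Draw N intrinsic frequencies from the symmetric bimodal Gaussian
    g(w) = (1/2)[N(omega0, delta^2) + N(-omega0, delta^2)]."""
    rng = np.random.default_rng(seed)
    peaks = rng.choice([-omega0, omega0], size=N)   # pick a peak per unit
    return peaks + delta * rng.standard_normal(N)   # Gaussian jitter of width delta
```

By construction the sample is symmetric around zero, with the typical frequency magnitude set by ω_0 and the peak width by Δ.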

    3. Results and protocols for further advances

    First of all, to reduce the number of parameters, we fixed a = 0 and a small noise amplitude, σ = 0.2, to allow for system variability, scrutinizing the system behaviour as a function of the remaining free parameters (Δ, ω_0 and J). As discussed above, changing these parameters allows one to shift the dynamical regime of basal moduli and encounter the different bifurcation types—Hopf, SNIC or hybrid-type transitions—separating synchronous from asynchronous regimes (which depend on the ratios Δ/J and ω_0/J; see figure 1b and a detailed phase diagram in [84]). We first verified that, very robustly, the synchronous phase becomes a region of ‘frustrated synchronization’ or Griffiths-like phase in both types of hierarchical-modular network (HM-R and HM-CP) owing to the presence of structural and frequency heterogeneity. In other words, as illustrated in the time series shown in the insets of figure 2a–c, the synchronization Kuramoto order parameter R exhibits high variability (variance) in a broad region of parameter space [71,73] (let us remark that such variability does not disappear in infinitely-large networks [71]). Note also that similar regimes do not appear in random ER networks (figure 2a–c). This observation confirms that hierarchical-modular networks are particularly well suited to host large dynamical variability and that such variability is enhanced in networks with the additional feature of a core-periphery structure. Observe that also in the ‘asynchronous’ phase one can have some residual level of variable synchronization, especially near the SNIC bifurcation (though this is a spurious effect stemming from the employed bimodal frequency distribution and could be removed by defining a slightly different order parameter).
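The variability just described refers to the time series of the Kuramoto order parameter; a short sketch (our own helper functions) of how R(t) and its fluctuations can be measured from an array of simulated phases:

```python
import numpy as np

def order_parameter(history):
    """Kuramoto order parameter R(t) = |(1/N) sum_j exp(i*phi_j(t))|,
    computed for each row (time step) of a (T, N) phase array."""
    return np.abs(np.exp(1j * history).mean(axis=1))

def r_statistics(history, discard=0):
    """Mean and standard deviation of R(t), optionally discarding a transient."""
    r = order_parameter(history)[discard:]
    return r.mean(), r.std()
```

R = 1 corresponds to full synchrony and R ≈ 0 to incoherence; the standard deviation of R(t) is the variability plotted as the shaded regions of figures 1a and 2a–c.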

    Figure 2.

    Figure 2. Variability across network types and dynamical regimes: mean and standard deviation of the overall synchronization Kuramoto order parameter, R, as measured for the noisy Kuramoto model on top of an ER network (blue circles), a HM-R network (orange squares) and a HM-CP network (green triangles) for the three types of bifurcation: (a) Hopf bifurcation (ω_0 = 0, Δ = 0.25), where J acts as control parameter; (b) SNIC bifurcation (J = 1, Δ = 0.15); and (c) hybrid-type transition (J = 1, Δ = 0.28), where ω_0 acts as control parameter. The insets show, for every case, typical time series for parameters corresponding to the vertical dashed line. Observe the emergence of a broad Griffiths-like region of ‘frustrated synchronization’ both for the HM-R and HM-CP networks but not for ER ones. (d) Raster plots for the HM-CP network at the three selected bifurcation types: Hopf (J = 0.8), SNIC (ω_0 = 0.41) and hybrid-type (ω_0 = 0.37). (e) Distribution of Fano factors across nodes for different networks and different selected points in parameter space. (i) Hopf bifurcation: synchronous phase (green solid line, J = 0.6), critical point (orange dashed line, J_ER = 0.45, J_HM-R = 0.49, J_HM-CP = 0.45) and asynchronous phase (blue dotted line, J = 0.2); (ii) SNIC bifurcation: synchronous phase (ω_0 = 0.3), critical point (ω_0,ER = 0.41, ω_0,HM-R = 0.38, ω_0,HM-CP = 0.36) and asynchronous phase (ω_0 = 0.5); (iii) hybrid synchronization: synchronous phase (ω_0 = 0.3), critical point (ω_0,ER = 0.36, ω_0,HM-R = 0.35, ω_0,HM-CP = 0.34) and asynchronous phase (ω_0 = 0.5). Other parameter values: N_ER = 1500, N_HM-R = N_HM-CP = 1600. (Online version in colour.)

    Let us stress that the Griffiths-like regime is generic across a broad region in phase space and is thus not specific to the type of transition through which such a phase is entered. However, as illustrated in the raster plots of figure 2d (raster plots are obtained by plotting a dot each time an individual oscillator crosses a predefined phase value, e.g. φ = 2π), the qualitative features and variability of the resulting time series within the Griffiths-like phase can vary depending on the bifurcation type near which the system lies.
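Operationally, such a raster can be extracted from simulated (unwrapped) phase trajectories by recording every crossing of a multiple of 2π; a sketch under that assumption (function name is ours):

```python
import numpy as np

def spike_raster(history, dt):
    """Return, for each unit, the times at which its unwrapped phase
    crosses a multiple of 2*pi (one 'spike' per full turn).

    history : (T, N) array of unwrapped phases, sampled every dt time units.
    """
    raster = []
    for phi in history.T:                       # one phase trajectory per unit
        winding = np.floor(phi / (2.0 * np.pi))  # integer number of completed turns
        idx = np.nonzero(np.diff(winding) > 0)[0] + 1
        raster.append(idx * dt)                  # crossing times for this unit
    return raster
```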

    In order to quantify the levels of variability—within and across basal moduli—of the resulting time series, we employ the Fano factor (FF), a quantity that takes values FF ≈ 0 for regular ‘spikes’, FF ≈ 1 for a random Poisson process and FF ≫ 1 for highly irregular spikes (see Methods and [4]). Figure 2e shows the probability distribution of FFs calculated across individual units, p(FF), for different network types and diverse dynamical regimes, either near or far from bifurcation points. For ER networks, large values are observed only if the system is fine-tuned to its critical point; otherwise, the FF always takes very low values. More generally, for all networks, the largest values and variability of the FF are typically observed at critical transition points. Nevertheless, observe also that regions of significantly large variability are obtained within the Griffiths-like phase for HM networks. More specifically, the FF probability distributions reveal that the largest variability is obtained when the dynamics of basal moduli is tuned to operate near the hybrid-type synchronization phase transition, with the remarkable exception of the HM-CP case: the core-periphery structure always displays a large variability, which is maximal near the hybrid transition and the SNIC bifurcation.2

    To shed light on this last observation, let us remark that the parameters needed to observe each type of synchronization transition have been chosen assuming isolated basal moduli. However, external inputs coming from the coupling to other basal moduli can alter the local behaviour of each of them. Moreover, given the finite size of basal moduli, deviations from the exactly known mean-field behaviour can occur. Also, the small size of the hybrid-type transition region (which is not easy to locate numerically [72]), combined with finite-network-size effects, could introduce numerical errors in the location of the point of maximal variability. For all these reasons, even if the hybrid-type transition can in principle ensure the maximal possible variability at a given mesoscopic unit, systems tuned to lie near the SNIC bifurcation also seem to produce very large levels of variability at a global scale (see, e.g. figure 2d). In any case, the previous results confirm the robustness of the Griffiths phase, with large degrees of variability, for different types of bifurcation (see raster plots and FF distributions in figure 2d,e for the HM-CP network).

    Thus, in summary, the joint evaluation of phase diagrams and FFs supports that: (i) the dynamical richness is much more prominent in hierarchical-modular networks than in random ones; (ii) HM-CP networks, including connector hubs, can sustain more extensive dynamical richness than simple HM-R networks; (iii) in all hierarchical-modular cases, broad regimes of dynamical richness are generated across the ‘frustrated synchronization’ or ‘Griffiths-like’ phase, without the explicit need for fine-tuning parameter values close to the edge of a bifurcation line or critical point.

    Let us finally mention that detailed analyses of the emerging short-time correlation matrices and their dynamical changes—which can be very illuminating to investigate the segregation-integration balance—are still missing. Further systematic analyses of the temporal structure of dynamical functional networks (following, e.g. the pioneering work of Deco’s group [31,32]) are in progress and will be reported elsewhere. Additionally, we also leave for future work a detailed analysis of how the network architecture affects the statistics of avalanches near the different transition points; owing to structural heterogeneity we expect the emergence of broad Griffiths regions with continuously varying avalanche exponents [65–67], but studies employing much larger network sizes would be required to confirm or exclude this possibility in models with either homogeneous or heterogeneous frequency distributions.

    4. Conclusion and discussion

    Understanding which are the chief structural and dynamical aspects that confer their extremely rich dynamical repertoire to actual brain networks is a challenge of utmost importance. For instance, as argued in the Introduction, such a large dynamical repertoire is a necessary condition to support a rich and flexible balance between segregation and integration. In spite of very important advances—both empirical and computational—some aspects still remain obscure and/or controversial. In this brief paper, we have proposed a strategy to explore the problem of the emerging collective variability of functional and transient synchronization: it consists in considering simple structural networks and minimal dynamical models for the neural activity at single nodes (coupled together following the underlying network connectivity), with the goal of scrutinizing in a systematic way the emerging variability in synchronization levels across time and network scales, and how it depends on key features of both the structure and the dynamics.

    The results presented here constitute a first approximation to this ambitious long-term project; much more extensive and systematic analyses along similar lines are still needed and will be presented elsewhere. Nevertheless, there are a number of features that become clear already from the present study. Among the few network architectures analysed here, those with a hierarchical-modular structure exhibiting also connector hubs, i.e. a core-periphery structure, are those supporting the largest possible range of dynamical variability. Other structural aspects such as connectivity weights [86] and directionality [87] are known to play a key role and need to be incorporated and studied in future extensions of this work. On the other hand, from the dynamical side, the main conclusion is that large levels of variability can be obtained in broad regions of parameter space, corresponding to Griffiths-like phases of ‘frustrated synchronization’, where transient and variable levels of synchrony are observed. If one insists on fine-tuning the dynamics, the largest variability is obtained nearby hybrid-type transitions.

    We plan to extend the present work in a number of directions. First of all, we intend to analyse structural matrices (including empirical ones) employing spectral graph theory, to understand their key features in a more systematic and consistent way. This will allow us to shed light on the structural modes that the topology supports (see, e.g. [54,63]). The key idea is to determine at a graph-spectral level which are the dynamical regimes leading to the broader and more variable spectrum of excitations (see e.g. [48]). We also plan to analyse more detailed dynamical models (beyond simple phase models) such as the Landau–Ginzburg mesoscopic theory and Wilson–Cowan models [53,88]. For example, it has recently been found that chaotic deterministic oscillators can generate a richer and more robust dynamical variability than stochastic systems sitting at the edge of a bifurcation [89]. We plan to explore this possibility by analysing more neuro-physiologically motivated chaotic systems in the spirit of the Landau–Ginzburg approach [88,90]. Other potentially important aspects, such as (i) time-delays in the node couplings and (ii) more realistic frequency distributions (as well as their correlation with node connectivity and network location [91], etc.) could also be considered. It is our hope that this type of modelling-oriented systematic approach will help shed light on the rich collective dynamical regimes sustaining the functionality of actual brains, bridging the gap between structure and dynamically emergent high-level functions.

    5. Methods

    (a) Synthetic network architectures

    HM-R and HM-CP networks are built employing the algorithms proposed in [67] and [78], respectively. Both models generate hierarchical-modular structures and are flexible enough to produce diverse networks depending on parameters such as the number of hierarchical levels, the total connectivity and the number of nodes in the basal modules. The HM-R case has two large-scale modules (hemispheres), each composed of two sub-modules, and so on, for a total of six hierarchical levels. The resulting 64 basal modules are ER networks with N = 25 units and connectivity per node k0 = 12 (see [67]). The first hierarchical layer above the basal one has k = 5, which is progressively reduced down to a value k = 0.1 for the links connecting both hemispheres. The HM-CP network is constructed with identical basal modules, the same number of layers and identical average connectivity, using the scale-free exponent γ = 2 for all levels except the basal one [78].
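    The HM-R construction just described can be sketched as follows. This is a minimal illustration, not the algorithm of [67]: the pairwise merging of modules and the geometric interpolation of the inter-layer connectivity from k = 5 down to k = 0.1 are our assumptions, and the function name `hm_r_network` is ours.

```python
import random

def hm_r_network(n_basal=64, n_per_module=25, k0=12,
                 k_first=5.0, k_last=0.1, seed=0):
    """Sketch of an HM-R network, returned as (n_nodes, edge_set).

    ER basal modules are merged pairwise over hierarchical levels,
    with inter-module connectivity per node decreasing from k_first
    (lowest layer) to k_last (links between hemispheres)."""
    rng = random.Random(seed)
    n_total = n_basal * n_per_module
    edges = set()
    # Basal Erdos-Renyi modules with mean degree k0.
    p = k0 / (n_per_module - 1)
    for m in range(n_basal):
        base = m * n_per_module
        for i in range(n_per_module):
            for j in range(i + 1, n_per_module):
                if rng.random() < p:
                    edges.add((base + i, base + j))
    # Node groups, merged pairwise at each hierarchical level.
    groups = [list(range(m * n_per_module, (m + 1) * n_per_module))
              for m in range(n_basal)]
    n_levels = n_basal.bit_length() - 1          # 6 levels for 64 modules
    for level in range(n_levels):
        t = level / (n_levels - 1)
        # Geometric interpolation (assumed): 5.0 -> ... -> 0.1.
        k_level = k_first * (k_last / k_first) ** t
        merged = []
        for a, b in zip(groups[0::2], groups[1::2]):
            # Add roughly k_level inter-module links per node.
            for _ in range(max(1, round(k_level * (len(a) + len(b)) / 2))):
                u, v = rng.choice(a), rng.choice(b)
                edges.add((min(u, v), max(u, v)))
            merged.append(a + b)
        groups = merged
    return n_total, edges

n, edges = hm_r_network()
print(n)   # → 1600
```

    With the defaults above this yields 64 × 25 = 1600 nodes; the resulting adjacency can then be fed to any phase-oscillator integrator.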

    (b) Fano factor

    The FF of a time series X(t) (either continuous or discrete in time) is defined as the ratio between its variance and its mean, FF = Var(X)/⟨X⟩. We computed the FF over inter-spike intervals in the raster plot [4], where a spike is defined each time the oscillator phase crosses 2π. For each individual unit, the inter-spike interval is the time between two consecutive events in the raster plot, i.e. ISI_j = t_{j+1} − t_j for j = 1, …, n_spikes. This is repeated for each unit, yielding FF_i for i = 1, …, N, and the histogram is then computed averaging over all oscillators.
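    The per-unit computation can be illustrated as follows (a minimal sketch; the function name and the synthetic spike trains in the usage example are ours, not from the text):

```python
import numpy as np

def isi_fano_factors(spike_times):
    """Per-unit Fano factor of inter-spike intervals (ISIs).

    spike_times: list of 1-D arrays; spike_times[i] holds the times at
    which oscillator i crossed phase 2*pi (its events in the raster).
    Returns FF_i = Var(ISI_i) / <ISI_i> for every unit with at least
    two intervals; the histogram over these values is then built."""
    ffs = []
    for t in spike_times:
        isi = np.diff(np.sort(np.asarray(t, dtype=float)))
        if len(isi) >= 2 and isi.mean() > 0:
            ffs.append(isi.var() / isi.mean())
    return np.array(ffs)

rng = np.random.default_rng(0)
regular = np.arange(0.0, 100.0, 1.0)            # perfectly periodic unit
poisson = np.cumsum(rng.exponential(1.0, 1000))  # Poisson-like unit
ffs = isi_fano_factors([regular, poisson])
print(ffs[0])   # → 0.0 (identical ISIs have zero variance)
```

    As a sanity check, a perfectly periodic unit gives FF = 0, while a Poisson-like spike train with unit-mean exponential ISIs gives FF ≈ 1, which brackets the regular-to-irregular range the FF is meant to quantify.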

    Data accessibility

    This article has no additional data.

    Authors' contributions

    All authors conceived and designed the study, and they all drafted, read and approved the manuscript. V.B. and P.V. carried out the computational simulations and numerical analyses. M.A.M. and R.B. supervised the research.

    All authors gave final approval for publication and agreed to be held accountable for the work performed therein.

    Competing interests

    We declare we have no competing interests.

    Funding

    M.A.M. acknowledges the Spanish Ministry and the Agencia Estatal de Investigación (AEI) through grant nos FIS2017-84256-P and PID2020-113681GB-100 (European Regional Development Fund), as well as the Consejería de Conocimiento, Investigación y Universidad, Junta de Andalucía and the European Regional Development Fund, refs A-FQM-175-UGR18 and P20-00173, for financial support. V.B. acknowledges support from the Sofja Kovalevskaja Award from the Alexander von Humboldt Foundation, endowed by the Federal Ministry of Education and Research. R.B. acknowledges funding from the INFN BIOPHYS project. We also thank Cariparma for their support through the TEACH IN PARMA project.

    Acknowledgements

    We thank G.B. Morales and J. Pretel for extremely valuable comments and P. Moretti, J. Cortes and S. di Santo for a long-term collaboration on the topics discussed here.

    Footnotes

    1 This idea is also linked to the ‘entropic brain’ hypothesis [44,63].

    2 Previous studies have also employed the Fano factor to assess the variability of brain activity (see [85]); note that the Fano factor can be computed for different observables, and in the cited case it is applied to the variability of instantaneous firing rates over finite-sized temporal windows rather than to inter-spike intervals. Despite the methodological differences, our results are compatible with the distributions shown in [85].

    One contribution of 17 to a theme issue ‘Emergent phenomena in complex physical and socio-technical systems: from cells to societies’.

    Published by the Royal Society. All rights reserved.