
Detecting critical transitions before they occur is challenging, especially for complex dynamical systems. While some early-warning indicators have been suggested to capture the slowing down of a system's response near critical transitions, their applicability to real systems remains limited. In this paper, we propose the concept of predictability based on machine learning methods, which leads to an alternative early-warning indicator. The predictability metric takes a black-box approach and assesses the impact of uncertainties in identifying abrupt transitions in time series. We have applied the proposed metric to time series generated from different systems, including an ecological model and an electric power system. We show that the predictability changes noticeably before critical transitions occur, while general indicators such as variance and autocorrelation fail to produce any notable signals.
Citation: Jaesung Choi, Pilwon Kim. Early warning for critical transitions using machine-based predictability[J]. AIMS Mathematics, 2022, 7(11): 20313-20327. doi: 10.3934/math.20221112
Complex dynamical systems, including organisms, ecosystems, engineering systems and financial markets, often exhibit sudden transitions from a stable state to an alternative one with different features [2,14,15]. Such sharp regime shifts can occur unexpectedly with no explicit trigger, and therefore are hard to predict and manage.
It has been noted that systems near critical points share general properties regardless of differences in detail [21]. One of the most important clues is the so-called 'critical slowing down,' which implies that the system becomes increasingly slow in recovering from small perturbations as it approaches such critical points [20]. The slow recovery from natural perturbations near bifurcations leads to an increase in the autocorrelation and variance of the fluctuating response [1,9]. Other conventional metrics, such as spectral density and skewness, have also been used as early-warning indicators (EWIs) for critical transitions [2]. When applied to real systems, however, conventional EWIs often vary in reliability and fail to produce consistent results [5]. This implies that detecting critical transitions remains a daunting task, especially when we have insufficient knowledge of the system's transition-generating mechanisms.
In this paper, we propose the concept of predictability for time series data. While the specific value of the predictability depends on the machine learning method we choose, it can be used as a simple and flexible EWI for a time series. Roughly put, the concept is based on the observation that complex systems become increasingly unpredictable as they approach tipping points. If we use a time-series predictor with a limited capacity, its prediction error tends to grow as the system moves toward a critical point. Conversely, the metric can also be used to detect the emergence of order in systems that start to develop a higher level of patterns across a critical transition.
In the following, we present a definition of predictability as a metric along the time series and provide a methodological guide for evaluating it as a leading indicator for detecting critical transitions. In many dynamical systems, dramatic changes often arise from a time-dependent drift of the parameters [12,16]. We have applied the proposed metric to typical empirical models that develop critical transitions through parameter drift, including a food-chain model, an electric power system and coupled chaotic oscillators. We compare the performance of the predictability to that of other EWIs, such as variance and autocorrelation.
This section introduces the concept of the predictability of a time series. By the term predictor, we simply mean a procedure that derives an output time series from an input time series, which we naturally interpret as the future and the past of the series, respectively. For example, linear and quadratic regressions are two elementary predictors. We assume that the lengths of the input and output time series can vary for a specific predictor. It is reasonable to expect that the longer the time series fed to the predictor, the more accurate the prediction. In practice, however, such improvement occurs only over a limited range, owing either to the nature of the given time series or to the limited capacity of the predictor.
Suppose we have a sufficiently reliable predictor that can be used for a general time series. The concept of predictability for complex systems is based on the following hypothesis: the performance of the predictor changes as the system approaches critical transitions. In the situation where the predictor estimates the system's future behavior with reasonable precision, the hypothesis implies that we will need more data to make a prediction of the same quality once the system begins to undergo a critical transition. If the predictor must use an input of the same size all along, its performance likely degrades near critical points. Considering that we always have limited resources for prediction in reality, the above statement essentially says that a system near a critical state is harder to predict than one in a usual state. Conversely, for systems that consist of many interacting factors, a critical transition induced by synchronization can dramatically change the system's global, long-term behavior into a more predictable one. For example, stock prices are notoriously hard to forecast with any degree of reliability. However, under some conditions, investors in the market start to trade in the same direction and mimic the actions of others, eventually producing a large-scale pattern of soaring or crashing prices. A critical transition toward synchronization like this can greatly raise the predictability of the system from what was effectively zero before.
For a formal definition of the predictability, let us reformulate the prediction of a time series as a supervised learning problem in machine learning. Define a prediction process S as S = (M, l_w, l_tr), where M is a predictor, l_w > 0 is the length of the sliding window and l_tr is the length of the training set. Also, l_te = l_w − l_tr is the length of the test set. Refer to Figure 1(a) and (b).
To define the predictability of a given time series at time t, we apply the predictor M to the prior part of the time series. Figure 1(a) shows that the time series in the sliding window [t − l_w, t] is used to evaluate the metric. At each time t, the corresponding sliding window is split into two parts: the training set and the test set. The proportions may vary with time. We use the time series in [t − l_w, t − l_w + l_tr] to train the predictor, and then we have the predictor yield the future time series for the remaining part of the sliding window, [t − l_w + l_tr, t]. The prediction error, E_S(t), is assessed by comparing this result with the "real" time series from the test set in [t − l_w + l_tr, t]. Note that this error generally decreases as the portion of the training set increases (or, equivalently, as the portion of the test set decreases). Compare Figure 1(b) and (c). At each t, we basically want to adjust l_tr (or, equivalently, l_te) to keep the error E_S(t) within the tolerance level, tol.
Let us define l*_tr and l*_te such that

l*_tr = max{ l_tr ∈ (0, l_w) : E_S(t) > tol },   l*_te = l_w − l*_tr.   (2.1)

The predictability, Pred(t), is now defined as the maximum ratio of the test set to the sliding window that can be predicted within the error tolerance. That is,

Pred(t) = l*_te / l_w.   (2.2)
For example, suppose the data form a trivial constant time series. Any reasonable predictor can easily capture the constant behavior even from a small training set, that is, with l_tr/l_w ≪ 1. This implies Pred(t) ≈ 1 for the constant time series. The other extreme case is white noise, which has no time-correlated pattern and therefore allows no meaningful prediction. In that case, no matter how much of the sliding window is used as the training set, the predictor fails to reproduce the time series. This means that Pred(t) ≈ 0 for white noise.
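The procedure above can be sketched in code. The following is a minimal illustration, not the authors' implementation: the split grid and the `persistence` stand-in predictor are assumptions for demonstration. It scans the training/test split of one window and returns the largest test fraction forecast within the tolerance, reproducing Pred(t) ≈ 1 for a constant series and Pred(t) ≈ 0 for white noise.

```python
import numpy as np

def prediction_error(y_true, y_pred):
    # Relative L2 error, in the spirit of Eq (3.4).
    return np.linalg.norm(y_true - y_pred) / np.linalg.norm(y_true)

def predictability(window, predictor, tol=0.1, n_splits=20):
    """Pred(t) for one sliding window (Eqs 2.1-2.2): scan the
    training/test split and return the largest test fraction whose
    forecast stays within the error tolerance."""
    lw = len(window)
    best = 0.0
    for frac in np.linspace(0.05, 0.95, n_splits):
        n_tr = int(frac * lw)
        train, test = window[:n_tr], window[n_tr:]
        if len(train) < 2 or len(test) == 0:
            continue
        if prediction_error(test, predictor(train, len(test))) <= tol:
            best = max(best, len(test) / lw)
    return best

# A deliberately weak stand-in predictor: repeat the last value.
def persistence(train, horizon):
    return np.full(horizon, train[-1])

const = np.ones(1000)
noise = np.random.default_rng(0).normal(size=1000)
print(predictability(const, persistence))   # close to 1 (constant series)
print(predictability(noise, persistence))   # close to 0 (white noise)
```

In practice the ARIMA or ESN predictors described below would replace `persistence`.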
While the concept of predictability is introduced to capture changes in the features of a time series, its validity rests on the assumption of unbiased performance of the adopted predictor. Such a requirement can be fulfilled by the most popular time-series predictors. In this work, we use two predictors, one from statistics and the other from machine learning. The autoregressive integrated moving average (ARIMA) model is one of the most widely used statistical approaches to time-series forecasting; it incorporates autocorrelation to model the temporal structure of the data. We evaluate the predictability using ARIMA as the predictor in the first numerical example in Section 4. However, since ARIMA captures only linear structure (handling non-stationarity in the mean through differencing), it is not appropriate for forecasting complex nonlinear systems. We have therefore adopted a more general machine learning method, echo state networks (ESNs), for the other prediction tasks.
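For the ARIMA(0, 1, 1) predictor used later, point forecasts coincide with simple exponential smoothing, which permits a dependency-free sketch. In practice one would fit the model with a statistics package; the smoothing weight `alpha` below is an illustrative choice, not a fitted value.

```python
import numpy as np

def ses_forecast(series, horizon, alpha=0.5):
    """Flat multi-step forecast equivalent to an ARIMA(0,1,1) model:
    its point forecasts coincide with simple exponential smoothing.
    `alpha` is an illustrative smoothing weight, not a fitted value."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level   # update the smoothed level
    return np.full(horizon, level)                # ARIMA(0,1,1) forecasts are flat

y = np.array([1.0, 2.0, 3.0, 4.0])
print(ses_forecast(y, 3, alpha=1.0))   # alpha = 1 reduces to persistence: [4. 4. 4.]
```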
ESNs are a special type of recurrent neural network with two modules: (i) a fixed network called a "reservoir" that reacts to an input nonlinearly and (ii) a trainable linear layer that connects this response to the desired output. ESNs have been widely used for time-series prediction due to their simple and flexible architecture [11,17]. The following is a minimal explanation of the structure of the ESN used in this work; readers interested in the details can refer to [6,10]. The ESN is defined by the evolution equation
r(k+1) = (1 − α) r(k) + α tanh(A r(k) + W_in u(k)),   (3.1)
where r(k) ∈ R^N is the state of the reservoir and u(k) ∈ R^L is an input vector. We use N = 1000 nodes and a leaking rate of α = 0.1 throughout all of the examples. A ∈ R^{N×N} is the internal weight matrix (the adjacency matrix between nodes), created with a pairwise connection probability of 0.02. W_in ∈ R^{N×L} is the input weight matrix, whose entries are randomly generated from the uniform distribution over [−0.085, 0.085].
Suppose a (training) data set of size K is given as (u(1), y(1)), (u(2), y(2)), ⋯, (u(K), y(K)), where u(k) ∈ R^{L_in} is an input vector and y(k) = [y_1(k), y_2(k), ⋯, y_{L_out}(k)]^T ∈ R^{L_out} is the desired output vector for 1 ≤ k ≤ K. By driving the input series u(1), u(2), ⋯, u(K) through Eq (3.1), we obtain r(1), r(2), ⋯, r(K). Solving the minimization problems
w_i = argmin_{w ∈ R^N} ∑_{k=1}^{K} ‖y_i(k) − w^T r(k)‖²_2,   1 ≤ i ≤ L_out,   (3.2)

determines the output weight matrix W_out = [w_1, w_2, ⋯, w_{L_out}]^T ∈ R^{L_out × N}.
The matrix W_out now enables us to approximate an output y from an input u; we first evaluate r in Eq (3.1) with respect to u and then estimate y by ŷ as

ŷ = W_out r.   (3.3)
Considering the evolutionary nature of Eq (3.1), an ESN is especially appropriate for highly correlated data sets. In the examples in Section 4, we collect u(1), u(2), ⋯, u(K) from the dynamics of the models and generate the training set by setting y(k) = u(k+1), 1 ≤ k ≤ K. Once the corresponding W_out is obtained from Eq (3.2) as described above, we repeatedly apply Eq (3.3) to generate predictions ŷ(K) = û(K+1), ŷ(K+1) = û(K+2), ⋯.
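A compact sketch of this training-and-prediction loop is given below. It is a scaled-down illustration, not the paper's code: N = 200 nodes (the paper uses 1000), the spectral-radius rescaling `rho = 0.9` and the small ridge term in the least-squares step are common extra stabilizers assumed here.

```python
import numpy as np

def make_esn(N=200, L=1, alpha=0.1, p=0.02, win_scale=0.085, rho=0.9, seed=42):
    """Build a small ESN for Eq (3.1); N and rho are illustrative choices."""
    rng = np.random.default_rng(seed)
    A = (rng.random((N, N)) < p) * rng.normal(size=(N, N))
    A *= rho / np.max(np.abs(np.linalg.eigvals(A)))   # rescale spectral radius
    Win = rng.uniform(-win_scale, win_scale, size=(N, L))
    return A, Win, alpha

def run_reservoir(A, Win, alpha, U):
    """Drive the input sequence U (K x L) through Eq (3.1), collecting r(k)."""
    r = np.zeros(A.shape[0])
    R = np.empty((len(U), A.shape[0]))
    for k, u in enumerate(U):
        r = (1 - alpha) * r + alpha * np.tanh(A @ r + Win @ u)
        R[k] = r
    return R

def train_readout(R, Y, ridge=1e-6):
    """Least squares for Eq (3.2); the ridge term is an assumed stabilizer."""
    G = R.T @ R + ridge * np.eye(R.shape[1])
    return np.linalg.solve(G, R.T @ Y).T              # Wout, shape (Lout, N)

# One-step-ahead training (y(k) = u(k+1)) on a sine wave.
t = np.linspace(0, 40 * np.pi, 2001)
u = np.sin(t).reshape(-1, 1)
A, Win, alpha = make_esn()
R = run_reservoir(A, Win, alpha, u[:-1])
Wout = train_readout(R[200:], u[201:])                # drop a warm-up transient
err = np.linalg.norm(u[201:] - R[200:] @ Wout.T) / np.linalg.norm(u[201:])

# Free-running prediction: feed estimates back as inputs via Eq (3.3).
r, x = R[-1], u[-1]
preds = np.empty((100, 1))
for i in range(100):
    r = (1 - alpha) * r + alpha * np.tanh(A @ r + Win @ x)
    x = Wout @ r
    preds[i] = x
```

Here `err` is the relative training-fit error in the sense of Eq (3.4), and `preds` holds the free-running forecast û(K+1), û(K+2), ⋯.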
We use the following error to assess the quality of the predictions in this study:
E_S(t) = ‖y − ŷ‖_2 / ‖y‖_2.   (3.4)
As a final practical remark, we added small multiplicative noise to the input in Eq (3.1) as follows:
u(k) ← (1 + ϵ) u(k),   ϵ ∼ N(0, 10⁻⁸),   (3.5)
which helps the ESN to avoid overfitting in computations [10,22].
In this section, we measure the predictability of time series generated from various models with parameter drift, including stochastic and chaotic ones. Our goal is to detect an early sign of a sudden change from the predictability before the change actually happens. In the following presentation of the numerical experiments, a green vertical dotted line indicates when the system parameter drifts across a bifurcation point, and the moment when the system undergoes visible changes is marked with a red vertical dotted line. Note that the predictor in our examples observes only the time series of (a part of) the system variables and cannot access the parameters. At each moment, we use a sliding window consisting of 10,000 sampling data points and evaluate the predictability to the second decimal place. The error tolerance is tol = 0.1 for the first example and tol = 0.25 for the others. The red solid line plots the moving average of 20 values of the predictability. We compare the predictability to other conventional EWIs, variance and autocorrelation, evaluated within the same sliding window. The time lag τ for the autocorrelation AR(τ) is selected from among τ = 1, 5, 10, 50 and 100 as the value giving the largest relative deviation in the graph of AR(τ) before the critical transition.
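The conventional EWIs used for comparison can be computed within the same sliding window. A minimal sketch follows; the estimator details are assumptions, not taken from the paper.

```python
import numpy as np

def window_ewis(window, lags=(1, 5, 10, 50, 100)):
    """Variance and lag-tau autocorrelations AR(tau) of one sliding window."""
    x = np.asarray(window, dtype=float)
    x = x - x.mean()                           # demean within the window
    var = x.var()
    ac = {tau: float(np.dot(x[:-tau], x[tau:]) / ((len(x) - tau) * var))
          for tau in lags if tau < len(x)}
    return var, ac

rng = np.random.default_rng(1)
window = rng.normal(size=10_000)               # stand-in for one 10,000-point window
var, ac = window_ewis(window)
print(round(var, 1))                           # close to 1 for unit white noise
```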
We consider the simple grazing model [19,20]:
dX/dt = X(1 − X/K) − cX²/(X² + 1) + ϵ η(t),   (4.1)
where X is the biomass of the vegetation. The parameters K, c and ϵ represent the carrying capacity, maximum grazing rate and noise level, respectively. The first term on the right-hand side is the growth rate of the vegetation, and the second term is the total consumption rate by grazing. The third term, η(t), stands for a stochastic force satisfying ⟨η(t)⟩ = 0 and ⟨η(t)η(t′)⟩ = δ(t − t′). The deterministic part of Eq (4.1) undergoes saddle-node bifurcations with respect to c, as shown in Figure 2. We set K = 10 and focus on the "jump-down" transition occurring at c* = 2.60437.
Figure 3(a) shows the time-series data generated from Eq (4.1); the technical details of the time series are given in the caption. The model is known to exhibit critical slowing down as the parameter c drifts across the bifurcation point c* [20]. This phenomenon is captured by the rise of the variance before the collapse, as in Figure 3(c). To evaluate the predictability Pred(t), we adopted ARIMA(0, 1, 1) as the predictor. One can see in Figure 3(b) that the variability of the predictability changes notably before the collapse. This suggests that, for a simple system such as that given by Eq (4.1), both the predictability and the variance can detect an early sign of collapse, while the autocorrelation fails to do so.
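A time series of this kind can be generated by an Euler-Maruyama integration of Eq (4.1) while c drifts across c*. The drift range, noise level, initial condition and step size below are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def simulate_grazing(T=2000.0, dt=0.01, K=10.0, eps=0.05,
                     c0=2.0, c1=2.7, x0=8.0, seed=0):
    """Euler-Maruyama integration of Eq (4.1) while c drifts linearly
    from c0 to c1, crossing the bifurcation at c* = 2.60437."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    xi = rng.normal(size=n)                      # white-noise increments
    X = np.empty(n)
    x = x0
    for i in range(n):
        c = c0 + (c1 - c0) * i / n               # slow parameter drift
        drift = x * (1 - x / K) - c * x**2 / (x**2 + 1)
        x = x + drift * dt + eps * np.sqrt(dt) * xi[i]
        X[i] = x
    return X

X = simulate_grazing()
print(X[0] > 5.0, X[-1] < 1.0)   # starts on the upper branch, ends collapsed
```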
Complex dynamical systems in reality often involve more complicated transition mechanisms than the simple bifurcation in the model given by Eq (4.1). In particular, chaotic systems can take some time to react to parameter shifts, and the visible changes in the time series tend to follow with some delay. Such a transient period is "the critical window of opportunity" in which the predictability metric is supposed to detect the structural change and issue an early-warning signal.
In the next three examples, we apply an ESN as the predictor to compute the predictability of complex dynamical systems. It is shown in [4,7,13] that an ESN can forecast the corresponding systems, and it can even estimate the critical values of the parameters if the parameter values are provided during training. Note again, however, that the predictor in our examples uses a time series of the system variables only, without information on the parameters.
Abrupt species extinctions have been occurring and are increasing rapidly, likely due to shifts in the parameters of ecological systems. We consider a food-chain model of three species [18]:
dR/dt = R(1 − R/K) − x_c y_c C R/(R + R_0),
dC/dt = x_c C (y_c R/(R + R_0) − 1) − x_p y_p P C/(C + C_0),
dP/dt = x_p P (y_p C/(C + C_0) − 1),   (4.2)
Here R, C and P are the population densities of the resource, consumer and predator species, respectively. The values of the other parameters x_c, y_c, x_p, y_p, R_0 and C_0 were taken from [18] and are given in the caption of Figure 4. K is the environmental capacity of the resource species, which is the parameter that can induce a catastrophic bifurcation at the critical value K = K_c = 0.99976. A bifurcation diagram with respect to K is shown in Figure 4. For K slightly greater than K_c, the food-chain model tends to show transient chaos leading to a sudden species extinction.
To generate a time series in P that displays a sudden collapse, we slowly change K across K_c in the middle of the simulation. Figure 5(a) shows the resulting time series; the technical details are given in the caption. Note that the duration of the transient chaos before the collapse (between the green and red lines) varies widely and depends sensitively on the system's initial configuration.
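A sketch of generating such a time series with K drifting across K_c is given below. The parameter values are the commonly quoted ones for the model of [18] (the paper lists its values in the caption of Figure 4), and the drift range, step size and initial condition are assumptions for illustration.

```python
import numpy as np

# Commonly quoted parameter values for the model of [18]; treated as
# assumptions here.
xc, yc, xp, yp, R0, C0 = 0.4, 2.009, 0.08, 2.876, 0.16129, 0.5

def food_chain(s, K):
    """Right-hand side of Eq (4.2)."""
    R, C, P = s
    dR = R * (1 - R / K) - xc * yc * C * R / (R + R0)
    dC = xc * C * (yc * R / (R + R0) - 1) - xp * yp * P * C / (C + C0)
    dP = xp * P * (yp * C / (C + C0) - 1)
    return np.array([dR, dC, dP])

def integrate(K0=0.97, K1=1.02, T=1000.0, dt=0.02, s0=(0.8, 0.2, 0.8)):
    """Fixed-step RK4 for Eq (4.2) while K drifts slowly across Kc."""
    n = int(T / dt)
    s = np.array(s0)
    traj = np.empty((n, 3))
    for i in range(n):
        K = K0 + (K1 - K0) * i / n               # slow drift across Kc = 0.99976
        k1 = food_chain(s, K)
        k2 = food_chain(s + 0.5 * dt * k1, K)
        k3 = food_chain(s + 0.5 * dt * k2, K)
        k4 = food_chain(s + dt * k3, K)
        s = s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[i] = s
    return traj

traj = integrate()   # columns: R, C, P; the predictability is computed on P
```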
Figure 5(b) shows the evolution of the predictability Pred(t), marked with blue circles. It fluctuates over time, likely due to the chaotic nature of the data. The predictability notably drops from 0.7 to 0.2 between the green and red vertical dotted lines. This implies that the predictability successfully captures the structural change in the system before its effect explicitly appears as a local extinction in P. This contrasts with the variance and autocorrelation in Figure 5(c) and (d), respectively, neither of which shows any notable change.
In electric power systems, bifurcations linked with the onset of voltage collapse are of great interest for secure and stable operation. It has been observed that a sudden voltage collapse is often preceded by chaotic fluctuations of some duration. A possible explanation is that disturbances shift the system parameters across a critical value [3], bringing the system into transient chaos before it eventually settles into voltage collapse. We consider the model of an electric power system with voltage collapse from [3,23]:
δ̇_m = ω,
M ω̇ = −d_m ω + P_m − E_m V Y_m sin(δ_m − δ),
K_qw δ̇ = −K_qv2 V² − K_qv V + Q(δ_m, δ, V) − Q_0 − Q_1,
T K_qw K_pv V̇ = K_pw K_qv2 V² + (K_pw K_qv − K_qw K_pv) V + K_qw (P(δ_m, δ, V) − P_0 − P_1) − K_pw (Q(δ_m, δ, V) − Q_0 − Q_1),   (4.3)
where δm,ω,δ and V are the generator voltage phase angle, the rotor speed, the load voltage phase angle and the voltage, respectively. P and Q are the real and reactive powers supplied by the load, and they are defined as
P(δ_m, δ, V) = −E′_0 V Y′_0 sin(δ) + E_m V Y_m sin(δ_m − δ),
Q(δ_m, δ, V) = E′_0 V Y′_0 cos(δ) + E_m V Y_m cos(δ_m − δ) − (Y′_0 + Y_m) V²,
where
E′_0 = E_0 (1 + C² Y_0⁻² − 2 C Y_0⁻¹ cos θ_0)^{1/2},
Y′_0 = Y_0 (1 + C² Y_0⁻² − 2 C Y_0⁻¹ cos θ_0)^{1/2},
θ′_0 = θ_0 + tan⁻¹( C Y_0⁻¹ sin θ_0 / (1 − C Y_0⁻¹ cos θ_0) ).
The values of all other constants were taken from [3,23] and are presented in the caption of Figure 6.
We take Q_1 as the main parameter, since it can induce a catastrophic bifurcation in V at the critical value Q_1 = Q_{1c} = 2.9898256. A bifurcation diagram with respect to Q_1 is shown in Figure 6. For Q_1 slightly greater than Q_{1c}, the power system tends to show transient chaos leading to sudden voltage collapse.
To generate a time series in V that displays a sudden collapse, we slowly change Q_1 across Q_{1c} in the middle of the simulation. Figure 7 shows the resulting time series and the corresponding EWIs in a manner similar to the previous examples. Considering the complexity of the power system, we constructed an ESN based on the four-dimensional vector time series (δ_m, ω, δ, V). Once again, the predictability in Figure 7(b) shows a notable drop from 0.7 to 0.2 between the green and red vertical dotted lines, while the variance and autocorrelation in Figure 7(c) and (d) fail to report any essential changes. This implies that the predictability functions as a more reliable EWI for the electric power system.
The final example is a pair of coupled chaotic oscillators, which undergo a critical transition as a result of synchronization. It is generally of interest to detect the onset of synchronization. The task is challenging in practical situations where only some of the variables are available; it amounts, in essence, to detecting transitions between high-dimensional and lower-dimensional chaos. The Lyapunov exponent is not a good indicator in this case, since its evaluation requires large data sets from both the asynchronous and synchronous regimes. We consider a system composed of coupled Rössler-like chaotic oscillators [8]:
ẋ_{1,2} = −y_{1,2} − z_{1,2} + ϵ(x_{2,1} − x_{1,2}),
ẏ_{1,2} = x_{1,2} + a y_{1,2} + ϵ(y_{2,1} − y_{1,2}),
ż_{1,2} = b + z_{1,2}(x_{1,2} − c) + ϵ(z_{2,1} − z_{1,2}),   (4.4)
where ϵ is the coupling coefficient, a = 0.2, b = 0.2 and c = 5.7. We assume that the only observable is the sum of the two variables, o(t) = x_1(t) + x_2(t), and that the values of the other variables are not available. While generating the time series o(t), we let ϵ drift slowly so that the two oscillators synchronize at some point. While the value of o(t) in Figure 8(a) seems to show no qualitative change, the graph of the difference |x_1(t) − x_2(t)| in Figure 8(b) reveals that complete synchronization between the two oscillators occurs at t = 4280 (marked with the green dotted line).
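The observable o(t) and the difference |x_1 − x_2| can be generated as follows; the drift range of ϵ, the initial conditions and the integration settings are illustrative assumptions, not the paper's exact ones.

```python
import numpy as np

a, b, c = 0.2, 0.2, 5.7

def deriv(s, eps):
    """Right-hand side of Eq (4.4): two Rossler-like oscillators with
    diffusive coupling eps in all components."""
    x1, y1, z1, x2, y2, z2 = s
    return np.array([
        -y1 - z1 + eps * (x2 - x1),
        x1 + a * y1 + eps * (y2 - y1),
        b + z1 * (x1 - c) + eps * (z2 - z1),
        -y2 - z2 + eps * (x1 - x2),
        x2 + a * y2 + eps * (y1 - y2),
        b + z2 * (x2 - c) + eps * (z1 - z2),
    ])

def simulate(T=400.0, dt=0.01, eps0=0.0, eps1=0.2):
    """RK4 integration while eps drifts linearly from eps0 to eps1;
    only o(t) = x1 + x2 and |x1 - x2| are recorded, mimicking the
    partially observed setup."""
    n = int(T / dt)
    s = np.array([1.0, 1.0, 0.0, -1.0, -1.0, 0.0])
    o, diff = np.empty(n), np.empty(n)
    for i in range(n):
        eps = eps0 + (eps1 - eps0) * i / n
        k1 = deriv(s, eps)
        k2 = deriv(s + 0.5 * dt * k1, eps)
        k3 = deriv(s + 0.5 * dt * k2, eps)
        k4 = deriv(s + dt * k3, eps)
        s = s + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        o[i], diff[i] = s[0] + s[3], abs(s[0] - s[3])
    return o, diff

o, diff = simulate()
print(diff[:1000].max() > 1.0, diff[-1] < 0.1)   # asynchronous early, synchronized late
```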
We evaluated the predictability of o(t) to find out whether it reacts in a timely manner to the structural change of the system (synchronization). One can see in Figure 8(c) that the synchronization occurring at t = 4280 readily impacts the predictability. The reduction of the system's complexity raises the predictability from what was effectively zero before. This phenomenon is the opposite of the three previous examples and can be understood by noting that the system escapes from an asynchronous state that is highly chaotic and allows no reliable predictions. It is obvious in Figure 8(d) that the variance is far too crude to capture such a transition.
In this paper, we presented a new approach to detecting critical transitions in time series from complex dynamical systems. Our approach is based on a metric called predictability, which, at each time, indicates how precisely a machine predictor with a fixed capacity can forecast the series up to that moment. Unlike other conventional EWIs that assume an underlying crisis-generating mechanism, the predictability takes a black-box approach. The metric indirectly reveals structural changes of the system in terms of how much predictability we gain or lose at each moment. In other words, we argue that a change in predictability reflects a shift in the system parameters. Hence, it can be interpreted as an early-warning signal during the transient period, until the structural change builds up into a more explicit form in the outcome.
While performing the numerical experiments, we found a subtle difference between two types of systems undergoing critical transitions. In a stochastic system with a simple transition-generating mechanism, the collapse immediately follows as the system parameter crosses a critical point. Complex systems, on the other hand, tend to take some time to react to such a parameter shift. In either case, the changes in the predictability become clear before the collapse occurs, whether or not there is a temporal discrepancy between the critical shift of the parameter and the actual collapse of the system.
Fluctuation of the predictability has been observed around the transitions, especially on the more predictable side of the transition. That is, if the system undergoes a sudden collapse, the predictability tends to fluctuate before the collapse occurs; if the system undergoes sudden synchronization, the fluctuation builds up thereafter. This fluctuation may degrade the reliability of the predictability. However, we observed that the magnitude of the fluctuation depends not only on the nature of the time-series data, but also on the choice of the predictor, including its hyperparameters such as the sliding window size and the tolerance level. We leave this issue for future research.
The framework of predictability has flexibility in its implementation. Any general and model-free methods from statistics and machine learning can be used as a predictor. Some of those, like the ARIMA and ESN used in this work, can even be upgraded and sharpened in their ability to detect critical transitions by taking other EWIs as extra inputs. However, in essence, the crisis is what we cannot predict properly in advance. The predictability helps to capture such aspects of critical transitions and quantify them as a metric for the structural change.
P. Kim was supported by a National Research Foundation of Korea (NRF) grant funded by the South Korean government (MSIT) (No. NRF-2021R1A4A1032924).
There is no conflict of interest associated with any of the authors who contributed their efforts to this manuscript.
[1] C. N. Anderson, C.-h. Hsieh, S. A. Sandin, R. Hewitt, A. Hollowed, J. Beddington, et al., Why fishing magnifies fluctuations in fish abundance, Nature, 452 (2008), 835–839. https://doi.org/10.1038/nature06851
[2] V. Dakos, S. R. Carpenter, W. A. Brock, A. M. Ellison, V. Guttal, A. R. Ives, et al., Methods for detecting early warnings of critical transitions in time series illustrated using simulated ecological data, PLoS ONE, 7 (2012), e41010. https://doi.org/10.1371/journal.pone.0041010
[3] I. Dobson, H.-D. Chiang, Towards a theory of voltage collapse in electric power systems, Syst. Control Lett., 13 (1989), 253–262. https://doi.org/10.1016/0167-6911(89)90072-8
[4] H. Fan, L.-W. Kong, Y.-C. Lai, X. Wang, Anticipating synchronization with machine learning, Phys. Rev. Res., 3 (2021), 023237. https://doi.org/10.1103/PhysRevResearch.3.023237
[5] A. S. Gsell, U. Scharfenberger, D. Özkundakci, A. Walters, L.-A. Hansson, A. B. Janssen, et al., Evaluating early-warning indicators of critical transitions in natural aquatic ecosystems, Proc. Natl. Acad. Sci. USA, 113 (2016), E8089–E8095. https://doi.org/10.1073/pnas.1608242113
[6] A. Haluszczynski, J. Aumeier, J. Herteux, C. Räth, Reducing network size and improving prediction stability of reservoir computing, Chaos, 30 (2020), 063136. https://doi.org/10.1063/5.0006869
[7] A. Haluszczynski, C. Räth, Good and bad predictions: Assessing and improving the replication of chaotic attractors by means of reservoir computing, Chaos, 29 (2019), 103143. https://doi.org/10.1063/1.5118725
[8] L. Huang, Q. Chen, Y.-C. Lai, L. M. Pecora, Generic behavior of master-stability functions in coupled nonlinear dynamical systems, Phys. Rev. E, 80 (2009), 036204. https://doi.org/10.1103/PhysRevE.80.036204
[9] A. R. Ives, Measuring resilience in stochastic systems, Ecol. Monogr., 65 (1995), 217–233. https://doi.org/10.2307/2937138
[10] H. Jaeger, The "echo state" approach to analysing and training recurrent neural networks (with an erratum note), GMD Technical Report 148, German National Research Center for Information Technology, Bonn, Germany, 2001.
[11] H. Jaeger, H. Haas, Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication, Science, 304 (2004), 78–80. https://doi.org/10.1126/science.1091277
[12] B. Kaszás, U. Feudel, T. Tél, Tipping phenomena in typical dynamical systems subjected to parameter drift, Sci. Rep., 9 (2019), 1–12. https://doi.org/10.1038/s41598-019-44863-3
[13] L.-W. Kong, H.-W. Fan, C. Grebogi, Y.-C. Lai, Machine learning prediction of critical transition and system collapse, Phys. Rev. Res., 3 (2021), 013090. https://doi.org/10.1103/PhysRevResearch.3.013090
[14] G. Kou, Y. Xu, Y. Peng, F. Shen, Y. Chen, K. Chang, et al., Bankruptcy prediction for SMEs using transactional data and two-stage multiobjective feature selection, Decis. Support Syst., 140 (2021), 113429. https://doi.org/10.1016/j.dss.2020.113429
[15] S. J. Lade, T. Gross, Early warning signals for critical transitions: a generalized modeling approach, PLoS Comput. Biol., 8 (2012), e1002360. https://doi.org/10.1371/journal.pcbi.1002360
[16] T. M. Lenton, Early warning of climate tipping points, Nat. Clim. Change, 1 (2011), 201–209. https://doi.org/10.1038/nclimate1143
[17] Z. Lu, J. Pathak, B. Hunt, M. Girvan, R. Brockett, E. Ott, Reservoir observers: Model-free inference of unmeasured variables in chaotic systems, Chaos, 27 (2017), 041102. https://doi.org/10.1063/1.4979665
[18] K. McCann, P. Yodzis, Nonlinear dynamics and population disappearances, Am. Nat., 144 (1994), 873–879. https://doi.org/10.1086/285714
[19] I. Noy-Meir, Stability of grazing systems: An application of predator-prey graphs, J. Ecol., 63 (1975), 459–481. https://doi.org/10.2307/2258730
[20] M. Scheffer, J. Bascompte, W. A. Brock, V. Brovkin, S. R. Carpenter, V. Dakos, et al., Early-warning signals for critical transitions, Nature, 461 (2009), 53–59. https://doi.org/10.1038/nature08227
[21] M. Scheffer, S. R. Carpenter, T. M. Lenton, J. Bascompte, W. Brock, V. Dakos, et al., Anticipating critical transitions, Science, 338 (2012), 344–348. https://doi.org/10.1126/science.1225244
[22] Y. Uwate, M. Schule, T. Ott, Y. Nishio, Echo state network with chaos noise for time series prediction, in International Symposium on Nonlinear Theory and its Applications (NOLTA), Okinawa, Japan, 16–19 November 2020, (2020), 274.
[23] H. O. Wang, E. H. Abed, A. M. Hamdan, Bifurcations, chaos, and crises in voltage collapse of a model power system, IEEE Trans. Circuits Syst. I, 41 (1994), 294–302. https://doi.org/10.1109/81.285684