This study aims to apply advanced machine-learning models and hybrid approaches to improve the forecasting accuracy of the US Consumer Price Index (CPI). The study examined the performance of LSTM, MARS, XGBoost, LSTM-MARS, and LSTM-XGBoost models using a large time-series dataset from January 1974 to October 2023. The data were combined with key economic indicators of the US, and the hyperparameters of the forecasting models were optimized using genetic algorithms and Bayesian optimization. According to the VAR model results, variables such as past values of CPI, oil prices (OP), and gross domestic product (GDP) have strong and significant effects on CPI. In particular, the LSTM-XGBoost model provided superior accuracy in CPI forecasts compared with other models and was found to perform best by establishing strong relationships with variables such as the federal funds rate (FFER) and GDP. These results suggest that hybrid approaches can significantly improve economic forecasts and provide valuable insights for policymakers, investors, and market analysts.
Citation: Yunus Emre Gur. Development and application of machine learning models in US consumer price index forecasting: Analysis of a hybrid approach[J]. Data Science in Finance and Economics, 2024, 4(4): 469-513. doi: 10.3934/DSFE.2024020
In the wake of the 2015 Paris Agreement, many countries are reconsidering which policies could drive deep reductions in carbon dioxide (CO2) emissions. Carbon pricing is likely to feature prominently in this re-evaluation. In Japan, the government introduced a carbon tax in 2012 by adding a modest surcharge to current petroleum and coal tax schemes for crude oil, petroleum products, coal, liquefied petroleum gas and liquefied natural gas [1]. The tax revenue will then be recycled to help finance energy-efficient and renewable energy technologies. The tax could also foreseeably motivate industries and consumers to save energy. But the current rate of JPY 289/tCO2 (USD 2.4/tCO2) is too low to induce these effects [2]. This raises an important question: what if tax rates were higher?
This paper examines this hypothetical “what if”. But rather than forecasting the effects of a higher carbon tax, the paper analyses historical fuel prices, subsidies, and investment data. This approach is taken because implementing a higher carbon tax would increase fuel prices and generate revenue that could subsidize investments in production upgrades, efficiency reforms, or technologies running on lower carbon fuels [3,4]. The consequent price gaps between conventional and lower carbon products may further influence consumer behavior [5,6]. Thus the paper will assess whether subsidy provisions and fuel price changes could improve energy and carbon intensity, thereby lowering CO2 emissions, and whether investments could lead to energy efficiency and carbon intensity improvements.
The study will focus on energy-intensive industries. At present, Japan’s industries account for 36% of the total CO2 emissions and 44% of total energy consumption. In the past, Japan succeeded in mitigating industrial-related CO2 emissions with reductions of 70MtCO2 between 1990 and 2014 [7]. However, these aggregate figures shed little light on what kinds of policy instruments affected energy and carbon-intensity in which industries. The paper hence concentrates on the iron & steel, chemical, and machinery industries for two reasons. The first is that they are significant contributors to CO2 emissions. The three selected industries account for 75% of CO2 emissions in Japan’s manufacturing sector (47.9% for iron & steel, 17.7% for chemical, 9.4% for machinery) [7]. They are also the top three contributors to the total domestic product in the manufacturing sector (11% in iron & steel, 16% in chemical and 42% in machinery industry) [8]. The second reason they are selected involves possible differences in their structure and responsiveness to varying policy changes. The paper will therefore also explore whether and to what extent the aforementioned fuel price changes and subsidy provisions had varying effects on investments that mitigate CO2 across different industries.
A time-series autoregressive moving average (ARMA) regression is used for these purposes. The regression results suggest the effects varied across industries. In the iron & steel industry, subsidies and price changes had negligible effects on energy or carbon intensity. This may be because existing iron & steel technologies have long lifetimes and sizable replacement costs [9,10]. It may also be because the industry is dominated by a few large companies that were relatively immune to policy changes 1. In the chemical industry, subsidies and fuel prices gave rise to investments that improved carbon and energy intensity. This may be because the industry has relatively higher operating costs that could be cut given financial incentives: in the chemical industry's cost structure, about 35 per cent of total domestic production (gross output) is spent on purchasing fuel products, including coal, crude petroleum, natural gas, and oil and coal products 2. In the machinery industry, two of three fuel price changes (oil and gas), but not subsidy provisions, yielded improvements in carbon and energy intensity. This may reflect the heterogeneity of products that make up the industry 3 and other factors, such as a generally greater responsiveness to consumer demands and technology trends 4. Overall, the study underlines that policymakers need to be cognizant of the potentially varying effects of a carbon tax across industries. This may require careful reflection on both the tax rate and the recycling of revenue in key industries.
1 Half of the total final energy consumption in the iron & steel industry is by blast furnace manufacturing, which is dominated by 15 entities in Japan. Data are available from METI (http://www.meti.go.jp/statistics/tyo/kougyo/result-2/h26/kakuho/sangyo/index.html and http://www.enecho.meti.go.jp/statistics/total_energy/results.html).
2 Ministry of Internal Affairs and Communications. Available from: http://www.soumu.go.jp/english/dgpp_ss/data/io/io11.htm
3 The machinery industry includes establishments producing electronic components and transportation equipment as well as production machinery. Business entities engaged in the machinery industry account for about 78 per cent of the total establishments in Japan's manufacturing sector. (METI. Available from: http://www.meti.go.jp/statistics/tyo/kougyo/result-2/h26/kakuho/sangyo/index.html)
4 The machinery industry needs to respond to fast-changing consumer product demands and related technological trends; therefore, its energy consumption is likely to move with shifts in product demand [64]. The industry also has comparatively high electricity demands [64] (see also METI. Available from: http://www.enecho.meti.go.jp/statistics/total_energy/results.html)
The remainder of the paper is divided into seven sections. The second section reviews the literature to develop a simple conceptual model illustrating the links between key drivers of CO2 emission reductions. The third section describes the data and model based on the literature review. The fourth and fifth sections report and discuss the results of the models. The sixth section describes the study's limitations. A final section concludes with possible policy implications for a carbon tax in Japan.
Cutting industrial CO2 emissions often involves improving energy or carbon intensity. The former involves lowering energy use per unit of production, while the latter entails lowering CO2 emitted per unit of energy [11,12]. Improving energy intensity requires altering energy consumption patterns or deploying energy-efficient technologies [13,14,15,16,17]. Improving carbon intensity requires shifting to less CO2-intensive energy sources or alternative fuels [12,18,19]. To lower energy use, fuel reductions, exhaust heat recovery and fuel switching can be achieved through investments in improved technologies and equipment as well as operational efficiency gains. For energy efficiency improvements, the transformation of primary energy resources and economic structures can reduce energy demand [20]. Fuel switching, in turn, can curb energy use and improve the CO2 emission intensity of industrial energy use [21,22]. For instance, Kagawa & Inamura (2001) demonstrate that the main reduction in energy demand stems from structural changes to input factors.
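The two intensities combine multiplicatively: emissions can be written as production × energy intensity × carbon intensity, so improving either intensity lowers CO2 proportionally. A minimal numerical sketch with purely hypothetical values:

```python
# Illustrative decomposition with hypothetical numbers: CO2 emissions as
# production x energy intensity x carbon intensity.
production = 100.0        # units of output
energy_intensity = 2.0    # energy used per unit of output (e.g., GJ/unit)
carbon_intensity = 0.05   # CO2 emitted per unit of energy (e.g., tCO2/GJ)

co2 = production * energy_intensity * carbon_intensity
print(co2)  # 10.0 tCO2

# A 10% improvement in each intensity compounds multiplicatively:
co2_improved = production * (energy_intensity * 0.9) * (carbon_intensity * 0.9)
print(round(co2_improved, 6))  # 8.1 tCO2, a 19% reduction
```

This is why the paper treats energy efficiency improvement and fuel switching as complementary levers rather than substitutes.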
In industries, renewable energy can replace conventional fossil fuels in non-utility generation; heat recovery and waste can also be reused for energy [24]. For example, the cement & ceramics and pulp & paper industries could reduce CO2 emissions by altering the fuel mix, using more biomass, improving efficiencies, and recycling energy from heat [25]. As suggested by these options, a combination of energy efficiency improvement and fuel switching is often needed to curb energy use and CO2 emissions.
Japan employed several of the above techniques to reduce its dependencies on imported oil and stabilise energy supplies throughout its modern history. Though initial efforts focused on shifting from oil to coal and improving coal efficiency in the 1980s [26], the approach concentrated more on shifting from coal to natural gas and nuclear by the 1990s into 2000s [27,28]. Following the Fukushima accident, renewable energy became a greater point of emphasis [29].
While the above options are well known, widespread implementation can be challenging. Gillingham et al. (2009) suggest that a key reason is market failure: prices do not reflect the true marginal social cost of energy consumption, curtailing incentives to adopt new technologies or otherwise improve production methods. Another hurdle is that decision makers are not always rational; they do not always minimise the present-value cost of energy services even when a full accounting suggests the benefits outweigh the costs [30].
A carbon tax could be one of the tools to help overcome these barriers. By making fossil fuels more expensive and renewable energy more competitive, carbon pricing can incentivise industries to adopt technical measures such as fuel reduction, fuel switching and efficiency reforms. By putting a price on carbon, it also sends a signal to the market. If the price on carbon is high enough, it may also persuade consumers to reduce energy consumption [31,32,33]. This paper thus sets out to examine the effects of a carbon tax on improving carbon and energy intensity. However, instead of looking at the impact of the carbon tax directly, this paper focuses on three key instruments that could improve carbon and energy intensity: 1) fuel price changes; 2) subsidy provisions; and 3) investments.
One instrument that could improve energy and carbon intensity is subsidy provision. Subsidies, including targeted grants and favourable loans, are a relatively straightforward approach to promoting new technologies or operational efficiency reforms [34,35]. But while subsidies could help open or expand technology markets and encourage a range of supporting reforms, many maintain they are not cost-effective [25,36]. Rather, subsidies often perform better when combined with tax policies [37,38,39]. More concretely, taxes on fuels can incentivise reductions in energy use, while subsidies from the tax revenue promote investments in energy-efficient technologies. For instance, the Netherlands stimulated investments in clean automobiles with subsidies financed by extra tax revenue from energy-intensive cars [37]. In the case of Japan, subsidies for energy conservation and renewable energy measures are provided from the Special Account for measures to improve the energy supply and demand structure of the government budget. The financial sources of the Special Account include tax revenue from the petroleum and coal tax [40,41]. The subsidies include fiscal incentives such as tax breaks as well as grants 5.
5 Tax breaks can improve the payback period by reducing the cost of energy-saving equipment through reduced tax payments [65].
Another possible instrument involves changing energy prices, since price changes incentivise reductions in energy use. Estimating price elasticity has been used to assess regulatory policies and the impacts of energy price shocks [42]. However, studies on the price elasticity of energy demand show that it is much smaller than the elasticities associated with GNP or energy efficiency improvements [20]. Price elasticity also tends to differ across fuel types and time periods [43]. Moreover, there is a time lag between the elasticity observed when prices rise and when they fall, and this cross-temporal variation is influenced by factors such as technological, lifestyle, and demographic changes [44]. A detailed assessment of the price elasticity of energy demand therefore needs to consider each fuel type and each industrial sector individually, since elasticities vary significantly across both [43]. In the case of Japan, studies show that the price elasticity of energy demand is significant only when energy prices reach their highest long-term levels, although there are time lags in the response and the elasticity varies by sector [44].
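Price elasticity in this literature is typically estimated from a log-log specification, where the slope on the logged price is the elasticity. A minimal sketch with synthetic data (the "true" elasticity of −0.3 is an illustrative assumption, not a figure from the studies cited):

```python
import numpy as np

# Sketch: estimating a constant price elasticity of energy demand from a
# log-log regression, ln(demand) = a + b*ln(price) + noise, where b is the
# elasticity. Data are synthetic; the true elasticity is set to -0.3 to
# reflect the weak price response noted in the literature.
rng = np.random.default_rng(0)
log_price = np.log(rng.uniform(50, 150, size=200))
log_demand = 5.0 - 0.3 * log_price + rng.normal(0, 0.02, size=200)

# OLS on the logged variables recovers the elasticity as the slope.
X = np.column_stack([np.ones_like(log_price), log_price])
coef, *_ = np.linalg.lstsq(X, log_demand, rcond=None)
elasticity = coef[1]
print(round(elasticity, 2))  # close to -0.3
```

In practice, as the paragraph above notes, such a regression would be run separately by fuel type and industrial sector, since elasticities differ across both.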
Shifting investment from high to low energy intensity technologies or facilities is another approach to reducing CO2 emissions. Although Japan has improved industrial energy efficiency, a shift to more energy-efficient technologies is required to reduce CO2 emissions further. Technology innovation and installation that stimulate fuel switching, heat energy recycling and energy efficiency improvements can catalyse these reductions. Low-carbon research on Japan suggests that major investments in technologies offer cost-effective opportunities to improve energy efficiency as well as enhance international competitiveness [45]. Thus, stimulating investment is one of the factors that can promote CO2 emission reductions in manufacturing industries. To facilitate technology investment, application and spill-over, policy and regulation play important roles by generating a market and securing investment in low-carbon technologies [46,47].
Investment in improving energy efficiency and replacing carbon-intensive technologies can often be enhanced by subsidies. According to the IPCC (2007) 6, subsidies are frequently used to stimulate investment in energy-saving measures by reducing investment costs. Kimura & Ofuji (2013) analyse the effectiveness of subsidy programs for energy efficiency investment in Japan. Their results suggest that, although about half of the company questionnaire responses indicated that energy efficiency investments would have been implemented even without a subsidy, the subsidy programs were mostly cost-effective when compared with the avoided costs and carbon emission prices.
6 In IPCC (2007), subsidies to the industrial sector include favourable loans and fiscal incentives, such as reduced taxes on energy-efficient equipment, accelerated depreciation, tax credits and tax deductions, as well as grants.
From here, the paper employs econometric analysis to assess which types of instruments (subsidies, fuel prices and investments) reduce carbon and energy intensity in which sectors. The purpose of the modelling, as illustrated in Figure 1, is to understand:
a. the impacts of subsidies on carbon and energy intensity;
b. the impacts of fuel price changes on carbon and energy intensity; and
c. the impacts of investment on carbon and energy intensity.
This paper also reflects on possible differences across industries. It hypothesizes that in industries with high upfront capital costs, such as iron & steel [50], policy instruments (subsidies) and price incentives (fuel prices) will not significantly reduce energy and carbon intensity. On the other hand, in industries with higher operational costs, such as the chemical industry [51], increasing subsidies for energy efficiency and renewable energy, together with higher fuel prices, can create stronger incentives to transition to more efficient facilities and could thus have a more discernible impact on carbon and energy intensity. In industries that require high levels of electricity consumption, such as the machinery industry 7, carbon and energy intensity could be affected by external factors such as the energy mix of electricity companies, as well as by the energy conservation and emission reduction measures implemented by the industry itself.
7 Approximately 69 per cent of total energy consumption is from electricity. METI. Available from: http://www.enecho.meti.go.jp/statistics/total_energy/results.html.
A time-series analysis is used to assess the effects of these three policy instruments on carbon and energy intensity. Carbon intensity and energy intensity data from 1994 to 2013 are gathered for this analysis. A series of diagnostic tests is initially performed to determine: 1) whether serial correlations exist in the data; and 2) whether subsidy, fuel price and investment variables have statistically and substantively significant effects on carbon and energy intensity in the three industrial sectors of interest.
Temporal trends and complex error processes may affect time-series data. There are numerous ways to analyze time-series data in which a variable is observed sequentially over time [52]. An ARMA model is one such approach: a time-series model that identifies the data-generating process underlying the data (carbon and energy intensity in this paper). As a simple, univariate model, ARMA is particularly useful when data requirements are limited to the series of interest [53] 8. By contrast, autoregressive distributed lag (ADL) models capture structural relationships in historical data between the dependent variable and different exogenous variables [54] 9, while vector autoregression (VAR) models treat variables as endogenous, allowing them to affect one another, and can also be used for multiplier analysis with exogenous variables [55]. Although ARMA and ADL models are similar in that both can include exogenous variables, this paper uses ARMA with an exogenous variable because it is a relatively intuitive model that can illustrate the effect of an explanatory variable on a dependent variable [53,54].
8 As examples of how these models are used, ARMA models have been applied to assess the impact of exogenous shocks on electricity demand [66], ADL models to analyze whether energy demand responds differently (asymmetrically) to price increases and decreases [20,67], and VAR models to assess the relationship between economic growth and energy consumption [68,69,70].
9 In an ADL model, the current value of the dependent variable is regressed on its own lagged values and on the present and lagged values of one or more explanatory variables.
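The ARMA-with-exogenous-variable setup can be sketched as a data-generating process; the parameter values below are illustrative, not estimates from the paper:

```python
import numpy as np

# Sketch of the data-generating process behind an ARMA(0, 1) model with one
# exogenous regressor, i.e. the specification this paper settles on:
#   y_t = c + beta * x_t + e_t + theta * e_{t-1}
# All parameter values are illustrative assumptions.
rng = np.random.default_rng(1)
n = 500
c, beta, theta = 0.8, -0.04, 0.65   # intercept, exogenous effect, MA(1) term
x = rng.normal(size=n)              # exogenous variable (e.g., logged subsidies)
e = rng.normal(scale=0.1, size=n)   # white-noise shocks

y = np.empty(n)
y[0] = c + beta * x[0] + e[0]
for t in range(1, n):
    # the MA(1) term carries last period's shock into the current period
    y[t] = c + beta * x[t] + e[t] + theta * e[t - 1]
print(round(float(y.mean()), 2))  # close to the intercept, 0.8
```

The exogenous coefficient beta plays the role the paper assigns to subsidies, fuel prices or investment, while the MA(1) term captures the serial dependence in the residuals.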
In the case of industrial energy intensity and carbon intensity, the unique properties of the industrial sector may affect the structure of the data. Therefore, to model these annually observed time series appropriately, the carbon and energy intensity series are disaggregated by sector. To identify the appropriate model for the carbon and energy intensity data in the iron & steel, chemical and machinery industries, the steps below are followed (see Figure 2 for an illustration):
(1) A diagnostic test is performed to check whether a simple static ordinary least squares (OLS) regression is appropriate for analyzing the data. A Durbin-Watson or Breusch-Godfrey test is used to test for autocorrelation; both assess the null hypothesis that there is no autocorrelation.
(2) If there is serial correlation, a set of diagnostics using the autocorrelation function (ACF) and partial autocorrelation function (PACF) is run to determine how current values in the series correlate with previous values and to identify appropriate lag variables. This also helps establish whether the carbon and energy intensity data exhibit trend or seasonal variations that would necessitate using an ARMA model.
(3) The Ljung-Box Q-test is then deployed to diagnose whether the Box-Jenkins modeling process sufficiently filters the data. The test assesses the presence of autocorrelation in a quantitative manner: it tests for autocorrelation at multiple lags by measuring the accumulated autocorrelation across those lags [52].
(4) The final step involves using the properly specified model to test which explanatory variables have a significant effect on carbon and energy intensity.
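As a rough sketch of steps (1) and (3), the Durbin-Watson and Ljung-Box statistics can be computed directly; in practice R or another statistical package provides these tests. The residuals below are synthetic white noise, so neither statistic should flag autocorrelation:

```python
import numpy as np

# Hand-rolled versions of two diagnostics from the workflow above, shown
# as a sketch on synthetic white-noise residuals.
rng = np.random.default_rng(2)
resid = rng.normal(size=200)

# Step (1): Durbin-Watson statistic; values near 2 indicate no
# first-order autocorrelation.
dw = np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

# Step (3): Ljung-Box Q statistic over the first m lags, accumulating the
# squared sample autocorrelations; compared against a chi-squared(m) value.
n, m = len(resid), 10
d = resid - resid.mean()
denom = np.sum(d ** 2)
rho = np.array([np.sum(d[k:] * d[:-k]) / denom for k in range(1, m + 1)])
q = n * (n + 2) * np.sum(rho ** 2 / (n - np.arange(1, m + 1)))
print(round(float(dw), 2), round(float(q), 2))  # dw should be near 2
```

For serially correlated residuals, dw would drift away from 2 and q would exceed the chi-squared critical value, signalling that the model has not yet filtered the data sufficiently.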
The data and variables used in the analysis are as follows. The dependent variables are logged carbon intensity and logged energy intensity. Logged subsidies, logged fuel prices and logged investments are the explanatory variables (for the equation of ARMA model, please see Appendix 1).
Subsidy data are extracted from the breakdown of expenditure of subsidies for energy supply and demand measures (the Special Accounts for Sophisticated Structure of Energy Supply and Demand budget) 10, which includes subsidies for energy efficiency and renewable energy technologies. The subsidy data are applied to the time-series regression model.
10 Budget data is available from Ministry of Finance. Available from: https://www.mof.go.jp/budget/budger_workflow/budget/index.html
For fuel price data, the price index of fuels (oil, coal and natural gas) published by the Institute of Electrical Engineers of Japan is used [56]. The analysis uses the prices of oil, coal and liquefied natural gas as fuel variables. Depending on the type of fuel, the impact of changes in fuel prices differs across industries. Therefore, the paper tests whether three fuel types fit the model: oil, coal and natural gas. To estimate the impacts of investment, capital investment flow data (installation basis) are extracted from the National Accounts of Japan database published by the Cabinet Office 11.
11 GDP data: Gross Capital Stock of Private Enterprises data is available from the Cabinet Office, Government of Japan. Available from: http://www.esri.cao.go.jp/jp/sna/menu.html
The investment data is published by Ministry of Economy, Trade and Industry (METI) and includes total investment amount for newly installed facilities and equipment of industrial sectors. By using capital investment data of each industry and testing the variables for carbon and energy intensity, it is possible to determine whether new capital investment (installation base) improves carbon and energy intensity.
Data availability affects the time period for the analysis. Historical data on subsidies is available from 1990 to 2013, while investment flow data are available from 1994 to 2013. Thus, time-series regression analysis is applied for the 1994 to 2013 period.
The time-series regression analysis of carbon and energy intensity suggests that, for energy intensity, the non-seasonal ARMA (0, 1) model best fits the data (see Appendix 1 for a review of the equation and Appendix 2 for the model identification). An ARMA (0, 1) specification means that the current value of the series depends on the previous period's residual shock through a first-order moving-average term, with no autoregressive term. The paper uses an ARMA (0, 1) model for both carbon and energy intensity in each industry.
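Why identification lands on a moving-average term can be illustrated with a simulated series: a pure MA(1) process has a sample ACF that is sizeable at lag 1 and near zero afterwards, which is the ACF/PACF signature examined in the identification step. The parameter value below is illustrative:

```python
import numpy as np

# A simulated pure MA(1) series: its sample autocorrelation is sizeable at
# lag 1 (theoretically theta / (1 + theta**2)) and near zero at higher
# lags, pointing identification toward a moving-average term.
rng = np.random.default_rng(3)
n, theta = 2000, 0.7
e = rng.normal(size=n + 1)
y = e[1:] + theta * e[:-1]          # y_t = e_t + theta * e_{t-1}

d = y - y.mean()
denom = np.sum(d ** 2)
acf = [float(np.sum(d[k:] * d[:-k]) / denom) for k in range(1, 4)]
print([round(a, 2) for a in acf])
# lag-1 near 0.7 / (1 + 0.49) ~ 0.47; lags 2 and 3 near zero
```

An AR(1) process, by contrast, would show an ACF that decays geometrically across many lags rather than cutting off after lag 1.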
As one of the factors influencing carbon intensity and energy intensity, the impact of subsidies is assessed using the ARMA (0, 1) model. The results indicate that subsidies for measures to enhance the energy supply-demand structure (including energy efficiency, renewable energy, and CO2 mitigation) improve carbon intensity in the chemical industry (Table 1). Since the z-ratio exceeds |2.33|, the effect is statistically significant at the 0.01 level; the probability of observing such an effect by chance alone is below 1%. However, the magnitude of the impact is small, with an estimated effect of −0.0043. In the iron & steel and machinery industries, the estimates are not statistically discernible from zero, with z-scores below the standard |1.64| threshold.
| | Iron & steel | | | Chemical | | | Machinery | | |
|---|---|---|---|---|---|---|---|---|---|
| | Estimate | Std. error | z-ratio | Estimate | Std. error | z-ratio | Estimate | Std. error | z-ratio |
| Intercept | 0.838 | 0.005 | 40.267*** | 0.877 | 0.036 | 24.594*** | 0.700 | 0.077 | 9.101*** |
| SUB | 0.0000 | 0.0003 | 1.303 | −0.004 | 0.002 | −2.424** | 0.005 | 0.003 | 1.545 |
| MA(1) | 0.667 | 0.155 | 4.790*** | 0.183 | 0.539 | 0.340 | 0.767 | 0.147 | 5.207*** |

Iron & steel: residual variance = 0.0000009; AIC = −215.99; log-likelihood = 110.99; Ljung-Box test for autocorrelation = 12.068 (p = 0.674). Chemical: residual variance = 0.0000021; AIC = −198.26; log-likelihood = 102.13; Ljung-Box test = 12.459 (p = 0.644). Machinery: residual variance = 0.0000113; AIC = −164.13; log-likelihood = 85.06; Ljung-Box test = 13.210 (p = 0.586).

SUB: logged subsidies. MA(1): first-order moving-average term. *: significant at the 0.05 alpha level; **: at the 0.01 level; ***: at the 0.001 level. Notes: Estimates computed in R 3.0.0.
The results also indicate that subsidies affect energy intensity in the chemical industry (Table 2). The coefficient estimates suggest that this effect is not only statistically significant but also larger in absolute terms than the effect on carbon intensity (Table 1). A one-unit increase in logged subsidies is associated with an estimated 4.3% improvement in energy intensity (coefficient of −0.043). In the iron & steel and machinery industries, on the other hand, the results are similar to those for carbon intensity in that increased subsidies have no significant impact on energy intensity.
| | Iron & steel | | | Chemical | | | Machinery | | |
|---|---|---|---|---|---|---|---|---|---|
| | Estimate | Std. error | z-ratio | Estimate | Std. error | z-ratio | Estimate | Std. error | z-ratio |
| Intercept | 1.924 | 0.798 | 2.412** | 4.363 | 0.237 | 18.425*** | 3.854 | 1.248 | 3.087*** |
| SUB | 0.046 | 0.031 | 1.507 | −0.043 | 0.009 | −4.764** | −0.039 | 0.048 | −0.814 |
| MA(1) | 0.852 | 0.238 | 3.580*** | 0.361 | 0.198 | 1.823* | 0.703 | 0.222 | 3.165*** |

Iron & steel: residual variance = 0.0000008; AIC = −77.58; log-likelihood = 41.79; Ljung-Box test for autocorrelation = 12.7516 (p = 0.6215). Chemical: residual variance = 0.00013; AIC = −116.24; log-likelihood = 61.12; Ljung-Box test = 14.7107 (p = 0.4724). Machinery: residual variance = 0.0022; AIC = −58.93; log-likelihood = 32.47; Ljung-Box test = 11.9376 (p = 0.6837).

SUB: logged subsidies. MA(1): first-order moving-average term. *: significant at the 0.05 alpha level; **: at the 0.01 level; ***: at the 0.001 level. Notes: Estimates computed in R 3.0.0.
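As a hedged reading of coefficients such as the chemical industry's subsidy estimate (−0.043): when the dependent variable is logged, a coefficient b translates into an approximate percentage change of b × 100 per one-unit increase in the regressor, with the exact figure given by exp(b) − 1:

```python
import math

# Worked interpretation of a coefficient on a logged dependent variable,
# using the chemical industry's subsidy estimate of -0.043 as the example.
b = -0.043
approx_pct = b * 100                   # quick reading: about -4.3%
exact_pct = (math.exp(b) - 1) * 100    # exact: about -4.21%
print(round(approx_pct, 1), round(exact_pct, 2))  # -4.3 -4.21
```

For coefficients this small the approximation and the exact figure nearly coincide, which is why the text's 4.3% reading is a reasonable shorthand.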
The results of the fuel price models indicate a price elasticity of carbon intensity in the chemical industry. Although the oil price has a statistically significant effect on carbon intensity, the expected improvement is very small, with an estimated coefficient of −0.0016 on a logged scale (Table 3). Coal and gas prices both demonstrate a statistically significant effect on improving carbon intensity (estimated coefficients of −0.0023); that is, carbon intensity improves by an estimated 0.23% with each unit increase in the logged fuel price. In the iron & steel industry, by contrast, the fuel price elasticity of carbon intensity is not statistically significant. In the machinery industry, significance depends upon the fuel type: oil and gas prices are statistically significant, but their impacts have a counter-intuitive worsening effect on carbon intensity, with estimated coefficients of +0.0042 and +0.0052, respectively. The coal price shows no significant effect in the machinery industry.
Iron & steel | Chemical | Machinery | ||||||||||
Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | | | |
OIL | ||||||||||||
Intercept | 0.831 | 0.005 | 168.171 | *** | 0.780 | 0.009 | 89.230 | *** | 0.776 | 0.017 | 44.689 | *** |
OIL | 0.0007 | 0.0005 | 1.305 | −0.0016 | 0.0009 | −1.774 | * | 0.0042 | 0.0017 | 2.457 | ** | |
MA(1) | 0.662 | 0.147 | 4.504 | *** | 0.603 | 0.203 | 2.964 | ** | 0.637 | 0.135 | 4.719 | *** |
Residual variance = 0.0000008; AIC = −217.92; log-likelihood = 111.96; Ljung-Box test for autocorrelation = 12.068 (p = 0.6351). | Residual variance = 0.000003; AIC = −193.88; log-likelihood = 99.94; Ljung-Box test for autocorrelation = 9.2392 (p = 0.8647). | Residual variance = 0.00001; AIC = −167.03; log-likelihood = 86.51; Ljung-Box test for autocorrelation = 19.1849 (p = 0.2055). | ||||||||||
COAL | ||||||||||||
Intercept | 0.840 | 0.006 | 135.345 | *** | 0.784 | 0.012 | 67.351 | *** | 0.797 | 0.024 | 128.842 | *** |
COAL | −0.0002 | 0.0007 | −0.294 | −0.0023 | 0.0013 | −1.762 | * | 0.0024 | 0.0027 | 0.885 | ||
MA(1) | 0.676 | 0.155 | 4.358 | *** | 0.738 | 0.260 | 2.834 | *** | 0.645 | 0.141 | 4.101 | *** |
Residual variance = 0.0000009; AIC = −216.07; log-likelihood = 111.04; Ljung-Box test for autocorrelation = 13.780 (p = 0.5422). | Residual variance = 0.000003; AIC = −194.14; log-likelihood = 100.07; Ljung-Box test for autocorrelation = 7.862 (p = 0.9292). | Residual variance = 0.00001; AIC = −162.58; log-likelihood = 84.29; Ljung-Box test for autocorrelation = 14.609 (p = 0.4799). | ||||||||||
GAS | ||||||||||||
Intercept | 0.830 | 0.006 | 128.841 | *** | 0.788 | 0.011 | 69.819 | *** | 0.764 | 0.023 | 33.673 | *** |
GAS | 0.0007 | 0.0006 | 1.166 | −0.0023 | 0.0011 | −2.111 | * | 0.0052 | 0.0022 | 2.409 | ** | |
MA(1) | 0.637 | 0.155 | 4.101 | *** | 0.643 | 0.224 | 2.872 | ** | 0.596 | 0.141 | 4.234 | *** |
Residual variance = 0.0000008; AIC = −217.38; log-likelihood = 111.69; Ljung-Box test for autocorrelation = 13.419 (p = 0.5699). | Residual variance = 0.000003; AIC = −195.01; log-likelihood = 100.51; Ljung-Box test for autocorrelation = 9.448 (p = 0.853). | Residual variance = 0.00001; AIC = −166.64; log-likelihood = 86.32; Ljung-Box test for autocorrelation = 20.65 (p = 0.1484). | ||||||||||
OIL: logged oil price; COAL: logged coal price; GAS: logged gas price. MA(1): first-order moving average. *: significant at the 0.05 alpha level; **: at the 0.01 level; ***: at the 0.001 level. Notes: estimates computed in R 3.0.0. |
Price elasticity of energy demand often depends on the sector and fuel type, so it is reasonable to assume that the price elasticity of energy intensity also varies with sector and fuel type. Oil, coal, and gas prices in the chemical industry all register significant effects on improving energy intensity, with coefficients of −0.029, −0.027, and −0.035, respectively (Table 4). The impact of increases in oil and gas prices in the machinery industry is statistically significant and larger in absolute terms than in the chemical industry, with coefficients of −0.081 and −0.106. In the machinery industry, the share of oil and oil products in total energy use fell from 20% in 1990 to 10% in 2013 while oil prices increased over the same period. Fuel price elasticities of energy intensity do not appear significant in the iron & steel industry, even though coal and coal products account for 70.1% of its total energy use (as of 2013).
Iron & steel | Chemical | Machinery | ||||||||||
Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | | | |
OIL | ||||||||||||
Intercept | 3.242 | 0.173 | 18.718 | *** | 3.530 | 0.044 | 80.855 | *** | 3.662 | 0.206 | 17.769 | *** |
OIL | −0.012 | 0.017 | −0.676 | −0.029 | 0.004 | −6.776 | *** | −0.081 | 0.020 | −4.012 | *** | |
MA(1) | 0.659 | 0.195 | 3.376 | *** | 0.353 | 0.246 | 1.435 | * | 0.726 | 0.185 | 3.925 | *** |
Residual variance = 0.001; AIC = −75.66; log-likelihood = 40.83; Ljung-Box test for autocorrelation = 13.5178 (p = 0.5624). | Residual variance = 0.00009; AIC = −124.27; log-likelihood = 65.13; Ljung-Box test for autocorrelation = 24.1075 (p = 0.06329). | Residual variance = 0.0013; AIC = −69.95; log-likelihood = 37.98; Ljung-Box test for autocorrelation = 13.4167 (p = 0.5701). | ||||||||||
COAL | ||||||||||||
Intercept | 3.068 | 0.212 | 14.494 | *** | 3.474 | 0.089 | 39.118 | *** | 3.712 | 0.246 | 14.122 | *** |
COAL | 0.007 | 0.024 | 0.274 | −0.027 | 0.010 | −2.694 | ** | −0.098 | 0.028 | −0.973 | ||
MA(1) | 0.635 | 0.212 | 2.990 | ** | 1.000 | 1.147 | 0.872 | 1.000 | 0.203 | 4.921 | *** | |
Residual variance = 0.0001; AIC = −75.28; log-likelihood = 40.64; Ljung-Box test for autocorrelation = 14.2754 (p = 0.5048). | Residual variance = 0.00017; AIC = −107.94; log-likelihood = 56.97; Ljung-Box test for autocorrelation = 24.5917 (p = 0.0557). | Residual variance = 0.0013; AIC = −67.19; log-likelihood = 36.59; Ljung-Box test for autocorrelation = 15.6845 (p = 0.4033). | ||||||||||
GAS | ||||||||||||
Intercept | 3.209 | 0.233 | 13.748 | *** | 3.596 | 0.069 | 51.954 | *** | 3.946 | 0.288 | 13.684 | *** |
GAS | −0.008 | 0.022 | −0.360 | −0.035 | 0.007 | −5.226 | *** | −0.106 | 0.028 | −3.851 | *** | |
MA(1) | 0.669 | 0.203 | 3.300 | *** | 0.451 | 0.234 | 1.929 | * | 0.876 | 0.180 | 4.868 | *** |
Residual variance = 0.00097; AIC = −75.33; log-likelihood = 40.67; Ljung-Box test for autocorrelation = 14.0869 (p = 0.5189). | Residual variance = 0.00011; AIC = −118.62; log-likelihood = 62.3; Ljung-Box test for autocorrelation = 24.98842 (p = 0.0501). | Residual variance = 0.0012; AIC = −70.16; log-likelihood = 38.08; Ljung-Box test for autocorrelation = 10.3797 (p = 0.7952). | ||||||||||
OIL: logged oil price; COAL: logged coal price; GAS: logged gas price. MA(1): first-order moving average. *: significant at the 0.05 alpha level; **: at the 0.01 level; ***: at the 0.001 level. Notes: estimates computed in R 3.0.0. |
As for investment, using the logged investment amount in each industry makes it possible to assess whether investments in new plants and equipment are devoted to improving energy efficiency or to increasing alternative energy use (including renewable energies), both of which contribute to improving carbon and energy intensity. The results indicate that increasing investment has a significant impact on carbon and energy intensity in the chemical and machinery industries (Table 5). A one-unit increase in investment improves carbon intensity by 0.28% in the chemical industry and by 4.8% in the machinery industry.
Iron & steel | Chemical | Machinery | ||||||||||
INV | INV | INV | ||||||||||
Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | ||||
Intercept | 0.496 | 2.795 | 0.177 | 8.193 | 1.281 | 6.393 | *** | 11.813 | 2.624 | 4.502 | *** | |
INV | 0.153 | 0.163 | 0.941 | −0.286 | 0.074 | −3.870 | *** | −0.482 | 0.141 | −3.421 | *** | |
MA(1) | 0.651 | 0.209 | 3.124 | *** | 0.765 | 0.313 | 2.447 | ** | 0.707 | 0.180 | 3.923 | *** |
Residual variance = 0.00094; AIC = −76.07; log-likelihood = 41.04; Ljung-Box test for autocorrelation = 14.5235 (p = 0.4863). | Residual variance = 0.00012; AIC = −116.8; log-likelihood = 61.4; Ljung-Box test for autocorrelation = 20.7272 (p = 0.1458). | Residual variance = 0.00146; AIC = −67.15; log-likelihood = 36.57; Ljung-Box test for autocorrelation = 15.5819 (p = 0.4104). | ||||||||||
INV: logged investment (installed base) amount by sector. MA(1): first-order moving average. *: significant at the 0.05 alpha level; **: at the 0.01 level; ***: at the 0.001 level. Notes: estimates computed in R 3.0.0. |
The results for the impact of investment on energy intensity suggest that in the chemical industry the investment variable is statistically significant, with a coefficient of −0.027 (Table 6). In the other industries, however, the impacts of investment on energy intensity are not discernibly different from zero.
Iron & steel | Chemical | Machinery | ||||||||||
INV | INV | INV | ||||||||||
Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | ||||
Intercept | 0.9368 | 0.0822 | 11.401 | *** | 1.2323 | 0.1168 | 10.548 | *** | 0.4952 | 0.2315 | 7.328 | *** |
INV | −0.0057 | 0.0048 | −1.206 | | −0.027 | 0.0067 | −4.002 | *** | 0.0174 | 0.0124 | −0.638 | ||
MA(1) | 0.6475 | 0.1781 | 3.635 | *** | 0.4418 | 0.31 | 1.425 | * | 0.6775 | 0.134 | 5.049 | *** |
Residual variance = 0.0000008; AIC = −217.38; log-likelihood = 111.69; Ljung-Box test for autocorrelation = 10.5245 (p = 0.7855). | Residual variance = 0.0000018; AIC = −201.31; log-likelihood = 103.66; Ljung-Box test for autocorrelation = 10.2885 (p = 0.8012). | Residual variance = 0.000013; AIC = −162.58; log-likelihood = 84.29; Ljung-Box test for autocorrelation = 19.3235 (p = 0.1994). | ||||||||||
INV: logged investment (installed base) amount by sector. MA(1): first-order moving average. *: significant at the 0.05 alpha level; **: at the 0.01 level; ***: at the 0.001 level. Notes: estimates computed in R 3.0.0. |
The individual results from the modelling are illuminating, but it may be difficult to see the bigger picture without an overall assessment, which Table 7 provides. The table suggests that while in the chemical industry increases in subsidies, fuel prices, and investment significantly improve both carbon and energy intensity, the same set of variables has no significant effects on carbon and energy intensity in the iron & steel industry. In the machinery industry, the result depends on the variable (Table 7): while none of the variables improve carbon intensity, increases in oil price, gas price, and investment significantly improve energy intensity.
Subsidies | Fuel price | Investment | ||||
Oil price | Coal price | Gas price | ||||
Carbon intensity | Iron & steel | n.s | n.s | n.s | n.s | n.s |
Chemical | * | * | * | * | * | |
Machinery | n.s | - | n.s | - | n.s | |
Energy intensity | Iron & steel | n.s | n.s | n.s | n.s | n.s |
Chemical | * | * | * | * | * | |
Machinery | n.s | * | n.s | * | * | |
* = significant to improve carbon/energy intensity (the variable negatively impacts the intensities); − = significant to worsen carbon/energy intensity (the variable positively impacts the intensities); n.s = not significant |
This paper conducts a time-series regression analysis to assess the factors influencing CO2 emissions by estimating the degree of impact of three variables (subsidies, fuel prices, and investment) on carbon and energy intensity. The results indicate that the impacts depend on the industry.
In the case of the iron & steel industry, none of the variables have significant effects. A possible reason is that the iron & steel industry requires high upfront costs [50]. The physical life of iron & steel facilities such as blast furnaces is long, and maintenance and repair fees are also expensive. Thus, the fixed costs, including initial investment and expenses for maintenance and repair, may make it difficult to transition to more efficient technologies or lower-carbon production methods [57,58,59,60]. In addition, according to Keidanren (2014), high value-added products and environmental regulations introduced after 1990 actually increased industrial energy consumption. For example, additional energy was required to implement environmental measures such as recycling or preventing air and water pollution [61].
In the chemical industry, subsidies, fuel prices, and investment all improved carbon and energy intensity. This result makes sense in that the chemical industry invested approximately 554.5 billion yen from 1997 to 2012 in global warming countermeasures [61]. Around 10–25 billion JPY was invested every year from 2006 to 2011 to improve the efficiency of facilities and equipment, including cogeneration. Expenditures on research and development, as an R&D-oriented industry, are also larger than in other industries [62]. Another consideration is that the chemical industry has relatively high operating costs [51], making it beneficial to improve carbon and energy intensity.
In the machinery industry, although an increase in subsidies for energy supply and demand (including energy efficiency and renewable energy) could presumably reduce carbon and energy intensity, these effects are not found in the data. Nor does the analysis reveal that investment significantly improves carbon intensity. Rather, only oil and gas prices have the predicted effects on improving energy intensity.
There are two notable limitations to the paper: 1) modelling and data; and 2) additional drivers of carbon and energy intensity.
In terms of the first set of limitations, although the time-series model is useful for assessing the impact of explanatory variables, the 1994–2013 period for the carbon and energy intensity data is relatively short (with only 9 lags available). A longer time period would arguably generate more robust results. In addition, this paper uses only one explanatory variable at a time to assess and compare the impact of each variable; it does not assess combinations of variables, which might offer revealing insights into the additive effects of different instruments. As for the data, although subsidy data from the Special Account are used, these figures include subsidies to promote energy efficiency and renewable energy in the power, building, and transportation sectors, not only in industry. Thus, they do not always represent subsidies for the manufacturing sector.
In terms of the second set of limitations, factors other than fuel prices, subsidies, and investment may affect carbon and energy intensity. For instance, an improvement in the CO2 emission intensity of electricity may contribute to improvements in carbon intensity, though this depends on the share of electricity in total energy use in each industry. The CO2 emission intensity of electricity increased from 0.35 kgCO2/kWh in 2010 to 0.487 kgCO2/kWh in 2012 following the Fukushima accident [63]. This is mainly due to increased demand for natural gas in the electricity sector as an alternative to nuclear power. In line with this increased demand, gas prices rose in Japan from 2011, and the replacement of nuclear power generation with thermal power generation, including gas-fired generation, increased the emission intensity of electricity. In consequence, carbon intensity increased, especially in the machinery sector after Fukushima. The effect could be particularly large in the machinery sector, where electricity accounted for a high proportion of total energy consumption, between 58% and 74%, from 1990 to 2013 12. Therefore, increasing CO2 emissions from the electricity sector greatly affect carbon intensity. On the other hand, in the chemical industry, the rise in electricity emission intensity in 2011 did not have a significant impact on carbon intensity; this could be because electricity accounts for only 7% of the industry's total energy consumption. In the time-series analysis, while the CO2 emission intensity of electricity apparently worsens carbon intensity in the iron & steel and machinery industries (with coefficients of +0.014 and +0.081, the latter significant at the 0.1% level), the same variable is not significant in the chemical industry.
12 METI. Available from: http://www.enecho.meti.go.jp/statistics/total_energy/results.html.
From the findings in section 5, the conclusion returns to the initial discussion of the impact of a carbon tax. Those findings suggest that although the fuel price elasticities of carbon intensity depend on fuel types and industry sectors, it is possible to increase the price elasticity of carbon and energy intensity, especially in the chemical and machinery industries, if fuel prices become higher. If a carbon tax higher than the current JPY289/tCO2 were introduced, the impact on carbon and energy intensity could be greater and more significant. In the case of the chemical industry, subsidies appear to reduce carbon intensity; the revenue from a carbon tax could partly be used for subsidies for renewable energy, energy efficiency, or CO2 emission countermeasures. In terms of policy, this paper indicates that the deployment of subsidies could improve carbon intensity and lead to reductions in CO2 emissions in key sectors. But the results underline the need for careful analysis of the nature and extent to which subsidies drive change within a sector. A closer analysis may help illustrate the conditions under which subsidies promote improvements in carbon and energy intensity. It could also shed light on when the continuity of subsidies discourages business sectors from pursuing energy efficiency measures and CO2 emission reductions, perhaps creating barriers to the introduction of new policies such as carbon emission trading. In addition, as an alternative to other sources of tax revenue, a carbon tax could be used to reduce social security contributions for companies or individuals while promoting the reduction of fossil fuel use.
The time-series analysis shows a clear trend toward improvements in carbon intensity in the chemical and machinery industries and in energy intensity in the chemical industry. Thus, raising fuel prices and funding subsidies through a carbon tax could increase the contribution of investment to improving carbon and energy intensity in these industries. For the iron & steel industry, on the other hand, investment in R&D for technologies that use less energy and emit less, and/or a technology shift such as from blast furnaces to electric furnaces, would be required.
For further reduction of CO2 emissions, a less carbon-intensive fuel mix for electricity and a fuel shift in each industrial sector are also required, as well as a reduction in total energy use. The results further imply that a combination and harmonisation of various policies could be more effective than a single policy in improving carbon and energy intensity. A carbon tax could help harmonise various instruments, including energy price changes and investment. Policies to promote investment in less carbon-intensive fuels and renewable energy, as well as financial support for the replacement or installation of new facilities and equipment, may also follow from such a combination of policies and economic incentives. However, as with the analysis offered here, more detailed industry-specific policy analysis is required to understand these effects.
All authors declare no conflicts of interest in this paper.
Appendix 1. Equation of ARMA model
The equation of ARMA(p, q)(P, Q)h is:

(1 − φ1B − … − φpB^p)(1 − Φ1B^h − … − ΦPB^(Ph)) yt = (1 + θ1B + … + θqB^q)(1 + Θ1B^h + … + ΘQB^(Qh)) εt

where yt is the carbon or energy intensity in year t and εt is white noise with E[εt] = 0 and Var(εt) = σ² in year t. B is the backward shift operator (a linear operator), a useful notational device when working with time-series lags. φ is the autoregressive (AR) parameter: when the current value of the series is correlated only with the value p years before, the order is p, and φ can be expressed with the partial autocorrelation coefficient. θ is the moving-average (MA) parameter: each observation is influenced by random shocks (historical errors), so the current value depends on past errors; if the white noise is correlated with the value of one year before, the order is one. The process is assumed stationary. The model estimated is the combination of AR(p) and MA(q) with the seasonal components ARs(P) (seasonal AR part of order P) and MAs(Q) (seasonal MA part of order Q).
Appendix 2. Model identification
For the model analysis, logged carbon intensity and energy intensity data from 1994 to 2013 are used. Using the methodology in section 3.1 and the equation in Appendix 1, the ARMA(0, 1) model is selected as the best-fit model for carbon and energy intensity in the iron & steel, chemical, and machinery industries.
Iron & steel industry
Carbon intensity
In the case of the iron & steel industry, the ARMA(0, 1) model is selected for carbon intensity by comparing the Akaike Information Criterion (AIC) results shown in Table A1 to Table A5. Although ARMA(0, 1) is not the best-fit model for the specification including the oil price variable, this analysis uses the same model for all carbon intensity data in the iron & steel industry.
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 110.54 | −215.08 | 13.389 | 0.5723 |
ARMA(0, 1) | 111.8 | −217.59 | 12.068 | 0.6739 |
ARMA(1, 1) | 112.08 | −216.17 | 10.136 | 0.8111 |
ARMA(1, 0): s.e. of AR1 is NaN; ARMA(1, 1): s.e. of AR1 is NaN |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 112.15 | −218.3 | 9.2269 | 0.8654 |
ARMA(0, 1) | 111.96 | −217.92 | 12.575 | 0.6351 |
ARMA(1, 1) | 113.04 | −218.08 | 7.2962 | 0.9489 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 110.53 | −215.06 | 11.942 | 0.6834 |
ARMA(0, 1) | 111.04 | −216.07 | 13.781 | 0.5422 |
ARMA(1, 1) | 111.59 | −215.19 | 11.641 | 0.7059 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 111.42 | −216.84 | 11.662 | 0.7044 |
ARMA(0, 1) | 111.69 | −217.38 | 13.42 | 0.5699 |
ARMA(1, 1) | 112.2 | −216.41 | 10.399 | 0.7939 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 111.2 | −216.39 | 9.2154 | 0.866 |
ARMA(0, 1) | 111.69 | −217.38 | 10.524 | 0.7855 |
ARMA(1, 1) | 111.96 | −215.91 | 9.9323 | 0.824 |
ARMA(1, 0): s.e. of AR1 is NaN; ARMA(1, 1): s.e. of AR1 is NaN |
Energy intensity
The ARMA(0, 1) model is selected for energy intensity in the iron & steel industry by comparing the AIC results shown in Table A6 to Table A10.
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 40.23 | −74.46 | 18.456 | 0.2395 |
ARMA(0, 1) | 41.79 | −77.58 | 12.752 | 0.6215 |
ARMA(1, 1) | 41.95 | −75.89 | 12.568 | 0.6356 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 39.81 | −73.63 | 16.077 | 0.377 |
ARMA(0, 1) | 40.83 | −75.66 | 13.518 | 0.5624 |
ARMA(1, 1) | 40.91 | −73.82 | 13.556 | 0.5594 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 39.68 | −73.36 | 16.64 | 0.3408 |
ARMA(0, 1) | 40.64 | −75.28 | 14.275 | 0.5048 |
ARMA(1, 1) | 40.77 | −73.54 | 14.004 | 0.5252 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 39.57 | −73.15 | 17.064 | 0.315 |
ARMA(0, 1) | 40.67 | −75.33 | 14.087 | 0.5189 |
ARMA(1, 1) | 40.77 | −73.55 | 14.044 | 0.5222 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 39.85 | −73.69 | 17.475 | 0.2913 |
ARMA(0, 1) | 41.04 | −76.07 | 14.524 | 0.4863 |
ARMA(1, 1) | 41.41 | −74.82 | 12.78 | 0.6193 |
Chemical industry

Carbon intensity
In the case of the chemical industry, the ARMA(0, 1) model is selected for carbon intensity by comparing the AIC results shown in Table A11 to Table A15. Although ARMA(0, 1) is not the best-fit model for the specification including the oil price variable, this analysis uses the same model for all carbon intensity data in the chemical industry.
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 102.26 | −198.52 | 11.361 | 0.7266 |
ARMA(0, 1) | 102.13 | −198.26 | 12.459 | 0.644 |
ARMA(1, 1) | 102.26 | −196.53 | 11.422 | 0.7221 |
ARMA(1, 0): s.e. of AR1 is NaN |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 100.27 | −194.53 | 9.5491 | 0.8471 |
ARMA(0, 1) | 99.94 | −193.88 | 9.2392 | 0.8647 |
ARMA(1, 1) | 100.29 | −192.58 | 9.7751 | 0.8336 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 99.92 | −193.84 | 13.701 | 0.5483 |
ARMA(0, 1) | 100.07 | −194.14 | 14.609 | 0.4799 |
ARMA(1, 1) | 100.3 | −192.61 | 8.286 | 0.9118 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 99.73 | −193.46 | 6.8141 | 0.9626 |
ARMA(0, 1) | 100.51 | −195.01 | 9.4475 | 0.853 |
ARMA(1, 1) | 100.52 | −193.05 | 9.122 | 0.8711 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 103.18 | −200.36 | 13.871 | 0.5354 |
ARMA(0, 1) | 103.66 | −201.31 | 10.524 | 0.7855 |
ARMA(1, 1) | 103.67 | −199.34 | 9.9323 | 0.824 |
Energy intensity
The ARMA(0, 1) model is selected for energy intensity in the chemical industry by comparing the AIC results shown in Table A16 to Table A20. Although ARMA(0, 1) is not the best-fit model for the specifications including the oil, coal, and gas price variables, this analysis uses the same model for all energy intensity data in the chemical industry.
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 60.85 | −115.69 | 16.477 | 0.3511 |
ARMA(0, 1) | 61.12 | −116.24 | 14.711 | 0.4724 |
ARMA(1, 1) | 61.12 | −114.25 | 15.09 | 0.445 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 65.15 | −124.29 | 17.383 | 0.2965 |
ARMA(0, 1) | 65.13 | −124.27 | 13.417 | 0.5701 |
ARMA(1, 1) | 65.3 | −122.59 | 14.828 | 0.4639 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 58.56 | −111.13 | 25.8 | 0.04018 |
ARMA(0, 1) | 56.97 | −107.94 | 15.684 | 0.4033 |
ARMA(1, 1) | 58.38 | −108.75 | 18.066 | 0.2592 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 62.32 | −118.64 | 15.964 | 0.3845 |
ARMA(0, 1) | 62.31 | −118.62 | 10.38 | 0.7952 |
ARMA(1, 1) | 62.32 | −116.64 | 11.347 | 0.7276 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 61.26 | −116.53 | 20.173 | 0.1654 |
ARMA(0, 1) | 61.4 | −116.8 | 15.582 | 0.4104 |
ARMA(1, 1) | 61.72 | −115.45 | 16.162 | 0.3714 |
Machinery industry

Carbon intensity
In the case of the machinery industry, the ARMA(0, 1) model is selected for carbon intensity by comparing the AIC results shown in Table A21 to Table A25. Although ARMA(0, 1) is not the best-fit model for the specification including the oil price variable, this analysis uses the same model for all carbon intensity data in the machinery industry.
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 83.11 | −160.22 | 11.073 | 0.7474 |
ARMA(0, 1) | 85.06 | −164.13 | 14.44 | 0.4925 |
ARMA(1, 1) | 85.65 | −163.29 | 10.155 | 0.8099 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 85.61 | −165.22 | 25.884 | 0.03926 |
ARMA(0, 1) | 86.51 | −167.03 | 24.108 | 0.06329 |
ARMA(1, 1) | 87.5 | −167 | 26.07 | 0.03729 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 83.26 | −160.51 | 32.08 | 0.006279 |
ARMA(0, 1) | 84.29 | −162.58 | 24.592 | 0.0557 |
ARMA(1, 1) | 85.32 | −162.65 | 33.455 | 0.004059 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 85.65 | −165.31 | 30.216 | 0.01117 |
ARMA(0, 1) | 86.32 | −166.64 | 24.988 | 0.0501 |
ARMA(1, 1) | 87.09 | −166.18 | 25.761 | 0.04061 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 83.48 | −160.95 | 28.167 | 0.02054 |
ARMA(0, 1) | 84.84 | −163.69 | 20.727 | 0.1458 |
ARMA(1, 1) | 85.74 | −163.49 | 24.003 | 0.06505 |
Energy intensity
The ARMA(0, 1) model is selected for energy intensity in the machinery sector by comparing the AIC results shown in Table A26 to Table A30.
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 31.55 | −57.1 | 16.787 | 0.3317 |
ARMA(0, 1) | 32.47 | −58.93 | 11.938 | 0.6837 |
ARMA(1, 1) | 32.48 | −56.95 | 12.061 | 0.6744 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 36.47 | −66.95 | 20.474 | 0.1545 |
ARMA(0, 1) | 37.98 | −69.95 | 19.185 | 0.2055 |
ARMA(1, 1) | 38.08 | −68.16 | 7.2962 | 0.9489 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 34.09 | −62.19 | 13.701 | 0.5483 |
ARMA(0, 1) | 36.59 | −67.19 | 14.609 | 0.4799 |
ARMA(1, 1) | 36.99 | −65.98 | 8.286 | 0.9118 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 35.77 | −65.55 | 21.538 | 0.1205 |
ARMA(0, 1) | 38.08 | −70.16 | 20.65 | 0.1484 |
ARMA(1, 1) | 38.11 | −68.22 | 16.339 | 0.3599 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 35.14 | −64.27 | 17.433 | 0.2936 |
ARMA(0, 1) | 36.57 | −67.15 | 19.324 | 0.1994 |
ARMA(1, 1) | 36.67 | −65.33 | 12.649 | 0.6294 |
[1] |
Adhikari DR, Stevens DP (2024) Effect of federal funds rate on cpi and ppi. J Appl Bus Econ 26. https://doi.org/10.33423/jabe.v26i1.6887 doi: 10.33423/jabe.v26i1.6887
![]() |
[2] |
Aghaabbasi M, Ali M, Jasiński M, et al. (2023) On hyperparameter optimization of machine learning methods using a bayesian optimization algorithm to predict work travel mode choice. IEEE Access 11: 19762–19774. https://doi.org/10.1109/access.2023.3247448 doi: 10.1109/access.2023.3247448
![]() |
[3] |
Ahmed N, Assadi M, Zhang Q, et al. (2023) Assessing impact of borehole field data's input parameters on hybrid deep learning models for heating and cooling forecasting: a local and global explainable ai analysis. IOP Conference Series: Materials Science and Engineering 1294: 012056. https://doi.org/10.1088/1757-899x/1294/1/012056 doi: 10.1088/1757-899x/1294/1/012056
![]() |
[4] | Akbulut H (2022) Forecasting inflation in Turkey: A comparison of time-series and machine learning models. Econ J Emerg Market 14. |
[5] |
Alhendawy HAA, Abdallah Mostafa MG, Elgohari MI, et al. (2023) Determinants of renewable energy production in egypt new approach: machine learning algorithms. Int J Energy Econ Policy 13: 679–689. https://doi.org/10.32479/ijeep.14985 doi: 10.32479/ijeep.14985
![]() |
[6] |
Ali M, Apriliana T, Fathonah AN (2023) The Effect of Money Supply and Bank Indonesia Rate on Consumer Price Index in Indonesia 2018–2022. J Ekonomi Bisnis Entrep 17: 488–497. https://doi.org/10.55208/jebe.v17i2.471 doi: 10.55208/jebe.v17i2.471
![]() |
[7] |
Alibabaei K, Gaspar PD, Lima TM (2021) Modeling soil water content and reference evapotranspiration from climate data using deep learning method. Appl Sci 11: 5029. https://doi.org/10.3390/app11115029 doi: 10.3390/app11115029
![]() |
[8] |
Alim M, Ye G, Guan P, et al. (2020) Comparison of arima model and xgboost model for prediction of human brucellosis in mainland China: a time-series study. BMJ Open 10: e039676. https://doi.org/10.1136/bmjopen-2020-039676 doi: 10.1136/bmjopen-2020-039676
![]() |
[9] |
Alizadeh M, Beheshti MTH, Ramezani A, et al. (2023) An optimized hybrid methodology for short‐term traffic forecasting in telecommunication networks. T Emerg Telecommun T 34: e4860. https://doi.org/10.1002/ett.4860 doi: 10.1002/ett.4860
![]() |
[10] |
Alizamir M, Shiri J, Fard AF, et al. (2023) Improving the accuracy of daily solar radiation prediction by climatic data using an efficient hybrid deep learning model: Long short-term memory (LSTM) network coupled with wavelet transform. Eng Appl Artif Intell 123: 106199. https://doi.org/10.1016/j.engappai.2023.106199 doi: 10.1016/j.engappai.2023.106199
![]() |
[11] |
Alshahrani SM, Alrayes FS, Alqahtani H, et al. (2023) Iot-cloud assisted botnet detection using rat swarm optimizer with deep learning. Cmc-Comput Mater Con 74: 3085–3100. https://doi.org/10.32604/cmc.2023.032972 doi: 10.32604/cmc.2023.032972
![]() |
[12] |
Amalu HI, Agbasi LO, Olife LU, et al. (2021) Responsiveness of service sector growth to financial development in nigeria: evidence from 1981–2019. J Adv Res Econ Adm Sci 2: 1–12. https://doi.org/10.47631/jareas.v2i3.305 doi: 10.47631/jareas.v2i3.305
![]() |
[13] |
Amin J, Sharif M, Raza M, et al. (2020) Brain tumor detection: a long short-term memory (LSTM)-based learning model. Neural Comput Appl 32: 15965–15973. https://doi.org/10.1007/s00521-019-04650-7 doi: 10.1007/s00521-019-04650-7
![]() |
[14] |
Ampomah EK, Nyame G, Qin Z, et al. (2021) Stock market prediction with gaussian naïve bayes machine learning algorithm. Informatica 45. https://doi.org/10.31449/inf.v45i2.3407 doi: 10.31449/inf.v45i2.3407
![]() |
[15] |
Anagnostis A, Moustakidis S, Papageorgiou EI, et al. (2022) A hybrid bimodal lstm architecture for cascading thermal energy storage modelling. Energies 15: 1959. https://doi.org/10.3390/en15061959 doi: 10.3390/en15061959
![]() |
[16] |
Araujo GS, Gaglianone WP (2023) Machine learning methods for inflation forecasting in Brazil: New contenders versus classical models. Lat Am J Cent Bank 4: 100087. https://doi.org/10.1016/j.latcb.2023.100087 doi: 10.1016/j.latcb.2023.100087
![]() |
[17] |
Arnone M, Romelli D (2013) Dynamic central bank independence indices and inflation rate: A new empirical exploration. J Financ Stabil 9: 385–398. https://doi.org/10.1016/j.jfs.2013.03.002 doi: 10.1016/j.jfs.2013.03.002
[18] Arthur CK, Temeng VA, Ziggah YY (2020) Multivariate Adaptive Regression Splines (MARS) approach to blast-induced ground vibration prediction. Int J Min Reclam Env 34: 198–222. https://doi.org/10.1080/17480930.2019.1577940
[19] Attoh-Okine NO, Cooger K, Mensah S (2009) Multivariate adaptive regression (MARS) and hinged hyperplanes (HHP) for doweled pavement performance modeling. Constr Build Mater 23: 3020–3023. https://doi.org/10.1016/j.conbuildmat.2009.04.010
[20] Balocchi R, Menicucci D, Santarcangelo L, et al. (2004) Deriving the respiratory sinus arrhythmia from the heartbeat time series using empirical mode decomposition. Chaos Soliton Fract 20: 171–177. https://doi.org/10.1016/s0960-0779(03)00441-7
[21] Balshi MS, McGuire AD, Duffy P, et al. (2009) Assessing the response of area burned to changing climate in western boreal North America using a Multivariate Adaptive Regression Splines (MARS) approach. Global Change Biol 15: 578–600. https://doi.org/10.1111/j.1365-2486.2008.01679.x
[22] Bandara K, Hyndman R, Bergmeir C (2021) MSTL: a seasonal-trend decomposition algorithm for time series with multiple seasonal patterns. https://doi.org/10.48550/arxiv.2107.13462
[23] Bandara WMS, De Mel WAR (2024) Evaluating the Efficacy of Supervised Machine Learning Models in Inflation Forecasting in Sri Lanka. Am J Appl Stat Econ 3: 51–60. https://doi.org/10.54536/ajase.v3i1.2385
[24] Barkan O, Benchimol J, Caspi I, et al. (2023) Forecasting CPI inflation components with hierarchical recurrent neural networks. Int J Forecast 39: 1145–1162. https://doi.org/10.1016/j.ijforecast.2022.04.009
[25] Baybuza I (2018) Inflation forecasting using machine learning methods. Russ J Money Financ 77: 42–59. https://doi.org/10.31477/rjmf.201804.42
[26] Bhanja S, Das A (2021) Deep neural network for multivariate time-series forecasting. In: Proceedings of International Conference on Frontiers in Computing and Systems: COMSYS 2020, 267–277. Springer Singapore. https://doi.org/10.1007/978-981-15-7834-2_25
[27] Bhati BS, Chugh G, Al-Turjman F, et al. (2020) An improved ensemble based intrusion detection technique using XGBoost. Trans Emerg Telecommun Technol 32: e4076. https://doi.org/10.1002/ett.4076
[28] Bouktif S, Fiaz A, Ouni A, et al. (2020) Multi-sequence LSTM-RNN deep learning and metaheuristics for electric load forecasting. Energies 13: 391. https://doi.org/10.3390/en13020391
[29] Brzan PP, Obradovic Z, Stiglic G (2017) Contribution of temporal data to predictive performance in 30-day readmission of morbidly obese patients. PeerJ 5: e3230. https://doi.org/10.7287/peerj.3230v0.1/reviews/2
[30] Budiharto W (2021) Data science approach to stock prices forecasting in Indonesia during Covid-19 using Long Short-Term Memory (LSTM). J Big Data 8: 1–9. https://doi.org/10.1186/s40537-021-00430-0
[31] Cahyono ND, Sumpeno S, Setiadi E (2023) Multivariate Time Series for Customs Revenue Forecasting Using LSTM Neural Networks. In: 2023 International Conference on Information Technology and Computing (ICITCOM), 357–362. https://doi.org/10.1109/ICITCOM60176.2023.10442562
[32] Cain MK, Zhang Z, Yuan KH (2017) Univariate and multivariate skewness and kurtosis for measuring nonnormality: Prevalence, influence and estimation. Behav Res Methods 49: 1716–1735. https://doi.org/10.3758/s13428-016-0814-1
[33] Cao L, Li Y, Zhang J, et al. (2020) Electrical load prediction of healthcare buildings through single and ensemble learning. Energy Rep 6: 2751–2767. https://doi.org/10.1016/j.egyr.2020.10.005
[34] Chen S (2023) Multiple stock prediction based on linear and non-linear machine learning regression methods. Advances in Economics, Management and Political Sciences 46: 225–232. https://doi.org/10.54254/2754-1169/46/20230343
[35] Chen T, Guestrin C (2016) XGBoost: a scalable tree boosting system. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. https://doi.org/10.48550/arxiv.1603.02754
[36] Choi JY, Lee B (2018) Combining LSTM network ensemble via adaptive weighting for improved time series forecasting. Math Probl Eng 2018: 1–8. https://doi.org/10.1155/2018/2470171
[37] Choudhary K, Jha GK, Kumar RR, et al. (2019) Agricultural commodity price analysis using ensemble empirical mode decomposition: a case study of daily potato price series. Indian J Agr Sci 89: 882–886. https://doi.org/10.56093/ijas.v89i5.89682
[38] Correa E (2023) Effect of unemployment, inflation and foreign direct investment on economic growth in Sub-Saharan Africa. J Dev Econ 8: 297–315. https://doi.org/10.20473/jde.v8i2.47283
[39] Coulibaly P, Baldwin CK (2005) Nonstationary hydrological time series forecasting using nonlinear dynamic methods. J Hydrol 307: 164–174.
[40] Cui Q, Rong S, Zhang B (2023) Advancing the comprehension of consumer price index and influencing factors: insight into the mechanism based on prediction machine learning models. Adv Econ Manage Res 7: 125. https://doi.org/10.56028/aemr.7.1.125.2023
[41] DeJong DN, Nankervis JC, Savin NE, et al. (1992) The power problems of unit root tests in time series with autoregressive errors. J Econometrics 53: 323–343. https://doi.org/10.1016/0304-4076(92)90090-E
[42] Delage O, Portafaix T, Benchérif H, et al. (2022) Empirical adaptive wavelet decomposition (EAWD): an adaptive decomposition for the variability analysis of observation time series in atmospheric science. Nonlinear Proc Geoph 29: 265–277. https://doi.org/10.5194/npg-29-265-2022
[43] Dhamo D, Dhamo X, Spahiu A, et al. (2022) PV production forecasting using machine learning and deep learning techniques: Albanian case study. Adv Eng Days 5: 68–70.
[44] Dickey DA, Fuller WA (1981) Likelihood ratio statistics for autoregressive time series with a unit root. Econometrica 49: 1057–1072. https://doi.org/10.2307/1912517
[45] Dinh TN, Thirunavukkarasu GS, Seyedmahmoudian M, et al. (2023) Predicting Commercial Building Energy Consumption Using a Multivariate Multilayered Long-Short Term Memory Time-Series Model. Appl Sci 13: 7775. https://doi.org/10.3390/app13137775
[46] Djordjević K, Jordović-Pavlović MI, Ćojbašić Ž, et al. (2022) Influence of data scaling and normalization on overall neural network performances in photoacoustics. Opt Quant Electron 54. https://doi.org/10.1007/s11082-022-03799-1
[47] Elgui K, Bianchi P, Portier F, et al. (2020) Learning methods for RSSI-based geolocation: a comparative study. Pervasive Mob Comput 67: 101199. https://doi.org/10.1016/j.pmcj.2020.101199
[48] Enke D, Mehdiyev N (2014) A hybrid neuro-fuzzy model to forecast inflation. Proc Comput Sci 36: 254–260. https://doi.org/10.1016/j.procs.2014.09.088
[49] Fan C, Zhang D, Zhang C (2010) On sample size of the Kruskal–Wallis test with application to a mouse peritoneal cavity study. Biometrics 67: 213–224. https://doi.org/10.1111/j.1541-0420.2010.01407.x
[50] Farsi B, Amayri M, Bouguila N, et al. (2021) On short-term load forecasting using machine learning techniques and a novel parallel deep LSTM-CNN approach. IEEE Access 9: 31191–31212. https://doi.org/10.1109/access.2021.3060290
[51] Feng H (2024) Analysis and Forecast of CPI in China Based on LSTM and VAR Model. In: Advances in Digital Economy and Data Analysis Technology: Proceedings of the 2nd International Conference on Internet Finance and Digital Economy, Kuala Lumpur, Malaysia, 339–357. https://doi.org/10.1142/9789811267505_0025
[52] Feurer M, Springenberg JT, Hutter F (2015) Initializing Bayesian hyperparameter optimization via meta-learning. In: Proceedings of the AAAI Conference on Artificial Intelligence 29. https://doi.org/10.1609/aaai.v29i1.9354
[53] Friedman JH (1991) Multivariate adaptive regression splines. Ann Stat 19: 1–67.
[54] Friedman J, Hastie T, Tibshirani R (2010) Regularization paths for generalized linear models via coordinate descent. J Stat Softw 33: 1–22.
[55] Gao Z, Kuruoglu EE (2023) Attention based hybrid parametric and neural network models for non-stationary time series prediction. Expert Syst 41. https://doi.org/10.1111/exsy.13419
[56] Gastinger J, Nicolas S, Stepić D, et al. (2021) A study on ensemble learning for time series forecasting and the need for meta-learning. In: 2021 International Joint Conference on Neural Networks (IJCNN), 1–8. https://doi.org/10.48550/arxiv.2104.11475
[57] Gil-Cordero E, Rondán-Cataluña FJ, Sigüenza-Morales D (2020) Private label and macroeconomic indicators: Europe and USA. Adm Sci 10: 91. https://doi.org/10.3390/admsci10040091
[58] Greenland S, Senn S, Rothman KJ, et al. (2016) Statistical tests, p values, confidence intervals, and power: a guide to misinterpretations. Eur J Epidemiol 31: 337–350. https://doi.org/10.1007/s10654-016-0149-3
[59] Groeneveld RA, Meeden G (1984) Measuring skewness and kurtosis. J Roy Stat Soc Series D 33: 391–399. https://doi.org/10.2307/2987742
[60] Guo Y, Strauss VY, Prieto-Alhambra D, et al. (2022) Use of machine learning for comparing disease risk scores and propensity scores under complex confounding and large sample size scenarios: a simulation study. medRxiv 1–12. https://doi.org/10.1101/2022.02.03.22270151
[61] Hajdini I, Knotek II ES, Leer J, et al. (2024) Indirect consumer inflation expectations: Theory and evidence. J Monetary Econ 103568.
[62] Hao J, Feng Q, Li J, et al. (2023) A bi-level ensemble learning approach to complex time series forecasting: taking exchange rates as an example. J Forecasting 42: 1385–1406. https://doi.org/10.1002/for.2971
[63] Harding M, Lamarche C (2021) Small steps with big data: using machine learning in energy and environmental economics. Annu Rev Resour Econ 13: 469–488. https://doi.org/10.1146/annurev-resource-100920-034117
[64] Hasanah SH (2021) Multivariate Adaptive Regression Splines (MARS) for Modeling The Student Status at Universitas Terbuka. J Mat MANTIK 7: 51–58.
[65] Hauke J, Kossowski T (2011) Comparison of values of Pearson's and Spearman's correlation coefficients on the same sets of data. Quaest Geogr 30: 87–93.
[66] He Y, Zeng X, Li H, et al. (2022) Application of LSTM model optimized by individual-ordering-based adaptive genetic algorithm in stock forecasting. Int J Intell Comput 16: 277–294. https://doi.org/10.1108/ijicc-04-2022-0104
[67] Henderi H (2021) Comparison of min-max normalization and Z-score normalization in the k-nearest neighbor (KNN) algorithm to test the accuracy of types of breast cancer. IJIIS: Int J Inf Inf Syst 4: 13–20. https://doi.org/10.47738/ijiis.v4i1.73
[68] Hossain MS, Mitra R (2017) The determinants of price inflation in the United States: a multivariate dynamic cointegration and causal analysis. J Dev Areas 51: 153–175. https://www.jstor.org/stable/26415701
[69] Ibrahim A, Mirjalili S, El-Said M, et al. (2021) Wind speed ensemble forecasting based on deep learning using adaptive dynamic optimization algorithm. IEEE Access 9: 125787–125804. https://doi.org/10.1109/access.2021.3111408
[70] Imbens GW, Athey S (2021) Breiman's two cultures: a perspective from econometrics. Obs Stud 7: 127–133. https://doi.org/10.1353/obs.2021.0028
[71] Imron M, Utami WD, Khaulasari H, et al. (2022) ARIMA model of outlier detection for forecasting consumer price index (CPI). BAREKENG: J Ilmu Matematika Dan Terapan 16: 1259–1270. https://doi.org/10.30598/barekengvol16iss4pp1259-1270
[72] Islam H, Islam MS, Saha S, et al. (2024) Impact of macroeconomic factors on performance of banks in Bangladesh. J Ekon. https://doi.org/10.58251/ekonomi.1467784
[73] Iqbal Z, Akbar M, Amjad W (2021) Nexus of gold price-exchange rate-interest rate-oil price: lessons for monetary policy in Pakistan. Int J Bus Manag 16: 1–16. https://doi.org/10.52015/nijbm.v16i1.50
[74] Ivașcu C (2023) Can Machine Learning Models Predict Inflation? In: Proceedings of the International Conference on Business Excellence, 17: 1748–1756.
[75] Jaber AM, Ismail MT, Altaher AM (2014) Empirical mode decomposition combined with local linear quantile regression for automatic boundary correction. Abstr Appl Anal 2014: 1–8. https://doi.org/10.1155/2014/731827
[76] Jadiya AK, Chaudhary A, Thakur R (2020) Polymorphic SBD preprocessor: a preprocessing approach for social big data. Indian J Comput Syst Sci Eng 11: 953–961. https://doi.org/10.21817/indjcse/2020/v11i6/201106169
[77] Jakubik J, Nazemi A, Geyer-Schulz A, et al. (2023) Incorporating financial news for forecasting Bitcoin prices based on long short-term memory networks. Quant Financ 23: 335–349. https://doi.org/10.1080/14697688.2022.2130085
[78] Khan A, Kandel J, Tayara H, et al. (2024) Predicting the bandgap and efficiency of perovskite solar cells using machine learning methods. Mol Inform 43. https://doi.org/10.1002/minf.202300217
[79] Knotek ES, Mitchell J, Pedemonte MO, et al. (2024) The effects of interest rate increases on consumers' inflation expectations: the roles of informedness and compliance. Working Paper 24-01, Federal Reserve Bank of Cleveland. https://doi.org/10.26509/frbc-wp-202401
[80] Jin Q, Fan X, Liu J, et al. (2019) Using extreme gradient boosting to predict changes in tropical cyclone intensity over the western North Pacific. Atmosphere 10: 341. https://doi.org/10.3390/atmos10060341
[81] Johansen S (2009) Cointegration: Overview and development. Handbook Financ Time Ser 671–693. https://doi.org/10.1007/978-3-540-71297-8_29
[82] Jung HS, Lee SH, Lee H, et al. (2023) Predicting Bitcoin trends through machine learning using sentiment analysis with technical indicators. Comput Syst Sci Eng 46: 2231–2246. https://doi.org/10.32604/csse.2023.034466
[83] Jurado S, Nebot À, Mugica F, et al. (2015) Hybrid methodologies for electricity load forecasting: entropy-based feature selection with machine learning and soft computing techniques. Energy 86: 276–291. https://doi.org/10.1016/j.energy.2015.04.039
[84] Khandelwal I, Adhikari R, Verma G (2015) Time series forecasting using hybrid ARIMA and ANN models based on DWT decomposition. Procedia Comput Sci 48: 173–179. https://doi.org/10.1016/j.procs.2015.04.167
[85] Khodabakhsh A, Ari I, Bakır M, et al. (2020) Forecasting multivariate time-series data using LSTM and mini-batches. In: Data Science: From Research to Application, 121–129. Springer. https://doi.org/10.1007/978-3-030-37309-2_10
[86] Kilian L, Zhou X (2021) The impact of rising oil prices on U.S. inflation and inflation expectations in 2020–23. Energy Econ 113: 106228. https://doi.org/10.2139/ssrn.3980337
[87] Kitani R, Iwata S (2023) Verification of interpretability of phase-resolved partial discharge using a CNN with SHAP. IEEE Access 11: 4752–4762. https://doi.org/10.1109/access.2023.3236315
[88] Kumar SD, Subha DP (2019) Prediction of depression from EEG signal using long short term memory (LSTM). In: 2019 3rd International Conference on Trends in Electronics and Informatics (ICOEI), 1248–1253. https://doi.org/10.1109/ICOEI.2019.8862560
[89] Kunstmann L, Pina D, Silva F, et al. (2021) Online deep learning hyperparameter tuning based on provenance analysis. J Inf Data Manage 12. https://doi.org/10.5753/jidm.2021.1924
[90] Lee K, Ayyasamy MV, Ji Y, et al. (2022) A comparison of explainable artificial intelligence methods in the phase classification of multi-principal element alloys. Sci Rep 12: 11591. https://doi.org/10.1038/s41598-022-15618-4
[91] Lees T, Buechel M, Anderson B, et al. (2021) Benchmarking data-driven rainfall–runoff models in Great Britain: a comparison of long short-term memory (LSTM)-based models with four lumped conceptual models. Hydrol Earth Syst Sci 25: 5517–5534. https://doi.org/10.5194/hess-25-5517-2021
[92] Lewis PA, Ray BK (1997) Modeling long-range dependence, nonlinearity, and periodic phenomena in sea surface temperatures using TSMARS. J Am Stat Assoc 92: 881–893. https://doi.org/10.1080/01621459.1997.10474043
[93] Lewis PA, Stevens JG (1991) Nonlinear modeling of time series using multivariate adaptive regression splines (MARS). J Am Stat Assoc 86: 864–877. https://doi.org/10.1080/01621459.1991.10475126
[94] Li G, Yang N (2022) A hybrid SARIMA-LSTM model for air temperature forecasting. Adv Theor Simul 6. https://doi.org/10.1002/adts.202200502
[95] Li P, Zhang JS (2018) A new hybrid method for China's energy supply security forecasting based on ARIMA and XGBoost. Energies 11: 1687. https://doi.org/10.3390/en11071687
[96] Li S, Huang H, Lu W (2021) A neural networks based method for multivariate time-series forecasting. IEEE Access 9: 63915–63924. https://doi.org/10.1109/access.2021.3075063
[97] Li T, Hua M, Wu X (2020) A hybrid CNN-LSTM model for forecasting particulate matter (PM2.5). IEEE Access 8: 26933–26940. https://doi.org/10.1109/access.2020.2971348
[98] Li X, Huo H, Liu Z (2022) Analysis and prediction of PM2.5 concentration based on LSTM-XGBoost-SVR model. https://doi.org/10.21203/rs.3.rs-2158285/v1
[99] Liu Z (2023) Review on the influence of machine learning methods and data science on the economics. Appl Comput Eng 22: 137–141. https://doi.org/10.54254/2755-2721/22/20231208
[100] Liu Y, Yang Y, Chin RJ, et al. (2023) Long Short-Term Memory (LSTM) Based Model for Flood Forecasting in Xiangjiang River. KSCE J Civ Eng 27: 5030–5040. https://doi.org/10.1007/s12205-023-2469-7
[101] Lv C, An S, Qiao B, et al. (2021) Time series analysis of hemorrhagic fever with renal syndrome in mainland China by using an XGBoost forecasting model. BMC Infect Dis 21. https://doi.org/10.1186/s12879-021-06503-y
[102] Medeiros MC, Vasconcelos GF, Veiga Á, et al. (2019) Forecasting inflation in a data-rich environment: the benefits of machine learning methods. J Bus Econ Stat 39: 98–119. https://doi.org/10.1080/07350015.2019.1637745
[103] Mitchell DJB (1999) Review of Getting Prices Right: The Debate over the Consumer Price Index, by D. Baker. Ind Labor Relat Rev 52: 317–318. https://doi.org/10.2307/2525170
[104] Mohammed AA, Immanuel PJ, Roobini MS (2023) Forecasting Consumer Price Index (CPI) Using Deep Learning and Hybrid Ensemble Technique. In: 2023 International Conference on Advances in Computing, Communication and Applied Informatics (ACCAI), Chennai, India, 1–8. https://doi.org/10.1109/ACCAI58221.2023.10200153
[105] Mohan S, Hutson A, MacDonald I, et al. (2019) Impact of macroeconomic indicators on housing prices. Int J Hous Mark Anal 12: 1055–1071. https://doi.org/10.1108/IJHMA-09-2018-0070
[106] Mulenga M, Kareem SA, Sabri AQM, et al. (2021) Stacking and chaining of normalization methods in deep learning-based classification of colorectal cancer using gut microbiome data. IEEE Access 9: 97296–97319. https://doi.org/10.1109/access.2021.3094529
[107] Murat N (2023) Outlier detection in statistical modeling via multivariate adaptive regression splines. Commun Stat-Simul C 52: 3379–3390. https://doi.org/10.1080/03610918.2021.2007400
[108] Muruganandam NS, Arumugam U (2023) Dynamic ensemble multivariate time series forecasting model for PM2.5. Comput Syst Sci Eng 44: 979–989. https://doi.org/10.32604/csse.2023.024943
[109] Naidu S, Pandaram A, Chand A (2017) A Johansen cointegration test for the relationship between remittances and economic growth of Japan. Mod Appl Sci 11: 137–151. https://doi.org/10.5539/mas.v11n10p137
[110] Naser AH, Badr AH, Henedy SN, et al. (2022) Application of Multivariate Adaptive Regression Splines (MARS) approach in prediction of compressive strength of eco-friendly concrete. Case Stud Constr Mat 17: e01262. https://doi.org/10.1016/j.cscm.2022.e01262
[111] Nguyen LT, Chung HH, Tuliao KV, et al. (2020) Using XGBoost and skip-gram model to predict online review popularity. SAGE Open 10. https://doi.org/10.1177/2158244020983316
[112] Nguyen TT, Nguyen HG, Lee JY, et al. (2023) The consumer price index prediction using machine learning approaches: Evidence from the United States. Heliyon 9.
[113] Njenga JK (2024) Analysis and Forecasting of Consumer Price Index (CPI) in Kenya and South Africa using Holt Winter Model. Asian J Econ Bus Account 24: 322–331. https://doi.org/10.9734/ajeba/2024/v24i41283
[114] Noorunnahar M, Chowdhury AH, Mila FA (2023) A tree based extreme gradient boosting (XGBoost) machine learning model to forecast the annual rice production in Bangladesh. PLoS One 18: e0283452. https://doi.org/10.1371/journal.pone.0283452
[115] Pan J, Zhang Z, Peters S, et al. (2023) Cerebrovascular disease case identification in inpatient electronic medical record data using natural language processing. https://doi.org/10.21203/rs.3.rs-2640617/v1
[116] Paparoditis E, Politis DN (2018) The asymptotic size and power of the augmented Dickey–Fuller test for a unit root. Econom Rev 37: 955–973.
[117] Papíková L, Papík M (2022) Effects of classification, feature selection, and resampling methods on bankruptcy prediction of small and medium-sized enterprises. Intell Syst Account Financ Manage 29: 254–281. https://doi.org/10.1002/isaf.1521
[118] Park HJ, Kim Y, Kim HY (2022) Stock market forecasting using a multi-task approach integrating long short-term memory and the random forest framework. Appl Soft Comput 114: 108106. https://doi.org/10.1016/j.asoc.2021.108106
[119] Phillips PC, Perron P (1988) Testing for a unit root in time series regression. Biometrika 75: 335–346. https://doi.org/10.1093/biomet/75.2.335
[120] Poh CW, Tan R (1997) Performance of Johansen's cointegration test. In: East Asian Economic Issues: Volume III, 402–414.
[121] Porcher R, Thomas G (2003) Order determination in nonlinear time series by penalized least-squares. Commun Stat-Simul C 32: 1115–1129. https://doi.org/10.1081/SAC-120023881
[122] Qinghe Z, Wen X, Huang B, et al. (2022) Optimised extreme gradient boosting model for short term electric load demand forecasting of regional grid system. Sci Rep 12: 19282. https://doi.org/10.1038/s41598-022-22024-3
[123] Radev L, Golitsis P, Mitreva M (2023) Economic and financial determinants of gold ETF price volatility on the U.S. futures market (COMEX). J Econ 8: 12–26. https://doi.org/10.46763/joe2382012r
[124] Raheem Ahmed R, Vveinhardt J, Štreimikienė D, et al. (2017) Estimation of long-run relationship of inflation (CPI & WPI), and oil prices with KSE-100 index: Evidence from Johansen multivariate cointegration approach. Technol Econ Dev Econ 23: 567–588. https://doi.org/10.3846/20294913.2017.1289422
[125] Reddy S, Akashdeep S, Harshvardhan R, et al. (2022) Stacking deep learning and machine learning models for short-term energy consumption forecasting. Adv Eng Inform 52: 101542. https://doi.org/10.1016/j.aei.2022.101542
[126] Reed SB (2014) One hundred years of price change: The Consumer Price Index and the American inflation experience. Monthly Lab Rev 137: 1.
[127] Rezaie-Balf M, Zahmatkesh Z, Kim S (2017) Soft computing techniques for rainfall-runoff simulation: local non-parametric paradigm vs. model classification methods. Water Resour Manag 31: 3843–3865. https://doi.org/10.1007/s11269-017-1711-9
[128] Ribeiro MHDM, Silva RG, Mariani VC, et al. (2021) Dengue cases forecasting based on extreme gradient boosting ensemble with coyote optimization. In: Anais do 15 Congresso Brasileiro de Inteligência Computacional. https://doi.org/10.21528/cbic2021-36
[129] Ribeiro MHDM, Stefenon SF, Lima JD, et al. (2020) Electricity price forecasting based on self-adaptive decomposition and heterogeneous ensemble learning. Energies 13: 5190. https://doi.org/10.3390/en13195190
[130] Rippy D (2014) The first hundred years of the Consumer Price Index: a methodological and political history. Monthly Lab Rev 137: 1.
[131] Rosado R, Abreu AJ, Arencibia JC, et al. (2021) Consumer price index forecasting based on univariate time series and a deep neural network. In: International Workshop on Artificial Intelligence and Pattern Recognition, 33–42. Cham: Springer. https://doi.org/10.1007/978-3-030-89691-1_4
[132] Rodríguez-Pérez R, Bajorath J (2020) Interpretation of machine learning models using Shapley values: application to compound potency and multi-target activity predictions. J Comput Aid Mol Des 34: 1013–1026. https://doi.org/10.1007/s10822-020-00314-0
[133] Sagheer A, Kotb M (2019) Unsupervised pre-training of a deep LSTM-based stacked autoencoder for multivariate time series forecasting problems. Sci Rep 9: 19038. https://doi.org/10.1038/s41598-019-55320-6
[134] Saputra AW, Wibawa AP, Pujianto U, et al. (2022) LSTM-based Multivariate Time-Series Analysis: A Case of Journal Visitors Forecasting. ILKOM J Ilm 14: 57–62.
[135] Sarangi PK, Sahoo AK, Sinha S (2022) Modeling consumer price index: a machine learning approach. Macromol Symp 401. https://doi.org/10.1002/masy.202100349
[136] Setyanto A, Laksito A, Alarfaj F, et al. (2022) Arabic language opinion mining based on long short-term memory (LSTM). Appl Sci 12: 4140. https://doi.org/10.3390/app12094140
[137] Shahbaz M, Khraief N, Mahalik MK (2020) Investigating the environmental Kuznets curve for Sweden: Evidence from multivariate adaptive regression splines (MARS). Empir Econ 59: 1883–1902. https://doi.org/10.1007/s00181-019-01698-1
[138] Sharda VN, Prasher SO, Patel RM, et al. (2008) Performance of Multivariate Adaptive Regression Splines (MARS) in predicting runoff in mid-Himalayan micro-watersheds with limited data. Hydrolog Sci J 53: 1165–1175. https://doi.org/10.1623/hysj.53.6.1165
[139] Sharma SS (2016) Can consumer price index predict gold price returns? Econ Model 55: 269–278. https://doi.org/10.1016/j.econmod.2016.02.014
[140] Shi F, Lu S, Gu J, et al. (2022) Modeling and evaluation of the permeate flux in forward osmosis process with machine learning. Ind Eng Chem Res 61: 18045–18056. https://doi.org/10.1021/acs.iecr.2c03064
[141] Shiferaw Y (2023) An understanding of how GDP, unemployment and inflation interact and change across time and frequency. Economies 11: 131. https://doi.org/10.3390/economies11050131
[142] Siami-Namini S, Tavakoli N, Namin AS (2019) The performance of LSTM and BiLSTM in forecasting time series. In: 2019 IEEE International Conference on Big Data (Big Data), 3285–3292. IEEE.
[143] Sibai N, El-Moursy F, Sibai A (2024) Forecasting the consumer price index: a comparative study of machine learning methods. Int J Comput Digit Syst 15: 487–497. https://doi.org/10.12785/ijcds/150137
[144] Simsek AI (2024) Improving the Performance of Stock Price Prediction: A Comparative Study of Random Forest, XGBoost, and Stacked Generalization Approaches. In: Revolutionizing the Global Stock Market: Harnessing Blockchain for Enhanced Adaptability, 83–99. IGI Global.
[145] | Subhani MI (2009) Relationship between Consumer Price Index (CPI) and government bonds. S Asian J Manage Sci 3: 11–17. |
[146] |
Sukarsa IM, Pinata NNP, Rusjayanthi NKD, et al. (2021) Estimation of gourami supplies using gradient boosting decision tree method of xgboost. TEM J 144–151. https://doi.org/10.18421/tem101-17 doi: 10.18421/tem101-17
![]() |
[147] | Sumita S, Nakagawa H, Tsuchiya T (2023) Xtune: an xai-based hyperparameter tuning method for time-series forecasting using deep learning. https://doi.org/10.21203/rs.3.rs-3008932/v1Shimon |
[148] | Sun Y, Tian L (2022) Research on stock prediction based on simulated annealing algorithm and ensemble neural learning. Third International Conference on Computer Science and Communication Technology (ICCSCT 2022). https://doi.org/10.1117/12.2663138 |
[149] |
Tan KR, Seng JJB, Kwan YH, et al. (2021) Evaluation of machine learning methods developed for prediction of diabetes complications: a systematic review. J Diabetes Sci Techn 17: 474–489. https://doi.org/10.1177/19322968211056917 doi: 10.1177/19322968211056917
![]() |
[150] | Temür AS, Yildiz Ş (2021) Comparison of forecasting performance of ARIMA, LSTM and hybrid models for the sales volume budget of a manufacturing enterprise. Istanb Bus Res 50: 15–46. https://doi.org/10.26650/ibr.2021.51.0117 |
[151] | Thapa KB (2023) Macroeconomic determinants of the stock market in Nepal: an empirical analysis. NCC J 8: 65–73. https://doi.org/10.3126/nccj.v8i1.63087 |
[152] | Tian L, Feng L, Sun Y, et al. (2021) Forecast of LSTM-XGBoost in stock price based on Bayesian optimization. Intell Autom Soft Comput 29: 855–868. https://doi.org/10.32604/iasc.2021.016805 |
[153] | Toraman C, Basarir Ç (2014) The long run relationship between stock market capitalization rate and interest rate: Co-integration approach. Procedia-Soc Behav Sci 143: 1070–1073. https://doi.org/10.1016/j.sbspro.2014.07.557 |
[154] | Upadhyaya Y, Kharel K (2022) Inflation with GDP, unemployment and remittances: an outline of the joint effect on Nepalese economy. Interd J Manage Soc Sci 3: 154–163. https://doi.org/10.3126/ijmss.v3i1.50244 |
[155] | Utama ABP, Wibawa AP, Muladi M, et al. (2022) PSO-based hyperparameter tuning of CNN multivariate time-series analysis. J Online Inform 7: 193–202. https://doi.org/10.15575/join.v7i2.858 |
[156] | Varian HR (2014) Big data: New tricks for econometrics. J Econ Perspect 28: 3–28. https://doi.org/10.1257/jep.28.2.3 |
[157] | Vasco-Carofilis RA, Gutiérrez-Naranjo MA, Cárdenas-Montes M (2020) PBIL for optimizing hyperparameters of convolutional neural networks and STL decomposition. Lect Notes Comput Sci 147–159. https://doi.org/10.1007/978-3-030-61705-9_13 |
[158] | Vlachas PR, Byeon W, Wan Z, et al. (2018) Data-driven forecasting of high-dimensional chaotic systems with long short-term memory networks. P Roy Soc A-Math Phy 474: 20170844. https://doi.org/10.1098/rspa.2017.0844 |
[159] | Wan R, Mei S, Wang J, et al. (2019) Multivariate temporal convolutional network: A deep neural networks approach for multivariate time series forecasting. Electronics 8: 876. https://doi.org/10.3390/electronics8080876 |
[160] | Wang L, Zhao L (2022) Digital economy meets artificial intelligence: forecasting economic conditions based on big data analytics. Mob Inf Syst 2022: 1–9. https://doi.org/10.1155/2022/7014874 |
[161] | Wang L, Haofei Z, Su J, et al. (2013) An ARIMA-ANN hybrid model for time series forecasting. Syst Res Behav Sci 30: 244–259. https://doi.org/10.1002/sres.2179 |
[162] | Wang W, Shi Y, Lyu G, et al. (2017) Electricity consumption prediction using XGBoost based on discrete wavelet transform. DEStech T Comput Sci Eng. https://doi.org/10.12783/dtcse/aiea2017/15003 |
[163] | Wang Y, Ye G (2020) Forecasting method of stock market volatility in time series data based on mixed model of ARIMA and XGBoost. China Commun 17: 205–221. https://doi.org/10.23919/jcc.2020.03.017 |
[164] | Wang Y, Bao F, Hua Q, et al. (2021) Short-term solar power forecasting: a combined long short-term memory and Gaussian process regression method. Sustainability 13: 3665. https://doi.org/10.3390/su13073665 |
[165] | Wei B, Yue J, Rao Y (2017) A deep learning framework for financial time series using stacked autoencoders and long-short term memory. PLoS One 12: e0180944. https://doi.org/10.1371/journal.pone.0180944 |
[166] | Weinzierl M (2014) Seesaws and social security benefits indexing (No. w20671). National Bureau of Economic Research, Cambridge. |
[167] | Widiputra H, Mailangkay ABL, Gautama E (2021) Multivariate CNN-LSTM model for multiple parallel financial time-series prediction. Complexity 2021: 1–14. https://doi.org/10.1155/2021/9903518 |
[168] | Qureshi M, Khan A, Daniyal M, et al. (2023) A comparative analysis of traditional SARIMA and machine learning models for CPI data modelling in Pakistan. Appl Comput Intell S 2023: 1–10. https://doi.org/10.1155/2023/3236617 |
[169] | Xiao C, Wang Y, Wang S (2023) Machine learning to set hyperparameters for overlapping community detection algorithms. J Eng 2023. https://doi.org/10.1049/tje2.12292 |
[170] | Xu J, He J, Gu J, et al. (2022) Financial Time Series Prediction Based on XGBoost and Generative Adversarial Networks. Int J Circ Syst Signal Process 16: 637–645. https://doi.org/10.46300/9106.2022.16.79 |
[171] | Yang C, Guo S (2021) Inflation prediction method based on deep learning. Comput Intel Neurosc 2021. https://doi.org/10.1155/2021/1071145 |
[172] | Yang L, Shami A (2020) On hyperparameter optimization of machine learning algorithms: theory and practice. Neurocomputing 415: 295–316. https://doi.org/10.1016/j.neucom.2020.07.061 |
[173] | Ye M, Mohammed KS, Tiwari S, et al. (2023) The effect of the global supply chain and oil prices on the inflation rates in advanced economies and emerging markets. Geo J 58: 2805–2817. https://doi.org/10.1002/gj.4742 |
[174] | Yilmazkuday H (2024) Pass-through of shocks into different U.S. prices. Rev Int Econ 32: 1300–1315. https://doi.org/10.1111/roie.12726 |
[175] | Yildiz M, Ozdemir L (2022) Determination of the sensitivity of stock index to macroeconomic and psychological factors by MARS method. In: Insurance and Risk Management for Disruptions in Social, Economic and Environmental Systems: Decision and Control Allocations within New Domains of Risk, Emerald Publishing Limited, 81–105. https://doi.org/10.1108/978-1-80117-139-720211005 |
[176] | Yuan M, Yang N, Qian Z, et al. (2020) What makes an online review more helpful: an interpretation framework using XGBoost and SHAP values. J Theor Appl El Comm 16: 466–490. https://doi.org/10.3390/jtaer16030029 |
[177] | Zahara SS, Ilmiddaviq MB (2020) Consumer price index prediction using Long Short Term Memory (LSTM) based cloud computing. J Phys 1456: 1–8, IOP Publishing. https://doi.org/10.1088/1742-6596/1456/1/012022 |
[178] | Zazo R, Lozano-Diez A, Gonzalez-Dominguez J, et al. (2016) Language identification in short utterances using long short-term memory (LSTM) recurrent neural networks. PLoS One 11: e0146917. https://doi.org/10.1371/journal.pone.0146917 |
[179] | Zhai N, Yao P, Zhou X (2020) Multivariate time series forecast in industrial process based on XGBoost and GRU. In: 2020 IEEE 9th Joint International Information Technology and Artificial Intelligence Conference (ITAIC) 9: 1397–1400. |
[180] | Zhang J, Meng Y, Jin W (2021) A novel hybrid deep learning model for sugar price forecasting based on time series decomposition. Math Probl Eng 2021: 1–9. https://doi.org/10.1155/2021/6507688 |
[181] | Zhang J, Wen J, Yang Z (2022) China's GDP forecasting using Long Short Term Memory Recurrent Neural Network and Hidden Markov Model. PLoS One 17: e0269529. https://doi.org/10.1371/journal.pone.0269529 |
[182] | Zhang X, Yang E (2024) Have housing value indicators changed during COVID? Housing value prediction based on unemployment, construction spending, and housing consumer price index. Int J Hous Mark Anal 17: 242–260. https://doi.org/10.1108/IJHMA-01-2023-0015 |
[183] | Zhou S, Zhou L, Mao M, et al. (2019) An optimized heterogeneous structure LSTM network for electricity price forecasting. IEEE Access 7: 108161–108173. https://doi.org/10.1109/access.2019.2932999 |
[184] | Zhou X, Pranolo A, Mao Y (2023) AB-LSTM: Attention Bidirectional Long Short-Term Memory for Multivariate Time-Series Forecasting. In: 2023 International Conference on Computer, Electronics & Electrical Engineering & their Applications (IC2E3), 1–6. |
[185] | Zhou Z, Song Z, Ren T (2022) Predicting China's CPI by Scanner Big Data. arXiv preprint arXiv:2211.16641. |
[186] | Zhu C, Ma X, Zhang C, et al. (2023) Information granules-based long-term forecasting of time series via BPNN under three-way decision framework. Inf Sci 634: 696–715. https://doi.org/10.1016/j.ins.2023.03.133 |
Iron & steel | Chemical | Machinery | ||||||||||
SUB | SUB | SUB | ||||||||||
Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | ||||
Intercept | 0.838 | 0.005 | 40.267 | *** | 0.877 | 0.036 | 24.594 | *** | 0.700 | 0.077 | 9.101 | *** |
SUB | 0.0000 | 0.0003 | 1.303 | −0.004 | 0.002 | −2.424 | ** | 0.005 | 0.003 | 1.545 | ||
MA(1) | 0.667 | 0.155 | 4.790 | *** | 0.183 | 0.539 | 0.340 | 0.767 | 0.147 | 5.207 | *** | |
Residual variance=0.0000009. AIC=−215.99. log-likelihood=110.99. Ljung-Box test for autocorrelation=12.068 (p=0.674). | Residual variance=0.0000021. AIC=−198.26. log-likelihood=102.13. Ljung-Box test for autocorrelation=12.459 (p=0.644). | Residual variance=0.0000113. AIC=−164.13. log-likelihood=85.06. Ljung-Box test for autocorrelation=13.210 (p=0.586). | ||||||||||
OIL: logged oil price; COAL: logged coal price; GAS: logged gas price. MA(1): First-order moving average *: the alpha level of 0.05; **: the alpha level of 0.01; ***: the alpha level of 0.001 Notes: Estimates computed in R3.0.0. |
Iron & steel | Chemical | Machinery | ||||||||||
SUB | SUB | SUB | ||||||||||
Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | ||||
Intercept | 1.924 | 0.798 | 2.412 | ** | 4.363 | 0.237 | 18.425 | *** | 3.854 | 1.248 | 3.087 | *** |
SUB | 0.046 | 0.031 | 1.507 | −0.043 | 0.009 | −4.764 | ** | −0.039 | 0.048 | −0.814 | ||
MA(1) | 0.852 | 0.238 | 3.580 | *** | 0.361 | 0.198 | 1.823 | * | 0.703 | 0.222 | 3.165 | *** |
Residual variance= 0.0000008.AIC=−77.58.log-likelihood=41.79. Ljung-Box test for autocorrelation=12.7516 (p=0.6215). | Residual variance= 0.00013.AIC=−116.24.log-likelihood=61.12. Ljung-Box test for autocorrelation=14.7107 (p=0.4724). | Residual variance= 0.0022.AIC=−58.93.log-likelihood=32.47. Ljung-Box test for autocorrelation=11.9376 (p=0.6837). | ||||||||||
OIL: logged oil price; COAL: logged coal price; GAS: logged gas price. MA(1): First-order moving average *: the alpha level of 0.05; **: the alpha level of 0.01; ***: the alpha level of 0.001 Notes: Estimates computed in R3.0.0. |
Iron & steel | Chemical | Machinery | ||||||||||
Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | z-ratio | |||
OIL | ||||||||||||
Intercept | 0.831 | 0.005 | 168.171 | *** | 0.780 | 0.009 | 89.230 | *** | 0.776 | 0.017 | 44.689 | *** |
OIL | 0.0007 | 0.0005 | 1.305 | −0.0016 | 0.0009 | −1.774 | * | 0.0042 | 0.0017 | 2.457 | ** | |
MA(1) | 0.662 | 0.147 | 4.504 | *** | 0.603 | 0.203 | 2.964 | ** | 0.637 | 0.135 | 4.719 | *** |
Residual variance= 0.0000008.AIC=−217.92.log-likelihood=111.96. Ljung-Box test for autocorrelation=12.068 (p=0.6351). | Residual variance= 0.000003.AIC=−193.88.log-likelihood=99.94. Ljung-Box test for autocorrelation=9.2392 (p=0.8647). | Residual variance= 0.00001.AIC=−167.03.log-likelihood=86.51. Ljung-Box test for autocorrelation=19.1849 (p=0.2055). | ||||||||||
COAL | ||||||||||||
Intercept | 0.840 | 0.006 | 135.345 | *** | 0.784 | 0.012 | 67.351 | *** | 0.797 | 0.024 | 128.842 | *** |
COAL | −0.0002 | 0.0007 | −0.294 | −0.0023 | 0.0013 | −1.762 | * | 0.0024 | 0.0027 | 0.885 | ||
MA(1) | 0.676 | 0.155 | 4.358 | *** | 0.738 | 0.260 | 2.834 | *** | 0.645 | 0.141 | 4.101 | *** |
Residual variance= 0.0000009.AIC=−216.07.log-likelihood=111.04. Ljung-Box test for autocorrelation=13.780 (p=0.5422). | Residual variance= 0.000003.AIC=−194.14.log-likelihood=100.07. Ljung-Box test for autocorrelation=7.862 (p=0.9292). | Residual variance= 0.00001.AIC=−162.58.log-likelihood=84.29. Ljung-Box test for autocorrelation=14.609 (p=0.4799). | ||||||||||
GAS | ||||||||||||
Intercept | 0.830 | 0.006 | 128.841 | *** | 0.788 | 0.011 | 69.819 | *** | 0.764 | 0.023 | 33.673 | *** |
GAS | 0.0007 | 0.0006 | 1.166 | −0.0023 | 0.0011 | −2.111 | * | 0.0052 | 0.0022 | 2.409 | ** | |
MA(1) | 0.637 | 0.155 | 4.101 | *** | 0.643 | 0.224 | 2.872 | ** | 0.596 | 0.141 | 4.234 | *** |
Residual variance=0.0000008.AIC=−217.38.log-likelihood=111.69. Ljung-Box test for autocorrelation=13.419 (p=0.5699). | Residual variance= 0.000003.AIC=−195.01.log-likelihood=100.51. Ljung-Box test for autocorrelation=9.448 (p=0.853). | Residual variance= 0.00001.AIC=−166.64.log-likelihood=86.32. Ljung-Box test for autocorrelation=20.65 (p=0.1484). | ||||||||||
OIL: logged oil price; COAL: logged coal price; GAS: logged gas price. MA(1): First-order moving average *: the alpha level of 0.05; **: the alpha level of 0.01; ***: the alpha level of 0.001 Notes: Estimates computed in R3.0.0. |
Iron & steel | Chemical | Machinery | ||||||||||
Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | z-ratio | |||
OIL | ||||||||||||
Intercept | 3.242 | 0.173 | 18.718 | *** | 3.530 | 0.044 | 80.855 | *** | 3.662 | 0.206 | 17.769 | *** |
OIL | −0.012 | 0.017 | −0.676 | −0.029 | 0.004 | −6.776 | *** | −0.081 | 0.020 | −4.012 | *** | |
MA(1) | 0.659 | 0.195 | 3.376 | *** | 0.353 | 0.246 | 1.435 | * | 0.726 | 0.185 | 3.925 | *** |
Residual variance= 0.001.AIC=−75.66.log-likelihood=40.83. Ljung-Box test for autocorrelation=13.5178 (p=0.5624). | Residual variance= 0.00009.AIC=−124.27.log-likelihood=65.13. Ljung-Box test for autocorrelation=24.1075 (p=0.06329). | Residual variance= 0.0013.AIC=−69.95.log-likelihood=37.98. Ljung-Box test for autocorrelation=13.4167 (p=0.5701). | ||||||||||
COAL | ||||||||||||
Intercept | 3.068 | 0.212 | 14.494 | *** | 3.474 | 0.089 | 39.118 | *** | 3.712 | 0.246 | 14.122 | *** |
COAL | 0.007 | 0.024 | 0.274 | −0.027 | 0.010 | −2.694 | ** | −0.098 | 0.028 | −0.973 | ||
MA(1) | 0.635 | 0.212 | 2.990 | ** | 1.000 | 1.147 | 0.872 | 1.000 | 0.203 | 4.921 | *** | |
Residual variance= 0.0001.AIC=−75.28.log-likelihood=111.96. Ljung-Box test for autocorrelation=14.2754 (p=0.5048). | Residual variance= 0.00017.AIC=−107.94.log-likelihood=56.97. Ljung-Box test for autocorrelation=24.5917 (p=0.0557). | Residual variance= 0.0.0013.AIC=−67.19.log-likelihood=36.59. Ljung-Box test for autocorrelation=15.6845 (p=0.4033). | ||||||||||
GAS | ||||||||||||
Intercept | 3.209 | 0.233 | 13.748 | *** | 3.596 | 0.069 | 51.954 | *** | 3.946 | 0.288 | 13.684 | *** |
GAS | −0.008 | 0.022 | −0.360 | −0.035 | 0.007 | −5.226 | *** | −0.106 | 0.028 | −3.851 | *** | |
MA(1) | 0.669 | 0.203 | 3.300 | *** | 0.451 | 0.234 | 1.929 | * | 0.876 | 0.180 | 4.868 | *** |
Residual variance= 0.00097.AIC=−75.33.log-likelihood=40.67. Ljung-Box test for autocorrelation=14.0869 (p=0.5189). | Residual variance= 0.00011.AIC=−118.62.log-likelihood=62.3. Ljung-Box test for autocorrelation=24.98842 (p=0.0501). | Residual variance= 0.0012.AIC=−70.16.log-likelihood=38.08. Ljung-Box test for autocorrelation=10.3797 (p=0.7952). | ||||||||||
OIL: logged oil price; COAL: logged coal price; GAS: logged gas price. MA(1): First-order moving average *: the alpha level of 0.05; **: the alpha level of 0.01; ***: the alpha level of 0.001 Notes: Estimates computed in R3.0.0. |
Iron & steel | Chemical | Machinery | ||||||||||
INV | INV | INV | ||||||||||
Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | ||||
Intercept | 0.496 | 2.795 | 0.177 | 8.193 | 1.281 | 6.393 | *** | 11.813 | 2.624 | 4.502 | *** | |
INV | 0.153 | 0.163 | 0.941 | −0.286 | 0.074 | −3.870 | *** | −0.482 | 0.141 | −3.421 | *** | |
MA(1) | 0.651 | 0.209 | 3.124 | *** | 0.765 | 0.313 | 2.447 | ** | 0.707 | 0.180 | 3.923 | *** |
Residual variance = 0.00094; AIC = −76.07; log-likelihood = 41.04; Ljung-Box test for autocorrelation = 14.5235 (p = 0.4863). | Residual variance = 0.00012; AIC = −116.8; log-likelihood = 61.4; Ljung-Box test for autocorrelation = 20.7272 (p = 0.1458). | Residual variance = 0.00146; AIC = −67.15; log-likelihood = 36.57; Ljung-Box test for autocorrelation = 15.5819 (p = 0.4104). | ||||||||||
INV: logged investment (installed base) amount by sector. MA(1): first-order moving average. *: significant at the 0.05 alpha level; **: 0.01; ***: 0.001. Notes: estimates computed in R 3.0.0. |
Iron & steel | Chemical | Machinery | ||||||||||
INV | INV | INV | ||||||||||
Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | Estimate | Standard error | z-ratio | ||||
Intercept | 0.9368 | 0.0822 | 11.401169 | *** | 1.2323 | 0.1168 | 10.548472 | *** | 0.4952 | 0.2315 | 7.3278558 | *** |
INV | −0.0057 | 0.0048 | −1.206176 | −0.027 | 0.0067 | −4.002377 | *** | 0.0174 | 0.0124 | −0.638437 | ||
MA(1) | 0.6475 | 0.1781 | 3.634986 | *** | 0.4418 | 0.31 | 1.425193 | * | 0.6775 | 0.134 | 5.0488047 | *** |
Residual variance = 0.0000008; AIC = −217.38; log-likelihood = 111.69; Ljung-Box test for autocorrelation = 10.5245 (p = 0.7855). | Residual variance = 0.0000018; AIC = −201.31; log-likelihood = 103.66; Ljung-Box test for autocorrelation = 10.2885 (p = 0.8012). | Residual variance = 0.000013; AIC = −162.58; log-likelihood = 84.29; Ljung-Box test for autocorrelation = 19.3235 (p = 0.1994). | ||||||||||
INV: logged investment (installed base) amount by sector
MA(1): first-order moving average. *: significant at the 0.05 alpha level; **: 0.01; ***: 0.001. Notes: estimates computed in R 3.0.0. |
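The z-ratios and significance stars reported throughout these regression tables follow the usual convention: each coefficient estimate is divided by its standard error, and the resulting z-ratio is compared against two-sided normal critical values for the 0.05, 0.01, and 0.001 alpha levels. A minimal illustrative sketch (not the authors' R code) that reproduces this mapping:

```python
def z_ratio(estimate, std_error):
    """z-ratio as reported in the tables: estimate divided by its standard error."""
    return estimate / std_error

def stars(z):
    """Map a z-ratio to the significance stars used in the table footnotes."""
    az = abs(z)
    if az > 3.291:   # two-sided critical value for alpha = 0.001
        return "***"
    if az > 2.576:   # alpha = 0.01
        return "**"
    if az > 1.960:   # alpha = 0.05
        return "*"
    return ""        # not significant

# Chemical-sector INV coefficient from the carbon-intensity table above:
z = z_ratio(-0.286, 0.074)   # ~ -3.86; the table prints -3.870 from unrounded inputs
print(round(z, 2), stars(z))
```

Small discrepancies between recomputed and printed z-ratios (e.g. −3.86 vs. −3.870) arise because the tables round the estimates and standard errors before display.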
Subsidies | Fuel price | Investment | ||||
Oil price | Coal price | Gas price | ||||
Carbon intensity | Iron & steel | n.s | n.s | n.s | n.s | n.s |
Chemical | * | * | * | * | * | |
Machinery | n.s | - | n.s | - | n.s | |
Energy intensity | Iron & steel | n.s | n.s | n.s | n.s | n.s |
Chemical | * | * | * | * | * | |
Machinery | n.s | * | n.s | * | * | |
* = significant, improving carbon/energy intensity (the variable has a negative impact on the intensity); − = significant, worsening carbon/energy intensity (the variable has a positive impact on the intensity); n.s = not significant |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 110.54 | −215.08 | 13.389 | 0.5723 |
ARMA(0, 1) | 111.8 | −217.59 | 12.068 | 0.6739 |
ARMA(1, 1) | 112.08 | −216.17 | 10.136 | 0.8111 |
ARMA(1, 0): standard error of the AR1 coefficient is NaN; ARMA(1, 1): standard error of the AR1 coefficient is NaN |
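The AIC values in these comparison tables follow directly from the reported log-likelihoods via AIC = −2·log-likelihood + 2k. The parameter count appears to be k = p + q + 2 (AR and MA terms plus the mean and innovation variance) — an assumption inferred from the printed numbers, e.g. −2(111.80) + 6 = −217.60 ≈ −217.59 for ARMA(0, 1). A minimal sketch of the lowest-AIC selection applied in these tables:

```python
def aic(loglik, p, q):
    # k = p + q + 2 reproduces the tabled AICs (assumption inferred from the tables)
    return -2.0 * loglik + 2 * (p + q + 2)

# Log-likelihoods from the first ARMA comparison table above
candidates = {
    "ARMA(1, 0)": (110.54, 1, 0),
    "ARMA(0, 1)": (111.80, 0, 1),
    "ARMA(1, 1)": (112.08, 1, 1),
}
best = min(candidates, key=lambda name: aic(*candidates[name]))
print(best)   # prefers ARMA(0, 1), matching the table's lowest AIC
```

Note that ARMA(1, 1) always has the highest log-likelihood but pays a 2-point penalty for its extra parameter, which is why ARMA(0, 1) is repeatedly selected.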
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 112.15 | −218.3 | 9.2269 | 0.8654 |
ARMA(0, 1) | 111.96 | −217.92 | 12.575 | 0.6351 |
ARMA(1, 1) | 113.04 | −218.08 | 7.2962 | 0.9489 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 110.53 | −215.06 | 11.942 | 0.6834 |
ARMA(0, 1) | 111.04 | −216.07 | 13.781 | 0.5422 |
ARMA(1, 1) | 111.59 | −215.19 | 11.641 | 0.7059 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 111.42 | −216.84 | 11.662 | 0.7044 |
ARMA(0, 1) | 111.69 | −217.38 | 13.42 | 0.5699 |
ARMA(1, 1) | 112.2 | −216.41 | 10.399 | 0.7939 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 111.2 | −216.39 | 9.2154 | 0.866 |
ARMA(0, 1) | 111.69 | −217.38 | 10.524 | 0.7855 |
ARMA(1, 1) | 111.96 | −215.91 | 9.9323 | 0.824 |
ARMA(1, 0): standard error of the AR1 coefficient is NaN; ARMA(1, 1): standard error of the AR1 coefficient is NaN |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 40.23 | −74.46 | 18.456 | 0.2395 |
ARMA(0, 1) | 41.79 | −77.58 | 12.752 | 0.6215 |
ARMA(1, 1) | 41.95 | −75.89 | 12.568 | 0.6356 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 39.81 | −73.63 | 16.077 | 0.377 |
ARMA(0, 1) | 40.83 | −75.66 | 13.518 | 0.5624 |
ARMA(1, 1) | 40.91 | −73.82 | 13.556 | 0.5594 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 39.68 | −73.36 | 16.64 | 0.3408 |
ARMA(0, 1) | 40.64 | −75.28 | 14.275 | 0.5048 |
ARMA(1, 1) | 40.77 | −73.54 | 14.004 | 0.5252 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 39.57 | −73.15 | 17.064 | 0.315 |
ARMA(0, 1) | 40.67 | −75.33 | 14.087 | 0.5189 |
ARMA(1, 1) | 40.77 | −73.55 | 14.044 | 0.5222 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 39.85 | −73.69 | 17.475 | 0.2913 |
ARMA(0, 1) | 41.04 | −76.07 | 14.524 | 0.4863 |
ARMA(1, 1) | 41.41 | −74.82 | 12.78 | 0.6193 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 102.26 | −198.52 | 11.361 | 0.7266 |
ARMA(0, 1) | 102.13 | −198.26 | 12.459 | 0.644 |
ARMA(1, 1) | 102.26 | −196.53 | 11.422 | 0.7221 |
ARMA(1, 0): standard error of the AR1 coefficient is NaN |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 100.27 | −194.53 | 9.5491 | 0.8471 |
ARMA(0, 1) | 99.94 | −193.88 | 9.2392 | 0.8647 |
ARMA(1, 1) | 100.29 | −192.58 | 9.7751 | 0.8336 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 99.92 | −193.84 | 13.701 | 0.5483 |
ARMA(0, 1) | 100.07 | −194.14 | 14.609 | 0.4799 |
ARMA(1, 1) | 100.3 | −192.61 | 8.286 | 0.9118 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 99.73 | −193.46 | 6.8141 | 0.9626 |
ARMA(0, 1) | 100.51 | −195.01 | 9.4475 | 0.853 |
ARMA(1, 1) | 100.52 | −193.05 | 9.122 | 0.8711 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 103.18 | −200.36 | 13.871 | 0.5354 |
ARMA(0, 1) | 103.66 | −201.31 | 10.524 | 0.7855 |
ARMA(1, 1) | 103.67 | −199.34 | 9.9323 | 0.824 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 60.85 | −115.69 | 16.477 | 0.3511 |
ARMA(0, 1) | 61.12 | −116.24 | 14.711 | 0.4724 |
ARMA(1, 1) | 61.12 | −114.25 | 15.09 | 0.445 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 65.15 | −124.29 | 17.383 | 0.2965 |
ARMA(0, 1) | 65.13 | −124.27 | 13.417 | 0.5701 |
ARMA(1, 1) | 65.3 | −122.59 | 14.828 | 0.4639 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 58.56 | −111.13 | 25.8 | 0.04018 |
ARMA(0, 1) | 56.97 | −107.94 | 15.684 | 0.4033 |
ARMA(1, 1) | 58.38 | −108.75 | 18.066 | 0.2592 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 62.32 | −118.64 | 15.964 | 0.3845 |
ARMA(0, 1) | 62.31 | −118.62 | 10.38 | 0.7952 |
ARMA(1, 1) | 62.32 | −116.64 | 11.347 | 0.7276 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 61.26 | −116.53 | 20.173 | 0.1654 |
ARMA(0, 1) | 61.4 | −116.8 | 15.582 | 0.4104 |
ARMA(1, 1) | 61.72 | −115.45 | 16.162 | 0.3714 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 83.11 | −160.22 | 11.073 | 0.7474 |
ARMA(0, 1) | 85.06 | −164.13 | 14.44 | 0.4925 |
ARMA(1, 1) | 85.65 | −163.29 | 10.155 | 0.8099 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 85.61 | −165.22 | 25.884 | 0.03926 |
ARMA(0, 1) | 86.51 | −167.03 | 24.108 | 0.06329 |
ARMA(1, 1) | 87.5 | −167 | 26.07 | 0.03729 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 83.26 | −160.51 | 32.08 | 0.006279 |
ARMA(0, 1) | 84.29 | −162.58 | 24.592 | 0.0557 |
ARMA(1, 1) | 85.32 | −162.65 | 33.455 | 0.004059 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 85.65 | −165.31 | 30.216 | 0.01117 |
ARMA(0, 1) | 86.32 | −166.64 | 24.988 | 0.0501 |
ARMA(1, 1) | 87.09 | −166.18 | 25.761 | 0.04061 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 83.48 | −160.95 | 28.167 | 0.02054 |
ARMA(0, 1) | 84.84 | −163.69 | 20.727 | 0.1458 |
ARMA(1, 1) | 85.74 | −163.49 | 24.003 | 0.06505 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 31.55 | −57.1 | 16.787 | 0.3317 |
ARMA(0, 1) | 32.47 | −58.93 | 11.938 | 0.6837 |
ARMA(1, 1) | 32.48 | −56.95 | 12.061 | 0.6744 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 36.47 | −66.95 | 20.474 | 0.1545 |
ARMA(0, 1) | 37.98 | −69.95 | 19.185 | 0.2055 |
ARMA(1, 1) | 38.08 | −68.16 | 7.2962 | 0.9489 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 34.09 | −62.19 | 13.701 | 0.5483 |
ARMA(0, 1) | 36.59 | −67.19 | 14.609 | 0.4799 |
ARMA(1, 1) | 36.99 | −65.98 | 8.286 | 0.9118 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 35.77 | −65.55 | 21.538 | 0.1205 |
ARMA(0, 1) | 38.08 | −70.16 | 20.65 | 0.1484 |
ARMA(1, 1) | 38.11 | −68.22 | 16.339 | 0.3599 |
ARMA | Log likelihood | AIC | X squared | p-Value |
ARMA(1, 0) | 35.14 | −64.27 | 17.433 | 0.2936 |
ARMA(0, 1) | 36.57 | −67.15 | 19.324 | 0.1994 |
ARMA(1, 1) | 36.67 | −65.33 | 12.649 | 0.6294 |
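The "X squared" entries throughout these tables are Ljung-Box Q statistics for residual autocorrelation, Q = n(n + 2) · Σ ρ̂ₖ² / (n − k) summed over lags k = 1…h, referred to a chi-squared distribution (a non-significant p-value indicates white-noise residuals). A minimal pure-Python sketch of the statistic itself; the chi-squared p-value step is omitted, and the toy series is illustrative:

```python
def ljung_box_q(x, lags):
    """Ljung-Box Q statistic: n(n+2) * sum over k of acf(k)^2 / (n - k)."""
    n = len(x)
    mean = sum(x) / n
    dev = [v - mean for v in x]
    c0 = sum(d * d for d in dev) / n                      # lag-0 autocovariance
    q = 0.0
    for k in range(1, lags + 1):
        ck = sum(dev[t] * dev[t - k] for t in range(k, n)) / n
        rho = ck / c0                                      # lag-k autocorrelation
        q += rho * rho / (n - k)
    return n * (n + 2) * q

# Tiny hand-checkable example: for x = [1, 2, 3, 4], rho_1 = 0.25 and Q(1) = 0.5
print(ljung_box_q([1, 2, 3, 4], lags=1))
```

In the tables, large Q values with p < 0.05 (e.g. the ARMA(1, 0) rows with p = 0.04018 or p = 0.006279) signal remaining residual autocorrelation, which is another reason the ARMA(0, 1) specification is generally preferred.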