Throughout this article, Cm×n denotes the collection of all m×n matrices over the field of complex numbers, A∗ denotes the conjugate transpose of A∈Cm×n, r(A) denotes the rank of A∈Cm×n, R(A) denotes the range of A∈Cm×n, Im denotes the identity matrix of order m, and [A,B] denotes a row block matrix consisting of A∈Cm×n and B∈Cm×p. The Moore-Penrose generalized inverse of A∈Cm×n, denoted by A†, is the unique matrix X∈Cn×m satisfying the four Penrose equations
(1) AXA=A, (2) XAX=X, (3) (AX)∗=AX, (4) (XA)∗=XA. |
Further, we denote by
PA=AA†, EA=Im−AA† | (1.1) |
the two orthogonal projectors induced from A∈Cm×n. For more detailed information regarding generalized inverses of matrices, we refer the reader to [2,3,4].
Recall that the well-known Kronecker product of any two matrices A=(aij)∈Cm×n and B=(bij)∈Cp×q is defined to be
A⊗B=(aijB)=[a11Ba12B⋯a1nBa21Ba22B⋯a2nB⋮⋮⋱⋮am1Bam2B⋯amnB]∈Cmp×nq. |
The Kronecker product, named after the German mathematician Leopold Kronecker, is a special matrix operation that has long been regarded as an important mathematical technique. This product has wide applications in system theory, matrix calculus, matrix equations, system identification and more (cf. [1,5,6,7,8,9,10,11,12,13,14,16,18,19,20,21,23,24,25,27,28,33,34]). It is well known that matrix operations based on Kronecker products possess a rich variety of elegant structures and properties, and thus they have many significant applications in both theoretical and applied mathematics. In fact, mathematicians have established a variety of useful formulas and facts related to these products and used them to deal with various concrete scientific computing problems. In particular, the basic facts on Kronecker products of matrices in the following fact are well established (cf. [15,17,21,32,34]).
Fact 1.1. Let A∈Cm×n, B∈Cp×q, C∈Cn×s, and D∈Cq×t. Then, the following equalities hold:
(A⊗B)(C⊗D)=(AC)⊗(BD), | (1.2) |
(A⊗B)∗=A∗⊗B∗, (A⊗B)†=A†⊗B†, | (1.3) |
PA⊗B=PA⊗PB, r(A⊗B)=r(A)r(B). | (1.4) |
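As a numerical illustration (added here for the reader's convenience, not part of the formal development), the equalities in Fact 1.1 can be checked with NumPy on randomly generated real matrices; the shapes below are arbitrary choices consistent with (1.2)–(1.4):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))   # A ∈ C^{m×n} (real case for simplicity)
B = rng.standard_normal((2, 5))   # B ∈ C^{p×q}
C = rng.standard_normal((4, 3))   # C ∈ C^{n×s}
D = rng.standard_normal((5, 2))   # D ∈ C^{q×t}

# (1.2): mixed-product property (A⊗B)(C⊗D) = (AC)⊗(BD)
assert np.allclose(np.kron(A, B) @ np.kron(C, D), np.kron(A @ C, B @ D))

# (1.3): conjugate transpose and Moore-Penrose inverse distribute over ⊗
assert np.allclose(np.kron(A, B).T, np.kron(A.T, B.T))
assert np.allclose(np.linalg.pinv(np.kron(A, B)),
                   np.kron(np.linalg.pinv(A), np.linalg.pinv(B)))

# (1.4): the rank is multiplicative, r(A⊗B) = r(A) r(B)
assert np.linalg.matrix_rank(np.kron(A, B)) == \
       np.linalg.matrix_rank(A) * np.linalg.matrix_rank(B)
```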
In addition, the Kronecker product of matrices has a rich variety of algebraic properties. For example, one of its most important features is that the product A1⊗A2 can be factorized into certain ordinary products of matrices:
A1⊗A2=(A1⊗Im2)(In1⊗A2)=(Im1⊗A2)(A1⊗In2) | (1.5) |
for any A1∈Cm1×n1 and A2∈Cm2×n2, and the triple Kronecker product A1⊗A2⊗A3 can be written as
A1⊗A2⊗A3=(A1⊗Im2⊗Im3)(In1⊗A2⊗Im3)(In1⊗In2⊗A3), | (1.6) |
A1⊗A2⊗A3=(Im1⊗A2⊗Im3)(Im1⊗In2⊗A3)(A1⊗In2⊗In3), | (1.7)
A1⊗A2⊗A3=(Im1⊗Im2⊗A3)(A1⊗Im2⊗In3)(In1⊗A2⊗In3) | (1.8)
for any A1∈Cm1×n1, A2∈Cm2×n2 and A3∈Cm3×n3, where the matrices in the parentheses on the right hand sides of (1.5)–(1.8) are usually called the dilation expressions of the given matrices A1, A2 and A3, and the four equalities in (1.5)–(1.8) are called the dilation factorizations of the Kronecker products A1⊗A2 and A1⊗A2⊗A3, respectively. A common feature of the four matrix equalities in (1.5)–(1.8) is that they factorize the Kronecker products of any two or three matrices into certain ordinary products of the dilation expressions of A1, A2 and A3. In particular, a noticeable fact we point out is that the dilation expressions of matrices in (1.5)–(1.8) commute with each other by the well-known mixed-product property in (1.2) when A1, A2 and A3 are all square matrices. The dilation factorizations also extend naturally to Kronecker products of more than three matrices. Although the dilation factorizations in (1.5)–(1.8) seem technically trivial in form, they help us deal with theoretical and computational issues regarding Kronecker products of two or three matrices through the ordinary addition and multiplication operations of matrices.
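The dilation factorizations (1.5) and (1.6) are straightforward to verify numerically. The following NumPy sketch (an added illustration with arbitrary rectangular sizes) confirms both, paying attention to the m- versus n-sized identity factors:

```python
import numpy as np

rng = np.random.default_rng(1)
m1, n1, m2, n2, m3, n3 = 2, 3, 3, 4, 2, 2   # arbitrary illustrative sizes
A1 = rng.standard_normal((m1, n1))
A2 = rng.standard_normal((m2, n2))
A3 = rng.standard_normal((m3, n3))
I = np.eye

# (1.5): A1⊗A2 = (A1⊗I_{m2})(I_{n1}⊗A2) = (I_{m1}⊗A2)(A1⊗I_{n2})
lhs2 = np.kron(A1, A2)
assert np.allclose(lhs2, np.kron(A1, I(m2)) @ np.kron(I(n1), A2))
assert np.allclose(lhs2, np.kron(I(m1), A2) @ np.kron(A1, I(n2)))

# (1.6): A1⊗A2⊗A3 = (A1⊗I_{m2}⊗I_{m3})(I_{n1}⊗A2⊗I_{m3})(I_{n1}⊗I_{n2}⊗A3)
kron3 = lambda X, Y, Z: np.kron(np.kron(X, Y), Z)
lhs3 = kron3(A1, A2, A3)
assert np.allclose(lhs3, kron3(A1, I(m2), I(m3))
                         @ kron3(I(n1), A2, I(m3))
                         @ kron3(I(n1), I(n2), A3))
```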
In this article, we provide a new analysis of the properties of Kronecker products of matrices and, through the dilation factorizations described in (1.5)–(1.8), present a wide range of novel and explicit facts and results, including a number of analytical formulas for calculating the ranks, dimensions, orthogonal projectors, and ranges of the dilation expressions of matrices and their algebraic operations, with the purpose of obtaining a deeper understanding and grasp of Kronecker products of matrices.
The remainder of the article is organized as follows. In section two, we introduce some preliminary facts and results concerning ranks, ranges, and generalized inverses of matrices. In section three, we propose and prove a collection of useful equalities, inequalities, and formulas for calculating the ranks, dimensions, orthogonal projectors, and ranges associated with the Kronecker products A1⊗A2 and A1⊗A2⊗A3 through the dilation expressions of A1, A2 and A3 and their operations. Conclusions and remarks are given in section four.
One of the remarkable applications of generalized inverses of matrices is to establish various exact and analytical expansion formulas for calculating the ranks of partitioned matrices. As convenient and versatile tools, these matrix rank formulas can be used to deal with a wide variety of theoretical and computational issues in matrix theory and its applications (cf. [22]). In this section, we present a mixture of commonly used formulas and facts related to ranks of matrices and their consequences for the commutativity of two orthogonal projectors, which we shall use as analytical tools to establish miscellaneous formulas related to Kronecker products of matrices.
Lemma 2.1. [22,29] Let A∈Cm×n and B∈Cm×k. Then, the following rank equalities
r[A,B]=r(A)+r(EAB)=r(B)+r(EBA), | (2.1) |
r[A,B]=r(A)+r(B)−2r(A∗B)+r[PAPB,PBPA], | (2.2) |
r[A,B]=r(A)+r(B)−r(A∗B)+(1/2)r(PAPB−PBPA), | (2.3)
r[A,B]=r(A)+r(B)−r(A∗B)+r(P[A,B]−PA−PB+PAPB) | (2.4) |
hold. Therefore,
PAPB=PBPA⇔P[A,B]=PA+PB−PAPB⇔r(EAB)=r(B)−r(A∗B)⇔r[A,B]=r(A)+r(B)−r(A∗B)⇔R(PAPB)=R(PBPA). | (2.5) |
If PAPB=PBPA, PAPC=PCPA and PBPC=PCPB, then
P[A,B,C]=PA+PB+PC−PAPB−PAPC−PBPC+PAPBPC. | (2.6) |
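The rank equality (2.1) can be illustrated numerically. The sketch below (an added example; the rank-deficient test matrices are arbitrary) builds the projectors P_X = XX† and E_X = I − XX† directly from the Moore-Penrose inverse:

```python
import numpy as np

rng = np.random.default_rng(2)
rank = np.linalg.matrix_rank
P = lambda X: X @ np.linalg.pinv(X)            # orthogonal projector P_X = X X†
E = lambda X: np.eye(X.shape[0]) - P(X)        # E_X = I − X X†

# rank-deficient A and B with columns in the same ambient space C^5
A = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 3))   # rank 2
B = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))   # rank 2

# (2.1): r[A,B] = r(A) + r(E_A B) = r(B) + r(E_B A)
rAB = rank(np.hstack([A, B]))
assert rAB == rank(A) + rank(E(A) @ B)
assert rAB == rank(B) + rank(E(B) @ A)
```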
Lemma 2.2. [30] Let A,B and C∈Cm×m be three idempotent matrices. Then, the following rank equality
r[A,B,C]=r(A)+r(B)+r(C)−r[AB,AC]−r[BA,BC]−r[CA,CB] +r[AB,AC,BA,BC,CA,CB] | (2.7) |
holds. As a special instance, if AB=BA, AC=CA and BC=CB, then
r[A,B,C]=r(A)+r(B)+r(C)−r[AB,AC]−r[BA,BC]−r[CA,CB]+r[AB,AC,BC]. | (2.8) |
The formulas and facts in the above two lemmas belong to ordinary linear algebra. Thus, they can easily be understood and utilized to establish and simplify matrix expressions and equalities involving matrices and their generalized inverses.
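Lemma 2.2 holds for arbitrary (not necessarily Hermitian) idempotent matrices. As an added numerical check, the sketch below manufactures oblique projectors of prescribed rank via X(YX)⁻¹Y and verifies (2.7); the sizes are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
rank = np.linalg.matrix_rank
h = np.hstack

# for generic X (m×r) and Y (r×m), X (Y X)^{-1} Y is an idempotent of rank r
idem = lambda X, Y: X @ np.linalg.solve(Y @ X, Y)
A = idem(rng.standard_normal((5, 2)), rng.standard_normal((2, 5)))
B = idem(rng.standard_normal((5, 2)), rng.standard_normal((2, 5)))
C = idem(rng.standard_normal((5, 1)), rng.standard_normal((1, 5)))
assert np.allclose(A @ A, A) and np.allclose(B @ B, B) and np.allclose(C @ C, C)

# (2.7) for three idempotent matrices
lhs = rank(h([A, B, C]))
rhs = (rank(A) + rank(B) + rank(C)
       - rank(h([A @ B, A @ C])) - rank(h([B @ A, B @ C])) - rank(h([C @ A, C @ B]))
       + rank(h([A @ B, A @ C, B @ A, B @ C, C @ A, C @ B])))
assert lhs == rhs
```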
We first establish a group of formulas and facts associated with the orthogonal projectors, ranks, dimensions, and ranges of the matrix product in (1.5).
Theorem 3.1. Let A1∈Cm1×n1 and A2∈Cm2×n2, and denote by
M1=A1⊗Im2, M2=Im1⊗A2 |
the two dilation expressions of A1 and A2, respectively. Then, we have the following results.
(a) The following orthogonal projector equalities hold:
PA1⊗A2=PM1PM2=PM2PM1=PA1⊗PA2. | (3.1) |
(b) The following four rank equalities hold:
r[A1⊗Im2,Im1⊗A2]=m1m2−(m1−r(A1))(m2−r(A2)), | (3.2) |
r[A1⊗Im2,Im1⊗EA2]=m1m2−(m1−r(A1))r(A2), | (3.3) |
r[EA1⊗Im2,Im1⊗A2]=m1m2−r(A1)(m2−r(A2)), | (3.4) |
r[EA1⊗Im2,Im1⊗EA2]=m1m2−r(A1)r(A2), | (3.5) |
and the following five dimension equalities hold:
dim(R(M1)∩R(M2))=r(M1M2)=r(A1)r(A2), | (3.6) |
dim(R(M1)∩R⊥(M2))=r(M1EM2)=r(A1)(m2−r(A2)), | (3.7) |
dim(R⊥(M1)∩R(M2))=r(EM1M2)=(m1−r(A1))r(A2), | (3.8) |
dim(R⊥(M1)∩R⊥(M2))=r(EM1EM2)=(m1−r(A1))(m2−r(A2)), | (3.9) |
dim(R(M1)∩R(M2))+dim(R(M1)∩R⊥(M2))+dim(R⊥(M1)∩R(M2)) +dim(R⊥(M1)∩R⊥(M2))=m1m2. | (3.10)
(c) The following range equalities hold:
R(M1)∩R(M2)=R(M1M2)=R(M2M1)=R(A1⊗A2), | (3.11) |
R(M1)∩R⊥(M2)=R(M1EM2)=R(EM2M1)=R(A1⊗EA2), | (3.12) |
R⊥(M1)∩R(M2)=R(EM1M2)=R(M2EM1)=R(EA1⊗A2), | (3.13) |
R⊥(M1)∩R⊥(M2)=R(EM1EM2)=R(EM2EM1)=R(EA1⊗EA2), | (3.14) |
(R(M1)∩R(M2))⊕(R⊥(M1)∩R(M2))⊕(R(M1)∩R⊥(M2))⊕(R⊥(M1)∩R⊥(M2))=Cm1m2. | (3.15) |
(d) The following orthogonal projector equalities hold:
PR(M1)∩R(M2)=PM1PM2=PA1⊗PA2, | (3.16) |
PR(M1)∩R⊥(M2)=PM1EM2=PA1⊗EA2, | (3.17) |
PR⊥(M1)∩R(M2)=EM1PM2=EA1⊗PA2, | (3.18) |
PR⊥(M1)∩R⊥(M2)=EM1EM2=EA1⊗EA2, | (3.19) |
PR(M1)∩R(M2)+PR⊥(M1)∩R(M2)+PR(M1)∩R⊥(M2)+PR⊥(M1)∩R⊥(M2)=Im1m2. | (3.20) |
(e) The following orthogonal projector equalities hold:
P[A1⊗Im2,Im1⊗A2]=PA1⊗Im2+Im1⊗PA2−PA1⊗PA2=Im1m2−EA1⊗EA2, | (3.21) |
P[A1⊗Im2,Im1⊗EA2]=PA1⊗Im2+Im1⊗EA2−PA1⊗EA2=Im1m2−EA1⊗PA2, | (3.22) |
P[EA1⊗Im2,Im1⊗A2]=EA1⊗Im2+Im1⊗PA2−EA1⊗PA2=Im1m2−PA1⊗EA2, | (3.23) |
P[EA1⊗Im2,Im1⊗EA2]=EA1⊗Im2+Im1⊗EA2−EA1⊗EA2=Im1m2−PA1⊗PA2. | (3.24) |
Proof. It can be seen from (1.2) and (1.4) that
PM1PM2=(A1⊗Im2)(A1⊗Im2)†(Im1⊗A2)(Im1⊗A2)†=(A1⊗Im2)(A†1⊗Im2)(Im1⊗A2)(Im1⊗A†2)=(PA1⊗Im2)(Im1⊗PA2)=PA1⊗PA2,PM2PM1=(Im1⊗A2)(Im1⊗A2)†(A1⊗Im2)(A1⊗Im2)†=(Im1⊗A2)(Im1⊗A†2)(A1⊗Im2)(A†1⊗Im2)=(Im1⊗PA2)(PA1⊗Im2)=PA1⊗PA2, |
thus establishing (3.1).
Applying (2.1) to [A1⊗Im2,Im1⊗A2] and then simplifying by (1.2)–(1.4) yields
r[A1⊗Im2,Im1⊗A2]=r(A1⊗Im2)+r((Im1m2−(A1⊗Im2)(A1⊗Im2)†)(Im1⊗A2))=r(A1⊗Im2)+r((Im1m2−(A1⊗Im2)(A†1⊗Im2))(Im1⊗A2))=r(A1⊗Im2)+r((Im1m2−(A1A†1)⊗Im2)(Im1⊗A2))=m2r(A1)+r((Im1−A1A†1)⊗Im2)(Im1⊗A2))=m2r(A1)+r((Im1−A1A†1)⊗A2))=m2r(A1)+r(Im1−A1A†1)r(A2)=m2r(A1)+(m1−r(A1))r(A2)=m1m2−(m1−r(A1))(m2−r(A2)), |
as required for (3.2). In addition, (3.2) can be directly established by applying (2.5) to the left hand side of (3.2). Equations (3.3)–(3.5) can be obtained by a similar approach. Subsequently by (3.2),
dim(R(M1)∩R(M2))=r(M1)+r(M2)−r[M1,M2]=r(A1)r(A2), |
as required for (3.6). Equations (3.7)–(3.9) can be established by a similar approach. Adding (3.7)–(3.9) leads to (3.10).
The first two equalities in (3.11) follow from (3.6), and the last two range equalities follow from (3.1).
Equations (3.12)–(3.14) can be established by a similar approach. Adding (3.11)–(3.14) and combining with (3.10) leads to (3.15).
Equations (3.16)–(3.19) follow from (3.11)–(3.14). Adding (3.16)–(3.19) leads to (3.20).
Under (3.1), we find from (2.5) that
P[M1,M2]=PM1+PM2−PM1PM2=PA1⊗Im2+Im1⊗PA2−PA1⊗PA2=Im1m2−EA1⊗EA2, |
as required for (3.21). Equations (3.22)–(3.24) can be established by a similar approach.
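Several of the equalities in Theorem 3.1 can be confirmed numerically. The added sketch below checks (3.1), (3.2), and (3.21) on random rank-deficient matrices (the sizes and ranks are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(4)
rank = np.linalg.matrix_rank
P = lambda X: X @ np.linalg.pinv(X)

m1, n1, m2, n2 = 4, 3, 5, 2
A1 = rng.standard_normal((m1, 2)) @ rng.standard_normal((2, n1))   # rank 2
A2 = rng.standard_normal((m2, 1)) @ rng.standard_normal((1, n2))   # rank 1

M1 = np.kron(A1, np.eye(m2))      # A1 ⊗ I_{m2}
M2 = np.kron(np.eye(m1), A2)      # I_{m1} ⊗ A2

# (3.1): P_{M1} and P_{M2} commute, and their product is P_{A1} ⊗ P_{A2}
assert np.allclose(P(M1) @ P(M2), P(M2) @ P(M1))
assert np.allclose(P(M1) @ P(M2), np.kron(P(A1), P(A2)))

# (3.2): r[M1, M2] = m1 m2 − (m1 − r(A1))(m2 − r(A2))
r1, r2 = rank(A1), rank(A2)
assert rank(np.hstack([M1, M2])) == m1 * m2 - (m1 - r1) * (m2 - r2)

# (3.21): P_{[M1,M2]} = I_{m1 m2} − E_{A1} ⊗ E_{A2}
E1 = np.eye(m1) - P(A1)
E2 = np.eye(m2) - P(A2)
assert np.allclose(P(np.hstack([M1, M2])), np.eye(m1 * m2) - np.kron(E1, E2))
```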
Equation (3.2) was first shown in [7]; see also [27] for some extended forms of (3.2). Obviously, Theorem 3.1 reveals many properties of Kronecker products of matrices, and there is no doubt that these results can be used as analytical tools to deal with various matrix equalities composed of algebraic operations of Kronecker products of matrices. For example, applying the preceding results to the Kronecker sum and difference A1⊗Im2±Im1⊗A2 for two idempotent matrices A1 and A2, we obtain the following interesting consequences.
Theorem 3.2. Let A1∈Cm1×m1 and A2∈Cm2×m2. Then, the following rank inequality
r(A1⊗Im2+Im1⊗A2)≥m1r(A2)+m2r(A1)−2r(A1)r(A2) | (3.25) |
holds. If A1=A1² and A2=A2², then the following two rank equalities hold:
r(A1⊗Im2+Im1⊗A2)=m1r(A2)+m2r(A1)−r(A1)r(A2), | (3.26) |
r(A1⊗Im2−Im1⊗A2)=m1r(A2)+m2r(A1)−2r(A1)r(A2). | (3.27) |
Proof. Equation (3.25) follows from applying (2.1) and the following well-known rank inequality (cf. [22])

r(A+B)≥r[A;B]+r[A,B]−r(A)−r(B), |

where [A;B] denotes the column block matrix composed of A and B, to A1⊗Im2+Im1⊗A2. Specifically, if A1=A1² and A2=A2², then it is easy to verify that (A1⊗Im2)²=A1²⊗Im2=A1⊗Im2 and (Im1⊗A2)²=Im1⊗A2²=Im1⊗A2. In this case, applying the following two known rank formulas
r(A+B)=r[A,B;B,0]−r(B)=r[B,A;A,0]−r(A), r(A−B)=r[A;B]+r[A,B]−r(A)−r(B), |
where A and B are two idempotent matrices of the same size (cf. [29,31]), to A1⊗Im2±Im1⊗A2 and then simplifying by (2.1) and (3.2) yields (3.26) and (3.27), respectively.
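The rank equalities (3.26) and (3.27) can be checked numerically. In the added sketch below, orthogonal projectors serve as convenient idempotent test matrices (the sizes and ranks are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
rank = np.linalg.matrix_rank
P = lambda X: X @ np.linalg.pinv(X)

# orthogonal projectors are idempotent, so they satisfy A = A²
A1 = P(rng.standard_normal((4, 2)))   # idempotent of rank 2, m1 = 4
A2 = P(rng.standard_normal((3, 1)))   # idempotent of rank 1, m2 = 3

S = np.kron(A1, np.eye(3)) + np.kron(np.eye(4), A2)
D = np.kron(A1, np.eye(3)) - np.kron(np.eye(4), A2)

# (3.26): m1 r(A2) + m2 r(A1) − r(A1) r(A2) = 4·1 + 3·2 − 2·1 = 8
assert rank(S) == 4 * 1 + 3 * 2 - 2 * 1

# (3.27): m1 r(A2) + m2 r(A1) − 2 r(A1) r(A2) = 4·1 + 3·2 − 2·2·1 = 6
assert rank(D) == 4 * 1 + 3 * 2 - 2 * 2 * 1
```

The values also follow from the spectra: both summands are commuting projector dilations, so the eigenvalues of S and D are sums and differences of zeros and ones.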
The above two theorems reveal some essential relations among the dilation forms of two matrices under Kronecker products, and they demonstrate that various concrete problems concerning Kronecker products of two matrices still admit analytical solutions. As a natural and useful generalization of the preceding formulas, we next give a diverse range of results related to the three-term Kronecker products of matrices in (1.6).
Theorem 3.3. Let A1∈Cm1×n1,A2∈Cm2×n2 and A3∈Cm3×n3, and let
X1=A1⊗Im2⊗Im3, X2=Im1⊗A2⊗Im3, X3=Im1⊗Im2⊗A3 | (3.28) |
denote the three dilation expressions of A1, A2 and A3, respectively. Then, we have the following results.
(a) The following three orthogonal projector equalities hold:
PX1=PA1⊗Im2⊗Im3, PX2=Im1⊗PA2⊗Im3, PX3=Im1⊗Im2⊗PA3, | (3.29) |
the following equalities hold:
PX1PX2=PX2PX1=PA1⊗PA2⊗Im3, | (3.30) |
PX1PX3=PX3PX1=PA1⊗Im2⊗PA3, | (3.31) |
PX2PX3=PX3PX2=Im1⊗PA2⊗PA3, | (3.32) |
and the following equalities hold:
PA1⊗A2⊗A3=PX1PX2PX3=PA1⊗PA2⊗PA3. | (3.33) |
(b) The following eight rank equalities hold:
r[A1⊗Im2⊗Im3,Im1⊗A2⊗Im3,Im1⊗Im2⊗A3] =m1m2m3−(m1−r(A1))(m2−r(A2))(m3−r(A3)), | (3.34) |
r[A1⊗Im2⊗Im3,Im1⊗A2⊗Im3,Im1⊗Im2⊗EA3] =m1m2m3−(m1−r(A1))(m2−r(A2))r(A3), | (3.35) |
r[A1⊗Im2⊗Im3,Im1⊗EA2⊗Im3,Im1⊗Im2⊗A3] =m1m2m3−(m1−r(A1))r(A2)(m3−r(A3)), | (3.36) |
r[EA1⊗Im2⊗Im3,Im1⊗A2⊗Im3,Im1⊗Im2⊗A3] =m1m2m3−r(A1)(m2−r(A2))(m3−r(A3)), | (3.37) |
r[A1⊗Im2⊗Im3,Im1⊗EA2⊗Im3,Im1⊗Im2⊗EA3] =m1m2m3−(m1−r(A1))r(A2)r(A3), | (3.38) |
r[EA1⊗Im2⊗Im3,Im1⊗A2⊗Im3,Im1⊗Im2⊗EA3] =m1m2m3−r(A1)(m2−r(A2))r(A3), | (3.39) |
r[EA1⊗Im2⊗Im3,Im1⊗EA2⊗Im3,Im1⊗Im2⊗A3] =m1m2m3−r(A1)r(A2)(m3−r(A3)), | (3.40) |
r[EA1⊗Im2⊗Im3,Im1⊗EA2⊗Im3,Im1⊗Im2⊗EA3]=m1m2m3−r(A1)r(A2)r(A3), | (3.41) |
the following eight dimension equalities hold:
dim(R(X1)∩R(X2)∩R(X3))=r(A1)r(A2)r(A3), | (3.42) |
dim(R(X1)∩R(X2)∩R⊥(X3))=r(A1)r(A2)(m3−r(A3)), | (3.43) |
dim(R(X1)∩R⊥(X2)∩R(X3))=r(A1)(m2−r(A2))r(A3), | (3.44)
dim(R⊥(X1)∩R(X2)∩R(X3))=(m1−r(A1))r(A2)r(A3), | (3.45)
dim(R(X1)∩R⊥(X2)∩R⊥(X3))=r(A1)(m2−r(A2))(m3−r(A3)), | (3.46) |
dim(R⊥(X1)∩R(X2)∩R⊥(X3))=(m1−r(A1))r(A2)(m3−r(A3)), | (3.47) |
dim(R⊥(X1)∩R⊥(X2)∩R(X3))=(m1−r(A1))(m2−r(A2))r(A3), | (3.48) |
dim(R⊥(X1)∩R⊥(X2)∩R⊥(X3))=(m1−r(A1))(m2−r(A2))(m3−r(A3)), | (3.49) |
and the following dimension equality holds:
dim(R(X1)∩R(X2)∩R(X3))+dim(R(X1)∩R(X2)∩R⊥(X3)) +dim(R(X1)∩R⊥(X2)∩R(X3))+dim(R⊥(X1)∩R(X2)∩R(X3)) +dim(R(X1)∩R⊥(X2)∩R⊥(X3))+dim(R⊥(X1)∩R(X2)∩R⊥(X3)) +dim(R⊥(X1)∩R⊥(X2)∩R(X3))+dim(R⊥(X1)∩R⊥(X2)∩R⊥(X3))=m1m2m3. | (3.50)
(c) The following eight groups of range equalities hold:
R(X1)∩R(X2)∩R(X3)=R(X1X2X3)=R(A1⊗A2⊗A3), | (3.51) |
R(X1)∩R(X2)∩R⊥(X3)=R(X1X2EX3)=R(A1⊗A2⊗EA3), | (3.52) |
R(X1)∩R⊥(X2)∩R(X3)=R(X1EX2X3)=R(A1⊗EA2⊗A3), | (3.53) |
R⊥(X1)∩R(X2)∩R(X3)=R(EX1X2X3)=R(EA1⊗A2⊗A3), | (3.54) |
R(X1)∩R⊥(X2)∩R⊥(X3)=R(X1EX2EX3)=R(A1⊗EA2⊗EA3), | (3.55) |
R⊥(X1)∩R(X2)∩R⊥(X3)=R(EX1X2EX3)=R(EA1⊗A2⊗EA3), | (3.56) |
R⊥(X1)∩R⊥(X2)∩R(X3)=R(EX1EX2X3)=R(EA1⊗EA2⊗A3), | (3.57)
R⊥(X1)∩R⊥(X2)∩R⊥(X3)=R(EX1EX2EX3)=R(EA1⊗EA2⊗EA3), | (3.58) |
and the following direct sum equality holds:
(R(X1)∩R(X2)∩R(X3))⊕(R⊥(X1)∩R(X2)∩R(X3))⊕(R(X1)∩R⊥(X2)∩R(X3))⊕(R(X1)∩R(X2)∩R⊥(X3))⊕(R⊥(X1)∩R⊥(X2)∩R(X3))⊕(R⊥(X1)∩R(X2)∩R⊥(X3))⊕(R(X1)∩R⊥(X2)∩R⊥(X3))⊕(R⊥(X1)∩R⊥(X2)∩R⊥(X3))=Cm1m2m3. | (3.59) |
(d) The following eight orthogonal projector equalities hold:
PR(X1)∩R(X2)∩R(X3)=PA1⊗PA2⊗PA3, | (3.60) |
PR(X1)∩R(X2)∩R⊥(X3)=PA1⊗PA2⊗EA3, | (3.61) |
PR(X1)∩R⊥(X2)∩R(X3)=PA1⊗EA2⊗PA3, | (3.62) |
PR⊥(X1)∩R(X2)∩R(X3)=EA1⊗PA2⊗PA3, | (3.63) |
PR(X1)∩R⊥(X2)∩R⊥(X3)=PA1⊗EA2⊗EA3, | (3.64) |
PR⊥(X1)∩R(X2)∩R⊥(X3)=EA1⊗PA2⊗EA3, | (3.65) |
PR⊥(X1)∩R⊥(X2)∩R(X3)=EA1⊗EA2⊗PA3, | (3.66) |
PR⊥(X1)∩R⊥(X2)∩R⊥(X3)=EA1⊗EA2⊗EA3, | (3.67) |
and the following orthogonal projector equality holds:
PR(X1)∩R(X2)∩R(X3)+PR(X1)∩R(X2)∩R⊥(X3) +PR(X1)∩R⊥(X2)∩R(X3)+PR⊥(X1)∩R(X2)∩R(X3) +PR(X1)∩R⊥(X2)∩R⊥(X3)+PR⊥(X1)∩R(X2)∩R⊥(X3) +PR⊥(X1)∩R⊥(X2)∩R(X3)+PR⊥(X1)∩R⊥(X2)∩R⊥(X3)=Im1m2m3. | (3.68) |
(e) The following eight orthogonal projector equalities hold:
P[A1⊗Im2⊗Im3,Im1⊗A2⊗Im3,Im1⊗Im2⊗A3]=Im1m2m3−EA1⊗EA2⊗EA3, | (3.69) |
P[A1⊗Im2⊗Im3,Im1⊗A2⊗Im3,Im1⊗Im2⊗EA3]=Im1m2m3−EA1⊗EA2⊗PA3, | (3.70) |
P[A1⊗Im2⊗Im3,Im1⊗EA2⊗Im3,Im1⊗Im2⊗A3]=Im1m2m3−EA1⊗PA2⊗EA3, | (3.71) |
P[EA1⊗Im2⊗Im3,Im1⊗A2⊗Im3,Im1⊗Im2⊗A3]=Im1m2m3−PA1⊗EA2⊗EA3, | (3.72) |
P[A1⊗Im2⊗Im3,Im1⊗EA2⊗Im3,Im1⊗Im2⊗EA3]=Im1m2m3−EA1⊗PA2⊗PA3, | (3.73) |
P[EA1⊗Im2⊗Im3,Im1⊗A2⊗Im3,Im1⊗Im2⊗EA3]=Im1m2m3−PA1⊗EA2⊗PA3, | (3.74) |
P[EA1⊗Im2⊗Im3,Im1⊗EA2⊗Im3,Im1⊗Im2⊗A3]=Im1m2m3−PA1⊗PA2⊗EA3, | (3.75) |
P[EA1⊗Im2⊗Im3,Im1⊗EA2⊗Im3,Im1⊗Im2⊗EA3]=Im1m2m3−PA1⊗PA2⊗PA3. | (3.76) |
Proof. By (1.1)–(1.3),
PX1=(A1⊗Im2⊗Im3)(A1⊗Im2⊗Im3)†=(A1⊗Im2⊗Im3)(A†1⊗Im2⊗Im3)=(A1A†1)⊗Im2⊗Im3=PA1⊗Im2⊗Im3, |
thus establishing the first equality in (3.29). The second and third equalities in (3.29) can be shown in a similar way. Also by (1.1)–(1.3),
PA1⊗A2⊗A3=(A1⊗A2⊗A3)(A1⊗A2⊗A3)†=(A1⊗A2⊗A3)(A†1⊗A†2⊗A†3)=(A1A†1)⊗(A2A†2)⊗(A3A†3)=PA1⊗PA2⊗PA3, | (3.77) |
and by (1.2) and (3.29),
PX1PX2PX3=(PA1⊗Im2⊗Im3)(Im1⊗PA2⊗Im3)(Im1⊗Im2⊗PA3)=PA1⊗PA2⊗PA3. | (3.78) |
Combining (3.77) and (3.78) leads to (3.33).
By (2.1), (1.2)–(1.4) and (3.2),
r[A1⊗Im2⊗Im3,Im1⊗A2⊗Im3,Im1⊗Im2⊗A3]=r(A1⊗Im2⊗Im3)+r((Im1−A1A†1)⊗[A2⊗Im3,Im2⊗A3])=m2m3r(A1)+r(Im1−A1A†1)r[A2⊗Im3,Im2⊗A3]=m2m3r(A1)+(m1−r(A1))(m2m3−(m2−r(A2))(m3−r(A3)))=m1m2m3−(m1−r(A1))(m2−r(A2))(m3−r(A3)), |
thus establishing (3.34). Equations (3.35)–(3.41) can be established in a similar way.
By (3.11), we are able to obtain
R(X1)∩R(X2)=R(X1X2)=R(X2X1)=R(A1⊗A2⊗Im3). |
Consequently,
R(X1)∩R(X2)∩R(X3)=R(X1X2)∩R(X3)=R(X1X2X3)=R(A1⊗A2⊗A3), |
as required for (3.51). Equations (3.52)–(3.58) can be established in a similar way. Adding (3.51)–(3.58) leads to (3.59).
Taking the dimensions of both sides of (3.51)–(3.58) and applying (1.4), we obtain (3.42)–(3.50).
Equations (3.60)–(3.68) follow from (3.51)–(3.59).
Equations (3.69)–(3.76) follow from (2.6) and (3.30)–(3.32).
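The three-term analogues can be checked numerically as well. The added sketch below verifies (3.33) and (3.34) on random rank-one factors (sizes are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(6)
rank = np.linalg.matrix_rank
P = lambda X: X @ np.linalg.pinv(X)
kron3 = lambda X, Y, Z: np.kron(np.kron(X, Y), Z)

m1, m2, m3 = 2, 3, 2
A1 = rng.standard_normal((m1, 1)) @ rng.standard_normal((1, 3))   # rank 1
A2 = rng.standard_normal((m2, 1)) @ rng.standard_normal((1, 2))   # rank 1
A3 = rng.standard_normal((m3, 1)) @ rng.standard_normal((1, 2))   # rank 1

X1 = kron3(A1, np.eye(m2), np.eye(m3))
X2 = kron3(np.eye(m1), A2, np.eye(m3))
X3 = kron3(np.eye(m1), np.eye(m2), A3)

# (3.33): P_{A1⊗A2⊗A3} = P_{X1} P_{X2} P_{X3} = P_{A1}⊗P_{A2}⊗P_{A3}
assert np.allclose(P(kron3(A1, A2, A3)), P(X1) @ P(X2) @ P(X3))
assert np.allclose(P(X1) @ P(X2) @ P(X3), kron3(P(A1), P(A2), P(A3)))

# (3.34): r[X1,X2,X3] = m1 m2 m3 − (m1−r1)(m2−r2)(m3−r3)
r1, r2, r3 = rank(A1), rank(A2), rank(A3)
assert rank(np.hstack([X1, X2, X3])) == m1*m2*m3 - (m1-r1)*(m2-r2)*(m3-r3)
```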
In addition to (3.28), we can construct the following three dilation expressions
Y1=Im1⊗A2⊗A3, Y2=A1⊗Im2⊗A3 and Y3=A1⊗A2⊗Im3 | (3.79) |
from any three matrices A1∈Cm1×n1,A2∈Cm2×n2 and A3∈Cm3×n3. Some concrete topics on rank equalities for the dilation expressions under vector situations were considered in [26]. Below, we give a sequence of results related to the three dilation expressions.
Theorem 3.4. Let Y1, Y2 and Y3 be the same as given in (3.79). Then, we have the following results.
(a) The following three projector equalities hold:
PY1=Im1⊗PA2⊗PA3, PY2=PA1⊗Im2⊗PA3 and PY3=PA1⊗PA2⊗Im3. | (3.80) |
(b) The following twelve matrix equalities hold:
PY1PY2=PY2PY1=PY1PY3=PY3PY1=PY2PY3=PY3PY2=PY1PY2PY3=PY1PY3PY2=PY2PY1PY3=PY2PY3PY1=PY3PY1PY2=PY3PY2PY1=PA1⊗PA2⊗PA3. | (3.81) |
(c) The following rank equality holds:
r[Y1,Y2,Y3]=m1r(A2)r(A3)+m2r(A1)r(A3)+m3r(A1)r(A2)−2r(A1)r(A2)r(A3). | (3.82) |
(d) The following range equality holds:
R(Y1)∩R(Y2)∩R(Y3)=R(A1⊗A2⊗A3). | (3.83) |
(e) The following dimension equality holds:
dim(R(Y1)∩R(Y2)∩R(Y3))=r(A1)r(A2)r(A3). | (3.84) |
(f) The following projector equality holds:
PR(Y1)∩R(Y2)∩R(Y3)=PA1⊗PA2⊗PA3. | (3.85) |
(g) The following projector equality holds:
P[Y1,Y2,Y3]=Im1⊗PA2⊗PA3+PA1⊗Im2⊗PA3+PA1⊗PA2⊗Im3−2(PA1⊗PA2⊗PA3). | (3.86) |
Proof. Equation (3.80) follows directly from (3.79), and (3.81) follows from (3.80). Since PY1, PY2 and PY3 are idempotent matrices, we find from (2.8) and (3.80) that
r[Y1,Y2,Y3]=r[PY1,PY2,PY3]=r(PY1)+r(PY2)+r(PY3) −r[PY1PY2,PY1PY3]−r[PY2PY1,PY2PY3]−r[PY3PY1,PY3PY2] +r[PY1PY2,PY1PY3,PY2PY3]=r(PY1)+r(PY2)+r(PY3)−2r(PA1⊗PA2⊗PA3)=m1r(A2)r(A3)+m2r(A1)r(A3)+m3r(A1)r(A2)−2r(A1)r(A2)r(A3),
thus establishing (3.82). Equations (3.83)–(3.86) are left as exercises for the reader.
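The rank equality (3.82) is also easy to verify numerically. The added sketch below uses random rank-one factors (arbitrary illustrative sizes), for which the right hand side of (3.82) evaluates to m1 + m2 + m3 − 2:

```python
import numpy as np

rng = np.random.default_rng(7)
rank = np.linalg.matrix_rank
kron3 = lambda X, Y, Z: np.kron(np.kron(X, Y), Z)

m1, m2, m3 = 2, 3, 2
A1 = rng.standard_normal((m1, 1)) @ rng.standard_normal((1, 3))   # rank 1
A2 = rng.standard_normal((m2, 1)) @ rng.standard_normal((1, 2))   # rank 1
A3 = rng.standard_normal((m3, 1)) @ rng.standard_normal((1, 2))   # rank 1

Y1 = kron3(np.eye(m1), A2, A3)
Y2 = kron3(A1, np.eye(m2), A3)
Y3 = kron3(A1, A2, np.eye(m3))

# (3.82): r[Y1,Y2,Y3] = m1 r2 r3 + m2 r1 r3 + m3 r1 r2 − 2 r1 r2 r3
r1, r2, r3 = rank(A1), rank(A2), rank(A3)
assert rank(np.hstack([Y1, Y2, Y3])) == \
       m1*r2*r3 + m2*r1*r3 + m3*r1*r2 - 2*r1*r2*r3
```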
There are some interesting consequences of Theorems 3.3 and 3.4. For example, applying the following well-known rank inequality (cf. [22]), where [A;B;C] denotes the column block matrix composed of A, B and C:
r(A+B+C)≥r[A;B;C]+r[A,B,C]−r(A)−r(B)−r(C)
to the sums of the matrices in (3.28) and (3.79) yields the two rank inequalities
r(A1⊗Im2⊗Im3+Im1⊗A2⊗Im3+Im1⊗Im2⊗A3)≥m1m2r(A3)+m1m3r(A2)+m2m3r(A1)−2m1r(A2)r(A3)−2m2r(A1)r(A3) −2m3r(A1)r(A2)+2r(A1)r(A2)r(A3) |
and
r(Im1⊗A2⊗A3+A1⊗Im2⊗A3+A1⊗A2⊗Im3)≥m1r(A2)r(A3)+m2r(A1)r(A3)+m3r(A1)r(A2)−4r(A1)r(A2)r(A3), |
respectively, where A1∈Cm1×m1,A2∈Cm2×m2 and A3∈Cm3×m3.
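Both lower bounds can be checked numerically. In the added sketch below, orthogonal projectors are used as convenient square test matrices of known rank (the sizes are arbitrary); the observed ranks comfortably satisfy the inequalities:

```python
import numpy as np

rng = np.random.default_rng(8)
rank = np.linalg.matrix_rank
P = lambda X: X @ np.linalg.pinv(X)   # orthogonal projectors as square test matrices
kron3 = lambda X, Y, Z: np.kron(np.kron(X, Y), Z)

m1, m2, m3 = 2, 3, 2
A1 = P(rng.standard_normal((m1, 1)))   # rank 1
A2 = P(rng.standard_normal((m2, 2)))   # rank 2
A3 = P(rng.standard_normal((m3, 1)))   # rank 1
r1, r2, r3 = 1, 2, 1

S1 = (kron3(A1, np.eye(m2), np.eye(m3)) + kron3(np.eye(m1), A2, np.eye(m3))
      + kron3(np.eye(m1), np.eye(m2), A3))
assert rank(S1) >= (m1*m2*r3 + m1*m3*r2 + m2*m3*r1
                    - 2*m1*r2*r3 - 2*m2*r1*r3 - 2*m3*r1*r2 + 2*r1*r2*r3)

S2 = (kron3(np.eye(m1), A2, A3) + kron3(A1, np.eye(m2), A3)
      + kron3(A1, A2, np.eye(m3)))
assert rank(S2) >= m1*r2*r3 + m2*r1*r3 + m3*r1*r2 - 4*r1*r2*r3
```

For these commuting projector dilations, the exact ranks of S1 and S2 can even be read off from the eigenvalue sums λ+μ+ν and μν+λν+λμ over the 0/1 spectra of A1, A2 and A3.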
We presented a new analysis of the dilation factorizations of the Kronecker products of two or three matrices, and obtained a rich variety of exact formulas and facts related to ranks, dimensions, orthogonal projectors, and ranges of Kronecker products of matrices. These formulas and facts are easy to understand and to utilize when dealing with Kronecker products of matrices in various concrete situations. Given the formulas and facts in the previous theorems, this study clearly demonstrates the significance and usefulness of the dilation factorizations of Kronecker products of matrices. We therefore believe that this study brings deeper insight into the behavior of Kronecker products of matrices and can thereby lead to certain methodological advances in this domain. We also hope that the findings of this study can serve as fundamental facts and useful supplementary material in matrix theory when identifying and approaching various theoretical and computational issues associated with Kronecker products of matrices.
Moreover, the numerous formulas and facts in this article can be extended to dilation factorizations of multiple Kronecker products of matrices, which would support further contributions on Kronecker products of matrices and the development of other relevant mathematical techniques applicable to practical problems. Thus, they can serve as a reference and a source of inspiration for a deeper understanding and exploration of the numerous properties of Kronecker products of matrices.
The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.
The authors would like to express their sincere thanks to anonymous reviewers for their helpful comments and suggestions.
The authors declare that they have no conflict of interest.
[1] | Food, Nations AOotU (2017) The future of food and agriculture: Trends and challenges: FAO. |
[2] |
Giovannucci D, Purcell T (2008) Standards and agricultural trade in Asia. Soc Sci Res Netw Electron J 34: 789–797. https://doi.org/10.2139/ssrn.1330266 doi: 10.2139/ssrn.1330266
![]() |
[3] |
Chittithaworn C, Islam MA, Keawchana T, et al. (2011) Factors affecting business success of small & medium enterprises (SMEs) in Thailand. Asian Soc Sci 7: 180–190. https://doi.org/10.5539/ass.v7n5p180 doi: 10.5539/ass.v7n5p180
![]() |
[4] |
Anderson K (2022) Agriculture in a more uncertain global trade environment. Agric Econ 53: 563–579. https://doi.org/10.1111/agec.12726 doi: 10.1111/agec.12726
![]() |
[5] |
Gu YH, Jin D, Yin H, et al. (2022) Forecasting agricultural commodity prices using dual input attention LSTM. Agriculture 12: 256. https://doi.org/10.3390/agriculture12020256 doi: 10.3390/agriculture12020256
![]() |
[6] |
Sharafati A, Moradi Tayyebi M, Pezeshki E, et al. (2022) Uncertainty of climate change impact on crop characteristics: A case study of Moghan plain in Iran. Theor Appl Climatol 149: 603–620. https://doi.org/10.1007/s00704-022-04074-9 doi: 10.1007/s00704-022-04074-9
![]() |
[7] |
Somporn C, Kamtuo A, Theerakulpisut P, et al. (2011) Effects of roasting degree on radical scavenging activity, phenolics and volatile compounds of Arabica coffee beans (Coffea arabica L. cv. Catimor). Int J Food Sci Technol 46: 2287–2296. https://doi.org/10.1111/j.1365-2621.2011.02748.x doi: 10.1111/j.1365-2621.2011.02748.x
![]() |
[8] |
Haryono A, Maarif MS, Suroso A, et al. (2023) The design of a contract farming model for coffee tree replanting. Economies 11: 185. https://doi.org/10.3390/economies11070185 doi: 10.3390/economies11070185
![]() |
[9] | Azis AM, Irjayanti M, Rusyandi D (2022) Visibility and information accuracy of coffee supply chain in West Java Indonesia. In: Sergi BS, Sulistiawan D (Eds.), Modeling Economic Growth in Contemporary Indonesia, Emerald Publishing Limited, 225–236. https://doi.org/10.1108/978-1-80262-431-120221014 |
[10] | Katemauswa FA (2019) Factors influencing demand forecasting and demand planning: A case at an apparel retailer. MSc Dissertation, University of Kwazulu-Natal. https://researchspace.ukzn.ac.za/handle/10413/18966 |
[11] |
Kilian B, Jones C, Pratt L, et al. (2006) Is sustainable agriculture a viable strategy to improve farm income in Central America? A case study on coffee. J Bus Res 59: 322–330. https://doi.org/10.1016/j.jbusres.2005.09.015 doi: 10.1016/j.jbusres.2005.09.015
![]() |
[12] |
Kittichotsatsawat Y, Jangkrajarng V, Tippayawong KY (2021) Enhancing coffee supply chain towards sustainable growth with big data and modern agricultural technologies. Sustainability 13: 4593. https://doi.org/10.3390/su13084593 doi: 10.3390/su13084593
![]() |
[13] |
Kruse L, Wunderlich N, Beck R (2019) Artificial intelligence for the financial services industry: What challenges organizations to succeed. Proceedings of the 52nd Hawaii International Conference on System Sciences, 6408–6417. https://doi.org/10.24251/hicss.2019.770 doi: 10.24251/hicss.2019.770
![]() |
[14] |
Utku Al, Kaya SK (2022) Deep learning based a comprehensive analysis for waste prediction. Oper Res Eng Sci: Theory Appl 5: 176–189. https://doi.org/10.31181/oresta190822135u doi: 10.31181/oresta190822135u
![]() |
[15] | Tanikić D, Manić M, Devedžić G, et al. (2010) Modelling metal cutting parameters using intelligent techniques. J Mech Eng/Strojniški Vestnik, 56: 52–62. |
[16] |
Agatonovic-Kustrin S, Beresford R (2000) Basic concepts of artificial neural network (ANN) modeling and its application in pharmaceutical research. J Pharm Biomed Anal 22: 717–727. https://doi.org/10.1016/s0731-7085(99)00272-1 doi: 10.1016/S0731-7085(99)00272-1
![]() |
[17] |
Liakos KG, Busato P, Moshou D, et al. (2018) Machine learning in agriculture: A review. Sensors 18: 2674. https://doi.org/10.3390/s18082674 doi: 10.3390/s18082674
![]() |
[18] | Khairunniza-Bejo S, Mustaffha S, Ismail WIW (2014) Application of artificial neural network in predicting crop yield: A review. J Food Sci Eng 4: 1. |
[19] |
Kittichotsatsawat Y, Tippayawong N, Tippayawong KY (2022) Prediction of arabica coffee production using artificial neural network and multiple linear regression techniques. Sci Rep 12: 14488. https://doi.org/10.1038/s41598-022-18635-5 doi: 10.1038/s41598-022-18635-5
![]() |
[20] |
Bhojani SH, Bhatt N (2020) Wheat crop yield prediction using new activation functions in neural network. Neural Comput Appl 32: 13941–13951. https://doi.org/10.1007/s00521-020-04797-8 doi: 10.1007/s00521-020-04797-8
![]() |
[21] | Palanivel K, Surianarayanan C (2019) An approach for prediction of crop yield using machine learning and big data techniques. Int J Comput Eng Technol 10: 110–118. https://doi.org/10.34218/ijcet.10.3.2019.013
[22] | Zhao Z, Chow TL, Rees HW, et al. (2009) Predict soil texture distributions using an artificial neural network model. Comput Electron Agric 65: 36–48. https://doi.org/10.1016/j.compag.2008.07.008
[23] | Kafy AA, Rahman AF, Al Rakib A, et al. (2021) Assessment and prediction of seasonal land surface temperature change using multi-temporal Landsat images and their impacts on agricultural yields in Rajshahi, Bangladesh. Environ Challenges 4: 100147. https://doi.org/10.1016/j.envc.2021.100147
[24] | Kaul M, Hill RL, Walthall C (2005) Artificial neural networks for corn and soybean yield prediction. Agric Syst 85: 1–18. https://doi.org/10.1016/j.agsy.2004.07.009
[25] | Abdollahpour S, Kosari-Moghaddam A, Bannayan M (2020) Prediction of wheat moisture content at harvest time through ANN and SVR modeling techniques. Inf Proc Agric 7: 500–510. https://doi.org/10.1016/j.inpa.2020.01.003
[26] | Ustaoglu B, Cigizoglu H, Karaca M (2008) Forecast of daily mean, maximum and minimum temperature time series by three artificial neural network methods. Meteorol Appl 15: 431–445. https://doi.org/10.1002/met.83
[27] | Tariq A, Yan J, Ghaffar B, et al. (2022) Flash flood susceptibility assessment and zonation by integrating analytic hierarchy process and frequency ratio model with diverse spatial data. Water 14: 3069. https://doi.org/10.3390/w14193069
[28] | Ghaderizadeh S, Abbasi-Moghadam D, Sharifi A, et al. (2022) Multiscale dual-branch residual spectral–spatial network with attention for hyperspectral image classification. IEEE J Sel Topics Appl Earth Observ Remote Sens 15: 5455–5467. https://doi.org/10.1109/JSTARS.2022.3188732
[29] | Zamani A, Sharifi A, Felegari S, et al. (2022) Agro climatic zoning of saffron culture in Miyaneh city by using WLC method and remote sensing data. Agriculture 12: 118. https://doi.org/10.3390/agriculture12010118
[30] | Kosari A, Sharifi A, Ahmadi A, et al. (2020) Remote sensing satellite's attitude control system: Rapid performance sizing for passive scan imaging mode. Aircr Eng Aerosp Technol 92: 1073–1083. https://doi.org/10.1108/AEAT-02-2020-0030
[31] | Pfaff B (2008) Analysis of integrated and cointegrated time series with R. Springer Science & Business Media. https://doi.org/10.1007/978-0-387-75967-8 |
[32] | Padhan PC (2012) Application of ARIMA model for forecasting agricultural productivity in India. J Agric Soc Sci 8: 50–56. |
[33] | Iqbal N, Bakhsh K, Maqbool A, et al. (2005) Use of the ARIMA model for forecasting wheat area and production in Pakistan. J Agric Soc Sci 1: 120–122. |
[34] | Osman T, Divigalpitiya P, Arima T (2016) Using the SLEUTH urban growth model to simulate the impacts of future policy scenarios on land use in the Giza Governorate, Greater Cairo Metropolitan region. Int J Urban Sci 20: 407–426. https://doi.org/10.1080/12265934.2016.1216327
[35] | Kumari P, Mishra G, Srivastava C (2017) Forecasting models for predicting pod damage of pigeonpea in Varanasi region. J Agrometeorol 19: 265–269. https://doi.org/10.54386/jam.v19i3.669
[36] | Bekuma T, Mamo G, Regassa A (2022) Modeling and forecasting of rainfall and temperature time series in East Wollega Zone, Western Ethiopia. Arabian J Geosci 15: 1377. https://doi.org/10.1007/s12517-022-10638-w
[37] | Mahto AK, Alam MA, Biswas R, et al. (2021) Short-term forecasting of agriculture commodities in context of Indian market for sustainable agriculture by using the artificial neural network. J Food Qual 2021: 9939906. https://doi.org/10.1155/2021/9939906
[38] | Purohit SK, Panigrahi S, Sethy PK, et al. (2021) Time series forecasting of price of agricultural products using hybrid methods. Appl Artif Intell 35: 1388–1406. https://doi.org/10.1080/08839514.2021.1981659
[39] | Cenas PV (2017) Forecast of agricultural crop price using time series and Kalman filter method. Asia Pac J Multidiscip Res 5: 15–21. |
[40] | Onsree T, Tippayawong N (2021) Machine learning application to predict yields of solid products from biomass torrefaction. Renewable Energy 167: 425–432. https://doi.org/10.1016/j.renene.2020.11.099
[41] | Katongtung T, Onsree T, Tippayawong KY, et al. (2023) Prediction of biocrude oil yields from hydrothermal liquefaction using a gradient tree boosting machine approach with principal component analysis. Energy Rep 9: 215–222. https://doi.org/10.1016/j.egyr.2023.08.079
[42] | Prasertpong P, Onsree T, Khuenkaeo N, et al. (2023) Exposing and understanding synergistic effects in co-pyrolysis of biomass and plastic waste via machine learning. Bioresour Technol 369: 128419. https://doi.org/10.1016/j.biortech.2022.128419
[43] | Onsree T, Tippayawong N, Phithakkitnukoon S, et al. (2022) Interpretable machine-learning model with a collaborative game approach to predict yields and higher heating value of torrefied biomass. Energy 249: 123676. https://doi.org/10.1016/j.energy.2022.123676
[44] | Rahman MM, Islam MA, Mahboob MG, et al. (2022) Forecasting of potato production in Bangladesh using ARIMA and mixed model approach. Sch J Agric Vet Sci 10: 136–145. https://doi.org/10.36347/sjavs.2022.v09i10.001
[45] | Sankar TJ, Pushpa P (2022) Implementation of time series stochastic modelling for zea mays production in India. Math Stat Eng Appl 71: 611–621. |
[46] | Nassiri H, Mohammadpour SI, Dahaghin M (2022) Forecasting time trends of fatal motor vehicle crashes in Iran using an ensemble learning algorithm. Traffic Inj Prev 24: 44–49. https://doi.org/10.1080/15389588.2022.2130279
[47] | Gorzelany J, Belcar J, Kuźniar P, et al. (2022) Modelling of mechanical properties of fresh and stored fruit of large cranberry using multiple linear regression and machine learning. Agriculture 12: 200. https://doi.org/10.3390/agriculture12020200
[48] | Salari K, Zarafshan P, Khashehchi M, et al. (2022) Modeling and predicting of water production by capacitive deionization method using artificial neural networks. Desalination 540: 115992. https://doi.org/10.1016/j.desal.2022.115992
[49] | Zhu X, Xiao G, Wang S (2022) Suitability evaluation of potential arable land in the Mediterranean region. J Environ Manag 313: 115011. https://doi.org/10.1016/j.jenvman.2022.115011
[50] | Wongchai W, Onsree T, Sukkam N, et al. (2022) Machine learning models for estimating above ground biomass of fast growing trees. Expert Syst Appl 199: 117186. https://doi.org/10.1016/j.eswa.2022.117186
[51] | Katongtung T, Onsree T, Tippayawong N (2022) Machine learning prediction of biocrude yields and higher heating values from hydrothermal liquefaction of wet biomass and wastes. Bioresour Technol 344: 126278. https://doi.org/10.1016/j.biortech.2021.126278
[52] | Pesaran MH (2007) A simple panel unit root test in the presence of cross-section dependence. J Appl Econometrics 22: 265–312. https://doi.org/10.2139/ssrn.457280
[53] | Suresh K, Krishna Priya S (2011) Forecasting sugarcane yield of Tamilnadu using ARIMA models. Sugar Tech 13: 23–26. https://doi.org/10.1007/s12355-011-0071-7
[54] | Eni D (2015) Seasonal ARIMA modeling and forecasting of rainfall in Warri Town, Nigeria. J Geosci Environ Prot 3: 91. https://doi.org/10.4236/gep.2015.36015 |
[55] | Sapna S, Tamilarasi A, Kumar MP (2012) Backpropagation learning algorithm based on Levenberg Marquardt algorithm. Comp Sci Inform Technol (CS and IT) 2: 393–398. https://doi.org/10.5121/csit.2012.2438
[56] | Rawat S, Mishra AR, Gautam S, et al. (2022) Regional time series forecasting of chickpea using ARIMA and neural network models in central plains of Uttar Pradesh (India). Int J Environ Clim Change 2022: 2879–2889. https://doi.org/10.9734/ijecc/2022/v12i1131280
[57] | Somvanshi V, Pandey O, Agrawal P, et al. (2006) Modeling and prediction of rainfall using artificial neural network and ARIMA techniques. J Ind Geophys Union 10: 141–151. |
[58] | Dwivedi D, Kelaiya J, Sharma G (2019) Forecasting monthly rainfall using autoregressive integrated moving average model (ARIMA) and artificial neural network (ANN) model: A case study of Junagadh, Gujarat, India. J Appl Nat Sci 11: 35–41. https://doi.org/10.31018/jans.v11i1.1951
[59] | Latifi Z, Shabanali Fami H (2022) Forecasting wheat production in Iran using time series technique and artificial neural network. J Agric Sci Technol 24: 261–273. |
[60] | Sekhar PH, Kesavulu Poola K, Bhupathi M (2020) Modelling and prediction of coastal Andhra rainfall using ARIMA and ANN models. Int J Stat Appl Math 5: 104–110. |
[61] | Paswan S, Paul A, Paul A, et al. (2022) Time series prediction for sugarcane production in Bihar using ARIMA & ANN model. The Pharma Innovation J 11: 1947–1956. |
[62] | Zou P, Yang J, Fu J, et al. (2010) Artificial neural network and time series models for predicting soil salt and water content. Agric Water Manag 97: 2009–2019. https://doi.org/10.1016/j.agwat.2010.02.011