Vehicle detection in Remote Sensing Images (RSI) is a specific application of object recognition in satellite or aerial imagery. It is highly beneficial in fields such as defense, traffic monitoring, and urban planning. However, the complex details of the vehicles and the surrounding background delivered by RSIs demand sophisticated analysis techniques built on large models, even though the amount of reliable, labelled training data remains a constraint. The challenges involved in vehicle detection from RSIs include variations in vehicle orientation, appearance, and size caused by differing imaging conditions, weather, and terrain. Both the architecture and the hyperparameters of the Deep Learning (DL) algorithm must be tailored to the characteristics of RS data and the nature of vehicle detection tasks. Therefore, the current study proposes the Intelligent Water Drop Algorithm with Deep Learning-Driven Vehicle Detection and Classification (IWDADL-VDC) methodology for Remote Sensing Images. The IWDADL-VDC technique exploits a hyperparameter-tuned DL model for both recognition and classification of vehicles. To accomplish this, the IWDADL-VDC technique follows two major stages, namely vehicle detection and classification. For the vehicle detection process, the IWDADL-VDC method uses an improved YOLO-v7 model. After the vehicles are detected, classification is performed with the help of a Deep Long Short-Term Memory (DLSTM) approach. To enhance the classification outcomes of the DLSTM model, an IWDA-based hyperparameter tuning process is employed in this study. The experimental validation of the model was conducted on a benchmark dataset, and the results attained by the IWDADL-VDC technique were promising compared with other recent approaches.
Citation: Thavavel Vaiyapuri, M. Sivakumar, Shridevi S, Velmurugan Subbiah Parvathy, Janjhyam Venkata Naga Ramesh, Khasim Syed, Sachi Nandan Mohanty. An intelligent water drop algorithm with deep learning driven vehicle detection and classification[J]. AIMS Mathematics, 2024, 9(5): 11352-11371. doi: 10.3934/math.2024557
In mathematics, we are familiar with the notion of the geometric mean of positive real numbers. This notion has been generalized in many ways to positive definite matrices of the same dimension. The metric geometric mean (MGM) of two positive definite matrices A and B is defined as
A♯B=A1/2(A−1/2BA−1/2)1/2A1/2. | (1.1) |
This mean was introduced by Pusz and Woronowicz [1] and studied in more detail by Ando [2]. Algebraically, A♯B is the unique solution of the algebraic Riccati equation XA−1X=B; e.g., [3]. Geometrically, A♯B is the unique midpoint of the Riemannian geodesic from A to B; the point at parameter t along this geodesic is called the weighted MGM of A and B:
A♯tB=A1/2(A−1/2BA−1/2)tA1/2,0⩽t⩽1. | (1.2) |
Remarkable properties of the mean ♯t, where t∈[0,1], are monotonicity, concavity, and upper semi-continuity (according to the famous Löwner-Heinz inequality); see, e.g., [2,4] and a survey [5,Sect. 3]. Moreover, MGMs play an important role in the Riemannian geometry of the positive definite matrices; see, e.g., [6,Ch. 4].
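The means (1.1) and (1.2) are easy to experiment with numerically. The sketch below, assuming Python with NumPy (the helper names `powm` and `mgm` are ours, not standard), computes the weighted MGM through an eigendecomposition-based fractional power and checks that A♯B solves the Riccati equation XA−1X=B.

```python
import numpy as np

def powm(S, t):
    # Fractional power of a symmetric positive definite matrix
    # via its eigendecomposition S = V diag(w) V^T.
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def mgm(A, B, t=0.5):
    # Weighted metric geometric mean A #_t B of Eq. (1.2).
    Ah, Aih = powm(A, 0.5), powm(A, -0.5)
    return Ah @ powm(Aih @ B @ Aih, t) @ Ah

rng = np.random.default_rng(0)

def rand_spd(n):
    # Random symmetric positive definite test matrix.
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = rand_spd(4), rand_spd(4)
X = mgm(A, B)          # the midpoint A # B
# A # B solves the Riccati equation X A^{-1} X = B.
riccati_ok = np.allclose(X @ np.linalg.inv(A) @ X, B)
# Geodesic endpoints: A #_0 B = A and A #_1 B = B.
ends_ok = np.allclose(mgm(A, B, 0.0), A) and np.allclose(mgm(A, B, 1.0), B)
```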
Another kind of geometric means of positive definite matrices is the spectral geometric mean (SGM), first introduced by Fiedler and Pták [7]:
A♢B=(A−1♯B)1/2A(A−1♯B)1/2. | (1.3) |
Note that the scalar consistency holds, i.e., if AB=BA, then
A♢B=A♯B=A1/2B1/2. |
Since the SGM is based on the MGM, it satisfies many of the same desirable properties as the MGM, for example, idempotency, homogeneity, permutation invariance, unitary invariance, self-duality, and a determinantal identity. However, the SGM does not possess monotonicity, concavity, or upper semi-continuity. A significant property of SGMs is that (A♢B)2 is similar to AB, and hence the two matrices have the same spectrum; hence the name "spectral geometric mean". The work [7] also established a similarity relation between the MGM A♯B and the SGM A◊B when A and B are positive definite matrices of the same size. After that, Lee and Kim [8] investigated the t-weighted SGM, where t is an arbitrary real number:
A♢tB=(A−1♯B)tA(A−1♯B)t. | (1.4) |
Gan and Tam [9] extended certain results of [7] to the case of the t-weighted SGMs with t∈[0,1]. Many research topics on SGMs have been widely studied, e.g., [10,11]. Lim [12] introduced another (weighted) geometric mean of positive definite matrices varying over Hermitian unitary matrices, including the MGM as a special case. Lim's mean has an explicit formula in terms of MGMs and SGMs.
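Continuing the numerical sketch (again assuming Python with NumPy; the helper names are ours), one can build the SGM (1.3)/(1.4) on top of the MGM and confirm the property behind its name: (A♢B)2 and AB share the same spectrum.

```python
import numpy as np

def powm(S, t):
    # Fractional power of a symmetric positive definite matrix.
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def mgm(A, B, t=0.5):
    # Weighted metric geometric mean (1.2).
    Ah, Aih = powm(A, 0.5), powm(A, -0.5)
    return Ah @ powm(Aih @ B @ Aih, t) @ Ah

def sgm(A, B, t=0.5):
    # Weighted spectral geometric mean (1.4):
    # (A^{-1} # B)^t  A  (A^{-1} # B)^t.
    Yt = powm(mgm(np.linalg.inv(A), B), t)
    return Yt @ A @ Yt

rng = np.random.default_rng(1)
M1, M2 = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
A = M1 @ M1.T + 4 * np.eye(4)
B = M2 @ M2.T + 4 * np.eye(4)

S = sgm(A, B)
# (A <> B)^2 and AB have the same (real, positive) spectrum.
ev_sq = np.sort(np.linalg.eigvals(S @ S).real)
ev_ab = np.sort(np.linalg.eigvals(A @ B).real)
spectra_match = np.allclose(ev_sq, ev_ab)
```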
There are several ways to extend the classical studies of MGMs and SGMs. The notion of MGMs can be defined on symmetric cones [8,13] and reflection quasigroups [14] via algebraic-geometrical perspectives. In the framework of lineated symmetric spaces [14] and reflection quasigroups equipped with a compatible Hausdorff topology, MGMs can be defined for arbitrary real weights. The SGMs were also investigated on symmetric cones in [8]. These geometric means can be extended to positive (invertible) operators on a Hilbert space; see, e.g., [15,16]. The cancellability of such means has significant applications in mean equations; see, e.g., [17,18].
Another way to generalize the means (1.2) and (1.4) is to replace the traditional matrix multiplication (TMM) by the semi-tensor product (STP) ⋉. Recall that the STP is a generalization of the TMM, introduced by Cheng [19]; see [20] for more information. To be more precise, consider a matrix pair (A,B)∈Mm,n×Mp,q and let α=lcm(n,p). The STP multiplies the two matrices in the usual way after inflating each of them, via the Kronecker product (denoted by ⊗), with a suitable identity matrix:
A⋉B=(A⊗Iα/n)(B⊗Iα/p)∈Mαm/n,αq/p. |
For the factor-dimension condition n=kp, we have
A⋉B=A(B⊗Ik). |
For the matching-dimension condition n=p, the product reduces to A⋉B=AB. The STP enjoys rich algebraic properties, as the TMM does, such as bilinearity and associativity. Moreover, STPs possess special properties that the TMM does not have, for example, a pseudo-commutativity involving swap matrices and algebraic formulations of logical functions. In the last decade, STPs have been instrumental in developing algebraic state space theory, which integrates ideas and methods from finite state machines with those of control theory; see the survey [21].
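A minimal implementation of the STP (assuming Python with NumPy; `stp` is our name for it) makes the two reductions above concrete, along with associativity.

```python
import numpy as np

def stp(A, B):
    # Semi-tensor product A |x B = (A ⊗ I_{α/n})(B ⊗ I_{α/p}),
    # where A is m×n, B is p×q, and α = lcm(n, p).
    n, p = A.shape[1], B.shape[0]
    a = int(np.lcm(n, p))
    return np.kron(A, np.eye(a // n)) @ np.kron(B, np.eye(a // p))

rng = np.random.default_rng(2)
A = rng.standard_normal((2, 4))   # m×n
B = rng.standard_normal((2, 3))   # p×q with n = 2p (factor-dimension case)
C = rng.standard_normal((3, 2))

# Matching dimensions: the STP reduces to the ordinary product.
P, Q = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
reduces = np.allclose(stp(P, Q), P @ Q)
# Factor-dimension condition n = kp: A |x B = A (B ⊗ I_k), here k = 2.
factor = np.allclose(stp(A, B), A @ np.kron(B, np.eye(2)))
# Associativity: (A |x B) |x C = A |x (B |x C).
assoc = np.allclose(stp(stp(A, B), C), stp(A, stp(B, C)))
```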
Recently, the work [22] extended the MGM notion (1.1) to any pair of positive definite matrices whose sizes satisfy the factor-dimension condition:
A♯B=A1/2⋉(A−1/2⋉B⋉A−1/2)1/2⋉A1/2. | (1.5) |
In fact, A♯B is the unique positive definite solution of the semi-tensor Riccati equation X⋉A−1⋉X=B. After that, the MGMs of arbitrary weight t∈R were studied in [23]. In particular, when t∈[0,1], the weighted MGMs have remarkable properties, namely monotonicity and upper semi-continuity. See Section 2 for more details.
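The STP-based mean (1.5) can also be checked numerically. The sketch below (Python with NumPy; helper names ours) verifies the semi-tensor Riccati equation for matrices of different sizes, and also that the STP mean agrees with the classical mean of the Kronecker-inflated matrices A⊗Iα/m and B⊗Iα/n — an identity one can derive from Lemma 2.1.

```python
import numpy as np

def powm(S, t):
    # Fractional power of a symmetric positive definite matrix.
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    # Semi-tensor product (A ⊗ I_{α/n})(B ⊗ I_{α/p}), α = lcm(n, p).
    n, p = A.shape[1], B.shape[0]
    a = int(np.lcm(n, p))
    return np.kron(A, np.eye(a // n)) @ np.kron(B, np.eye(a // p))

def mgm_stp(A, B, t=0.5):
    # Weighted MGM via STPs, Eqs. (1.5)/(2.1):
    # A #_t B = A^{1/2} |x (A^{-1/2} |x B |x A^{-1/2})^t |x A^{1/2}.
    inner = stp(stp(powm(A, -0.5), B), powm(A, -0.5))
    return stp(stp(powm(A, 0.5), powm(inner, t)), powm(A, 0.5))

rng = np.random.default_rng(3)
M1, M2 = rng.standard_normal((2, 2)), rng.standard_normal((4, 4))
A = M1 @ M1.T + 2 * np.eye(2)     # A in P_2
B = M2 @ M2.T + 4 * np.eye(4)     # B in P_4, so m | n and α = 4

X = mgm_stp(A, B)                  # A # B, a 4×4 positive definite matrix
# X solves the semi-tensor Riccati equation X |x A^{-1} |x X = B.
riccati_ok = np.allclose(stp(stp(X, np.linalg.inv(A)), X), B)
# The STP mean equals the classical mean of the inflated matrices
# A ⊗ I_{α/m} and B ⊗ I_{α/n} (here α/m = 2 and α/n = 1).
Acl = np.kron(A, np.eye(2))
Ah, Aih = powm(Acl, 0.5), powm(Acl, -0.5)
inflate_ok = np.allclose(X, Ah @ powm(Aih @ B @ Aih, 0.5) @ Ah)
```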
The present paper is a continuation of the works [22,23]. Here we investigate SGMs involving STPs. We start with the matrix mean equation:
A−1♯X=(A−1♯B)t, |
where A and B are given positive definite matrices of different sizes, t∈R, and X is an unknown square matrix. Here, ♯ is defined by formula (1.5). We show that this equation has a unique positive definite solution, which we define to be the t-weighted SGM of A and B. Another characterization of weighted SGMs is obtained in terms of certain matrix equations. It turns out that this mean satisfies various properties as in the classical case. We establish a similarity relation between the MGM and the SGM of two positive definite matrices of arbitrary dimensions. Our results generalize the work [7] and relate to the work [8]. Moreover, we investigate certain matrix equations involving weighted MGMs and SGMs.
The paper is organized as follows. In Section 2, we set up basic notation and give basic results on STPs, Kronecker products, and weighted MGMs of positive definite matrices. In Section 3, we characterize the weighted SGM for positive definite matrices in terms of matrix equations, then we provide fundamental properties of weighted SGMs in Section 4. In Section 5, we investigate matrix equations involving weighted SGMs and MGMs. We conclude the whole work in Section 6.
Throughout, let Mm,n be the set of all m×n complex matrices and abbreviate Mn,n to Mn. Define Cn=Mn,1 as the set of n-dimensional complex vectors. Denote by AT and A∗ the transpose and conjugate transpose of a matrix A, respectively. The n×n identity matrix is denoted by In. The general linear group of n×n complex matrices is denoted by GLn. Let us denote the set of n×n positive definite matrices by Pn. A matrix pair (A,B)∈Mm,n×Mp,q is said to satisfy a factor-dimension condition if n∣p or p∣n. In this case, we write A≻kB when n=kp, and A≺kB when p=kn.
Recall that for any matrices A=[aij]∈Mm,n and B∈Mp,q, their Kronecker product is defined by
A⊗B=[aijB]∈Mmp,nq. |
The Kronecker operation (A,B)↦A⊗B is bilinear and associative.
Lemma 2.1 (e.g. [5]). Let (A,B)∈Mm,n×Mp,q, (C,D)∈Mn,r×Mq,s, and (P,Q)∈Mm×Mn, then
(i) (A⊗B)∗=A∗⊗B∗.
(ii) (A⊗B)(C⊗D)=(AC)⊗(BD).
(iii) If (P,Q)∈GLm×GLn, then (P⊗Q)−1=P−1⊗Q−1.
(iv) If (P,Q)∈Pm×Pn, then P⊗Q∈Pmn and (P⊗Q)1/2=P1/2⊗Q1/2.
Lemma 2.2 (e.g. [20]). Let (A,B)∈Mm,n×Mp,q and (P,Q)∈Mm×Mn, then
(i) (A⋉B)∗=B∗⋉A∗.
(ii) If (P,Q)∈GLm×GLn, then (P⋉Q)−1=Q−1⋉P−1.
(iii) det(P⋉Q)=(detP)α/m(detQ)α/n where α=lcm(m,n).
Lemma 2.3 ([23]). For any S∈Pm and X∈Mn, we have X∗⋉S⋉X∈Pα, where α=lcm(m,n).
Definition 2.4. Let (A,B)∈Pm×Pn and α=lcm(m,n). For any t∈R, the t-weighted MGM of A and B is defined by
A♯tB=A1/2⋉(A−1/2⋉B⋉A−1/2)t⋉A1/2∈Pα. | (2.1) |
Note that A♯0B=A⊗Iα/m and A♯1B=B⊗Iα/n. We simply write A♯B=A♯1/2B. We clearly have A♯tB>0 and A♯tA=A.
Lemma 2.5 ([22]). Let (A,B)∈Pm×Pn be such that A≺kB, then the Riccati equation
X⋉A−1⋉X=B |
has a unique solution X=A♯B∈Pn.
Lemma 2.6 ([23]). Let (A,B)∈Pm×Pn and X,Y∈Pn. Let t∈R and α=lcm(m,n), then
(i) Positive homogeneity: For any scalars a,b,c>0, we have c(A♯tB)=(cA)♯t(cB) and, more generally,
(aA)♯t(bB)=a1−tbt(A♯tB). | (2.2) |
(ii) Self duality: (A♯tB)−1=A−1♯tB−1.
(iii) Permutation invariance: A♯1/2B=B♯1/2A. More generally, A♯tB=B♯1−tA.
(iv) Consistency with scalars: If A⋉B=B⋉A, then A♯tB=A1−t⋉Bt.
(v) Determinantal identity:
det(A♯B)=√(detA)α/m(detB)α/n. |
(vi) Cancellability: If t≠0, then the equation A♯tX=A♯tY implies X=Y.
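Several items of Lemma 2.6 can be verified numerically. The following sketch (Python with NumPy; helper names ours) checks self-duality, permutation invariance, and the determinantal identity for sizes m=2 and n=3, so that α=6.

```python
import numpy as np

def powm(S, t):
    # Fractional power of a symmetric positive definite matrix.
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    # Semi-tensor product (A ⊗ I_{α/n})(B ⊗ I_{α/p}), α = lcm(n, p).
    n, p = A.shape[1], B.shape[0]
    a = int(np.lcm(n, p))
    return np.kron(A, np.eye(a // n)) @ np.kron(B, np.eye(a // p))

def mgm_stp(A, B, t=0.5):
    # Weighted MGM of Definition 2.4.
    inner = stp(stp(powm(A, -0.5), B), powm(A, -0.5))
    return stp(stp(powm(A, 0.5), powm(inner, t)), powm(A, 0.5))

rng = np.random.default_rng(4)
M1, M2 = rng.standard_normal((2, 2)), rng.standard_normal((3, 3))
A = M1 @ M1.T + 2 * np.eye(2)     # A in P_2
B = M2 @ M2.T + 3 * np.eye(3)     # B in P_3; α = lcm(2, 3) = 6
t = 0.3

G = mgm_stp(A, B, t)               # a 6×6 positive definite matrix
# (ii) self-duality: (A #_t B)^{-1} = A^{-1} #_t B^{-1}
dual_ok = np.allclose(np.linalg.inv(G),
                      mgm_stp(np.linalg.inv(A), np.linalg.inv(B), t))
# (iii) permutation invariance: A #_t B = B #_{1-t} A
perm_ok = np.allclose(G, mgm_stp(B, A, 1 - t))
# (v) determinantal identity at t = 1/2:
# det(A # B) = sqrt(det(A)^{α/m} det(B)^{α/n}) = sqrt(det(A)^3 det(B)^2)
det_ok = np.isclose(np.linalg.det(mgm_stp(A, B)),
                    np.sqrt(np.linalg.det(A)**3 * np.linalg.det(B)**2))
```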
In this section, we define and characterize weighted SGMs in terms of certain matrix equations involving MGMs and STPs.
Theorem 3.1. Let (A,B)∈Pm×Pn. Let t∈R and α=lcm(m,n), then the mean equation
A−1♯X=(A−1♯B)t | (3.1) |
has a unique solution X∈Pα.
Proof. Note that the matrix pair (A,X) satisfies the factor-dimension condition. Let Y=(A−1♯B)t and consider
X=Y⋉A⋉Y. |
Using Lemma 2.5, we obtain that Y=A−1♯X. Thus, A−1♯X=(A−1♯B)t. For the uniqueness, let Z∈Pα be such that A−1♯Z=Y. By Lemma 2.5, we get
Z=Y⋉A⋉Y=X. |
We call the matrix X in Theorem 3.1 the t-weighted SGM of A and B.
Definition 3.2. Let (A,B)∈Pm×Pn and α=lcm(m,n). For any t∈R, the t-weighted SGM of A and B is defined by
A◊tB=(A−1♯B)t⋉A⋉(A−1♯B)t∈Mα. | (3.2) |
According to Lemma 2.3, we have A◊tB∈Pα. In particular, A◊0B=A⊗Iα/m and A◊1B=B⊗Iα/n. When t=1/2, we simply write A◊B=A◊1/2B. The formula (3.2) implies that
A◊tA=A,A◊tA−1=A1−2t | (3.3) |
for any t∈R. Note that in the case n∣m, we have
A◊tB=(A−1♯B)tA(A−1♯B)t, |
i.e., Eq (3.2) reduces to the same formula (1.4) as in the classical case m=n. By Theorem 3.1, we have
A−1♯(A◊tB)=(A−1♯B)t=(B◊tA)−1♯B. |
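Definition 3.2 and the defining mean equation (3.1) can be illustrated numerically as follows (a sketch in Python with NumPy; helper names ours).

```python
import numpy as np

def powm(S, t):
    # Fractional power of a symmetric positive definite matrix.
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    # Semi-tensor product (A ⊗ I_{α/n})(B ⊗ I_{α/p}), α = lcm(n, p).
    n, p = A.shape[1], B.shape[0]
    a = int(np.lcm(n, p))
    return np.kron(A, np.eye(a // n)) @ np.kron(B, np.eye(a // p))

def mgm_stp(A, B, t=0.5):
    # Weighted MGM of Definition 2.4.
    inner = stp(stp(powm(A, -0.5), B), powm(A, -0.5))
    return stp(stp(powm(A, 0.5), powm(inner, t)), powm(A, 0.5))

def sgm_stp(A, B, t=0.5):
    # Weighted SGM of Definition 3.2: (A^{-1} # B)^t |x A |x (A^{-1} # B)^t.
    Yt = powm(mgm_stp(np.linalg.inv(A), B), t)
    return stp(stp(Yt, A), Yt)

rng = np.random.default_rng(5)
M1, M2 = rng.standard_normal((2, 2)), rng.standard_normal((4, 4))
A = M1 @ M1.T + 2 * np.eye(2)     # A in P_2
B = M2 @ M2.T + 4 * np.eye(4)     # B in P_4; α = 4
t = 0.7

X = sgm_stp(A, B, t)
# X solves the defining mean equation (3.1): A^{-1} # X = (A^{-1} # B)^t.
lhs = mgm_stp(np.linalg.inv(A), X)
rhs = powm(mgm_stp(np.linalg.inv(A), B), t)
eq_ok = np.allclose(lhs, rhs)
# Endpoints: A <>_0 B = A ⊗ I_{α/m} and A <>_1 B = B ⊗ I_{α/n}.
ends_ok = (np.allclose(sgm_stp(A, B, 0.0), np.kron(A, np.eye(2)))
           and np.allclose(sgm_stp(A, B, 1.0), B))
```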
The following theorem provides another characterization of the weighted SGMs.
Theorem 3.3. Let (A,B)∈Pm×Pn. Let t∈R and α=lcm(m,n), then the following are equivalent:
(i) X=A◊tB.
(ii) There exists a positive definite matrix Y∈Pα such that
X=Yt⋉A⋉Yt=Yt−1⋉B⋉Yt−1. | (3.4) |
Moreover, the matrix Y satisfying (3.4) is uniquely determined by Y=A−1♯B.
Proof. Let X=A◊tB. Set Y=A−1♯B∈Pα. By Definition 3.2, we have X=Yt⋉A⋉Yt. By Lemma 2.5, we get Y⋉A⋉Y=B⊗Iα/n. Hence,
Yt−1⋉B⋉Yt−1=YtY−1⋉B⋉Y−1Yt=Yt⋉A⋉Yt=X. |
To show the uniqueness, let Z∈Pα be such that
X=Zt⋉A⋉Zt=Zt−1⋉B⋉Zt−1. |
We have Z⋉A⋉Z=B⊗Iα/n. Note that the pair (A,B⊗Iα/n) satisfies the factor-dimension condition. Now, Lemma 2.5 implies that Z=A−1♯B=Y.
Conversely, suppose there exists a matrix Y∈Pα such that Eq (3.4) holds, then Y⋉A⋉Y=B⊗Iα/n. Applying Lemma 2.5, we have Y=A−1♯B. Therefore,
X=(A−1♯B)t⋉A⋉(A−1♯B)t=A◊tB. |
Fundamental properties of the weighted SGMs (3.2) are as follows.
Theorem 4.1. Let (A,B)∈Pm×Pn, t∈R, and α=lcm(m,n), then
(i) Permutation invariance: A◊tB=B◊1−tA. In particular, A◊B=B◊A.
(ii) Positive homogeneity: c(A◊tB)=(cA)◊t(cB) for all c>0. More generally, for any scalars a,b>0, we have
(aA)◊t(bB)=a1−tbt(A◊tB). |
(iii) Self-duality: (A◊tB)−1=A−1◊tB−1.
(iv) Unitary invariance: For any U∈Uα, we have
U∗(A◊tB)U=(U∗⋉A⋉U)◊t(U∗⋉B⋉U). | (4.1) |
(v) Consistency with scalars: If A⋉B=B⋉A, then A◊tB=A1−t⋉Bt.
(vi) Determinantal identity:
det(A◊tB)=(detA)(1−t)α/m(detB)tα/n. |
(vii) Left and right cancellability: For any t∈R−{0} and Y1,Y2∈Pn, the equation
A◊tY1=A◊tY2 |
implies Y1=Y2. For any t∈R−{1} and X1,X2∈Pm, the equation X1◊tB=X2◊tB implies X1=X2. In other words, the maps X↦A◊tX and X↦X◊tB are injective for any t≠0,1.
(viii) (A◊B)2 is positively similar to A⋉B i.e., there is a matrix P∈Pα such that
(A◊B)2=P(A⋉B)P−1. |
In particular, (A◊B)2 and A⋉B have the same eigenvalues.
Proof. Throughout this proof, let X=A◊tB and Y=A−1♯B. From Theorem 3.3, the characteristic equation (3.4) holds.
To prove (ⅰ), set Z=B◊1−tA and W=B−1♯A. By Theorem 3.3, we get
Z=W1−t⋉B⋉W1−t=W−t⋉A⋉W−t. |
It follows from Lemma 2.6(ⅱ) that
W−1=B♯A−1=A−1♯B=Y. |
Hence, X=Yt⋉A⋉Yt=W−t⋉A⋉W−t=Z, i.e., A◊tB=B◊1−tA.
The assertion (ⅱ) follows directly from the formulas (3.2) and (2.2):
(aA)◊t(bB)=(a−1A−1♯bB)t⋉(aA)⋉(a−1A−1♯bB)t=(a−1♯b)t(A−1♯B)t⋉(aA)⋉(a−1♯b)t(A−1♯B)t=(a−1♯b)ta(a−1♯b)t(A−1♯B)t⋉A⋉(A−1♯B)t=a1−tbt(A◊tB). |
To prove the self-duality (ⅲ), set W=Y−1=A♯B−1. Observe that
X−1=(Yt⋉A⋉Yt)−1=Y−t⋉A−1⋉Y−t=Wt⋉A−1⋉Wt,X−1=(Yt−1⋉B⋉Yt−1)−1=Y1−t⋉B−1⋉Y1−t=Wt−1⋉B−1⋉Wt−1. |
Theorem 3.3 now implies that
(A◊tB)−1=X−1=A−1◊tB−1. |
To prove (ⅳ), let U∈Uα and consider W=U∗⋉Y⋉U. We have
Wt⋉U∗⋉A⋉U⋉Wt=U∗⋉Yt⋉U⋉U∗⋉A⋉U⋉U∗⋉Yt⋉U=U∗⋉Yt⋉A⋉Yt⋉U=U∗⋉X⋉U, |
and, similarly,
Wt−1⋉U∗⋉B⋉U⋉Wt−1=U∗⋉Yt−1⋉B⋉Yt−1⋉U=U∗⋉X⋉U. |
By Theorem 3.3, we arrive at (4.1).
For the assertion (ⅴ), the assumption A⋉B=B⋉A together with Lemma 2.6 (ⅳ) yields
Y=A−1♯B=A−1/2⋉B1/2. |
It follows that
Yt⋉A⋉Yt=A−t/2⋉Bt/2⋉A⋉A−t/2⋉Bt/2=A1−t⋉Bt,Yt−1⋉B⋉Yt−1=A−(t−1)/2⋉B(t−1)/2⋉B⋉A−(t−1)/2⋉B(t−1)/2=A1−t⋉Bt. |
Now, Theorem 3.3 implies that A◊tB=A1−t⋉Bt. The determinantal identity (ⅵ) follows directly from the formula (1.4), Lemma 2.2(ⅲ), and Lemma 2.6(ⅴ):
det(A◊tB)=det(A−1♯B)2t(detA)α/m=(detA)−αt/m(detB)αt/n(detA)α/m=(detA)(1−t)α/m(detB)tα/n. |
To prove the left cancellability, let t∈R−{0} and suppose that A◊tY1=A◊tY2. We have
(A1/2⋉(A−1♯Y1)t⋉A1/2)2=A1/2⋉(A◊tY1)⋉A1/2=A1/2⋉(A◊tY2)⋉A1/2=(A1/2⋉(A−1♯Y2)t⋉A1/2)2. |
Taking the positive square root yields
A1/2⋉(A−1♯Y1)t⋉A1/2=A1/2⋉(A−1♯Y2)t⋉A1/2, |
and, thus, (A−1♯Y1)t=(A−1♯Y2)t. Since t≠0, we get A−1♯Y1=A−1♯Y2. Using the left cancellability of MGM (Lemma 2.6(ⅵ)), we obtain Y1=Y2. The right cancellability follows from the left cancellability together with the permutation invariance (ⅰ).
For the assertion (ⅷ), since A◊B=Y1/2⋉A⋉Y1/2=Y−1/2⋉B⋉Y−1/2, we have
(A◊B)2=(Y1/2⋉A⋉Y1/2)(Y−1/2⋉B⋉Y−1/2)=Y1/2(A⋉B)Y−1/2. |
Note that the matrix Y1/2 is positive definite. Thus, (A◊B)2 is positively similar to A⋉B, so they have the same eigenvalues.
Remark 4.2. Let (A,B)∈Pm×Pn. Instead of Definition 3.2, the permutation invariance (ⅰ) provides an alternative definition of A◊tB as follows:
A◊tB=(B−1♯A)1−t⋉B⋉(B−1♯A)1−t=(A♯B−1)1−t⋉B⋉(A♯B−1)1−t. |
In particular, if m∣n, we have
A◊tB=(A♯B−1)1−tB(A♯B−1)1−t. |
The assertion (ⅷ) is the reason why A◊B is called the SGM.
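The spectral property (viii) and the permutation invariance (i) are easy to confirm numerically for matrices of different sizes (a sketch in Python with NumPy; helper names ours).

```python
import numpy as np

def powm(S, t):
    # Fractional power of a symmetric positive definite matrix.
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    # Semi-tensor product (A ⊗ I_{α/n})(B ⊗ I_{α/p}), α = lcm(n, p).
    n, p = A.shape[1], B.shape[0]
    a = int(np.lcm(n, p))
    return np.kron(A, np.eye(a // n)) @ np.kron(B, np.eye(a // p))

def mgm_stp(A, B, t=0.5):
    # Weighted MGM of Definition 2.4.
    inner = stp(stp(powm(A, -0.5), B), powm(A, -0.5))
    return stp(stp(powm(A, 0.5), powm(inner, t)), powm(A, 0.5))

def sgm_stp(A, B, t=0.5):
    # Weighted SGM of Definition 3.2.
    Yt = powm(mgm_stp(np.linalg.inv(A), B), t)
    return stp(stp(Yt, A), Yt)

rng = np.random.default_rng(6)
M1, M2 = rng.standard_normal((2, 2)), rng.standard_normal((3, 3))
A = M1 @ M1.T + 2 * np.eye(2)     # A in P_2
B = M2 @ M2.T + 3 * np.eye(3)     # B in P_3; α = 6
t = 0.4

S = sgm_stp(A, B)                  # A <> B, a 6×6 positive definite matrix
# Theorem 4.1(viii): (A <> B)^2 and A |x B share the same eigenvalues.
ev_sq = np.sort(np.linalg.eigvals(S @ S).real)
ev_ab = np.sort(np.linalg.eigvals(stp(A, B)).real)
spec_ok = np.allclose(ev_sq, ev_ab)
# Theorem 4.1(i): A <>_t B = B <>_{1-t} A.
perm_ok = np.allclose(sgm_stp(A, B, t), sgm_stp(B, A, 1 - t))
```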
Now, we show that A♯B is positively similar to a product built from the weighted SGMs of A and B, where A and B are positive definite matrices of arbitrary sizes. Before that, we need the following lemma.
Lemma 4.3. Let (A,B)∈Pm×Pn. Let t∈R and α=lcm(m,n), then there exists a unique Yt∈Pα such that
A◊tB=Yt⋉A⋉Yt and B◊tA=Y−1t⋉B⋉Y−1t. |
Proof. Set Yt=(A−1♯B)t, then Yt⋉A⋉Yt=A◊tB. Using Lemma 2.6, we obtain that
Y−1t⋉B⋉Y−1t=(B−1♯A)t⋉B⋉(B−1♯A)t=B◊tA. |
To prove the uniqueness, let Zt∈Pα be such that Zt⋉A⋉Zt=A◊tB and Z−1t⋉B⋉Z−1t=B◊tA. By Lemma 2.5, we get Zt=A−1♯(A◊tB), but Theorem 3.1 says that
A−1♯(A◊tB)=(A−1♯B)t. |
Thus, Zt=Yt.
Theorem 4.4. Let (A,B)∈Pm×Pn. Let t∈R and α=lcm(m,n), then A♯B is positively similar to (A◊1−tB)1/2U(A◊tB)1/2 for some unitary U∈Mα.
Proof. By Lemma 4.3, there exists Yt∈Pα such that A◊tB=Yt⋉A⋉Yt and B◊tA=Y−1t⋉B⋉Y−1t. Using Lemmas 2.2 and 2.5, we have
Yt(A◊1−tB)Yt=B⊗Iα/n=(A♯B)⋉A−1⋉(A♯B)=(A♯B)Yt(A◊tB)−1Yt(A♯B), |
then
((A◊tB)−1/2Yt(A♯B)Yt(A◊tB)−1/2)2=(A◊tB)−1/2Y2t(A◊1−tB)Y2t(A◊tB)−1/2. |
Thus,
A♯B=Y−1t(A◊tB)1/2((A◊tB)−1/2Y2t(A◊1−tB)Y2t(A◊tB)−1/2)1/2(A◊tB)1/2Y−1t. |
Set V=(A◊tB)−1/2Y2t(A◊1−tB)1/2 and U=V−1(VV∗)1/2. Obviously, U is a unitary matrix. We obtain
A♯B=Y−1t(A◊tB)1/2(VV∗)1/2(A◊tB)1/2Y−1t=Yt(A◊1−tB)1/2V−1(VV∗)1/2(A◊tB)1/2Y−1t=Yt(A◊1−tB)1/2U(A◊tB)1/2Y−1t. |
This implies that (A◊1−tB)1/2U(A◊tB)1/2 is positively similar to A♯B.
In general, the MGM A♯tB and the SGM A◊tB are not comparable (in the Löwner partial order). We will show that A♯tB and A◊tB coincide when A and B commute with respect to the STP. To do this, we need a lemma.
Lemma 4.5. Let (P,Q)∈Pm×Pn. If
P⋉Q⋉P⋉Q−1=Q⋉P⋉Q−1⋉P, | (4.2) |
then P⋉Q=Q⋉P.
Proof. From Eq (4.2), we have
(Q−1/2⋉P⋉Q1/2)(Q−1/2⋉P⋉Q1/2)∗=(Q−1/2⋉P⋉Q1/2)∗(Q−1/2⋉P⋉Q1/2). |
This implies that Q−1/2⋉P⋉Q1/2 is a normal matrix. Since Q−1/2⋉P⋉Q1/2 and P⊗Iα/m are similar matrices, the eigenvalues of Q−1/2⋉P⋉Q1/2 are real; a normal matrix with a real spectrum is Hermitian. Hence,
Q−1/2⋉P⋉Q1/2=(Q−1/2⋉P⋉Q1/2)∗=Q1/2⋉P⋉Q−1/2. |
Therefore, P⋉Q=Q⋉P.
The next theorem generalizes [7,Theorem 5.1].
Theorem 4.6. Let (A,B)∈Pm×Pn and t∈R. If A⋉B=B⋉A, then A♯tB=A◊tB. In particular, A♯B=A◊B if and only if A⋉B=B⋉A.
Proof. Suppose A⋉B=B⋉A. By Lemma 2.6 and Theorem 4.1, we have
A♯tB=A1−t⋉Bt=A◊tB. |
Next, assume that A♯B=A◊B=X. By Lemma 2.5, we have
X⋉A−1⋉X=B⊗Iα/n. |
Set Y=A−1♯B. By Theorem 3.3, we get X=Y1/2⋉A⋉Y1/2=Y−1/2⋉B⋉Y−1/2. It follows that
Y1/2⋉X⋉Y1/2=B⊗Iα/n=X⋉A−1⋉X=X⋉Y1/2⋉X−1⋉Y1/2⋉X. |
Thus,
Y1/2⋉X⋉Y1/2⋉X−1=X⋉Y1/2⋉X−1⋉Y1/2. |
Lemma 4.5 implies that X⋉Y1/2=Y1/2⋉X. Hence,
A⋉B=A⋉Y⋉A⋉Y=Y−1/2⋉X2⋉Y1/2=X2=Y1/2⋉X2⋉Y−1/2=Y⋉A⋉Y⋉A=B⋉A. |
Theorem 4.7. Let (A,B)∈Pm×Pn and α=lcm(m,n), then the following statements are equivalent:
(i) A◊B=Iα,
(ii) A⊗Iα/m=B−1⊗Iα/n,
(iii) A♯B=Iα.
Proof. First, we show the equivalence between the statements (ⅰ) and (ⅱ). Suppose that A◊B=Iα. Letting Y=A−1♯B, we have by Theorem 3.3 that
Y1/2⋉A⋉Y1/2=Y−1/2⋉B⋉Y−1/2=Iα. |
Applying Lemma 2.1, we obtain
A⊗Iα/m=Y−1=B−1⊗Iα/n. |
Now, suppose A⊗Iα/m=B−1⊗Iα/n. By Lemma 2.1, we have
A⋉B=(A⊗Iα/m)(B⊗Iα/n)=(B−1⊗Iα/n)(B⊗Iα/n)=In⊗Iα/n=Iα, |
and similarly, B⋉A=Iα. Now, Theorem 4.1(ⅴ) implies that
A◊B=A1/2⋉B1/2=(B−1/2⊗Iα/n)(B1/2⊗Iα/n)=Iα. |
Next, we show the equivalence between (ⅱ) and (ⅲ). Suppose that A♯B=Iα, then we have
(A−1/2⋉B⋉A−1/2)1/2=A−1/2⋉Iα⋉A−1/2=A−1⊗Iα/m. |
This implies that
A−1/2⋉B⋉A−1/2=(A−1⊗Iα/m)2=A−2⊗Iα/m. |
Thus, B⊗Iα/n=A−1⊗Iα/m, or equivalently, A⊗Iα/m=B−1⊗Iα/n.
Now, suppose (ⅱ) holds, then we get A⋉B=Iα=B⋉A. It follows from Lemma 2.6 (ⅳ) that A♯B=A1/2⋉B1/2=Iα.
In particular, from Theorem 4.7, when m=n, we have that A◊B=In if and only if A=B−1, if and only if A♯B=In. This result was included in [7] and is related to the work [8].
In this section, we investigate matrix equations involving MGMs and SGMs of positive definite matrices. In particular, recall that the work [23] investigated the matrix equation A♯tX=B. We discuss this matrix equation when the MGM ♯t is replaced by the SGM ◊t in the next theorem.
Theorem 5.1. Let (A,B)∈Pm×Pn where m∣n. Let t∈R−{0}, then the mean equation
A◊tX=B, | (5.1) |
in an unknown X∈Pn, is equivalent to the Riccati equation
Wt⋉A⋉Wt=B | (5.2) |
in an unknown Wt∈Pn. Moreover, Eq (5.1) has a unique solution given by
X=A◊1/tB=(A♯B−1)1−1/tB(A♯B−1)1−1/t. | (5.3) |
Proof. Let us denote Wt=(A−1♯X)t for each t∈R−{0}. By Definition 3.2, we have
A◊tX=(A−1♯X)t⋉A⋉(A−1♯X)t=Wt⋉A⋉Wt. |
Note that the map X↦Wt is injective due to the cancellability of the MGM ♯t (Lemma 2.6(ⅵ)). Thus, Eq (5.1) is equivalent to the Riccati equation (5.2). Now, Lemma 2.5 implies that Eq (5.2) is equivalent to Wt=A−1♯B. Thus, Eq (5.1) is equivalent to the equation
(A−1♯X)t=A−1♯B. | (5.4) |
We now solve (5.4). Indeed, we have
A−1♯X=(A−1♯B)1/t. |
According to Theorem 3.1 and Definition 3.2, this equation has a unique solution, namely the SGM of A and B with weight 1/t. Now, Remark 4.2 provides the explicit formula (5.3) for A◊1/tB.
Remark 5.2. A similar result holds for the case n∣m in Theorem 5.1. In particular, in the case m∣n with t=1/2, the mean equation
A◊X=B | (5.5) |
has a unique solution X=(A−1♯B)B(A−1♯B).
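Theorem 5.1 can be tested numerically: the candidate X=A◊1/tB should satisfy A◊tX=B. A sketch (Python with NumPy; helper names ours):

```python
import numpy as np

def powm(S, t):
    # Fractional power of a symmetric positive definite matrix.
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    # Semi-tensor product (A ⊗ I_{α/n})(B ⊗ I_{α/p}), α = lcm(n, p).
    n, p = A.shape[1], B.shape[0]
    a = int(np.lcm(n, p))
    return np.kron(A, np.eye(a // n)) @ np.kron(B, np.eye(a // p))

def mgm_stp(A, B, t=0.5):
    # Weighted MGM of Definition 2.4.
    inner = stp(stp(powm(A, -0.5), B), powm(A, -0.5))
    return stp(stp(powm(A, 0.5), powm(inner, t)), powm(A, 0.5))

def sgm_stp(A, B, t=0.5):
    # Weighted SGM of Definition 3.2.
    Yt = powm(mgm_stp(np.linalg.inv(A), B), t)
    return stp(stp(Yt, A), Yt)

rng = np.random.default_rng(7)
M1, M2 = rng.standard_normal((2, 2)), rng.standard_normal((4, 4))
A = M1 @ M1.T + 2 * np.eye(2)     # A in P_2
B = M2 @ M2.T + 4 * np.eye(4)     # B in P_4, so m | n
t = 3.0

# Theorem 5.1: the unique solution of A <>_t X = B is X = A <>_{1/t} B.
X = sgm_stp(A, B, 1 / t)
solves = np.allclose(sgm_stp(A, X, t), B)
```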
Theorem 5.3. Let (A,B)∈Pm×Pn. Let t∈R and α=lcm(m,n), then the equation
(A♯X)♯t(B♯X)=Iα | (5.6) |
has a unique solution X=A−1◊tB−1∈Pα.
Proof. For the case t=0, Lemma 2.5 tells us that the equation A♯X=Iα has a unique solution
X=A−1⊗Iα/m=A−1◊0B−1. |
Now, assume that t≠0. To prove the uniqueness, let U=A♯X and V=B♯X, then
U⋉A−1⋉U=X=V⋉B−1⋉V. |
Since U♯tV=Iα, we obtain (U−1/2⋉V⋉U−1/2)t=U−1 and, thus, V=U(t−1)/t. It follows that
B⊗Iα/n=V⋉X−1⋉V=V⋉U−1⋉A⋉U−1⋉V=U−1/t⋉A⋉U−1/t. |
Using Lemma 2.5, we have that U−1/t=A−1♯B and, thus, U=(A−1♯B)−t. Hence,
X=(A−1♯B)−t⋉A−1⋉(A−1♯B)−t=(A♯B−1)t⋉A−1⋉(A♯B−1)t=A−1◊tB−1. |
Corollary 5.4. Let (A,B)∈Pm×Pn and α=lcm(m,n), then the equation
A♯X=B♯X−1 | (5.7) |
has a unique solution X=A−1◊B∈Pα.
Proof. Equation (5.7) and Lemma 2.6 imply that
(A♯X)−1=(B♯X−1)−1=B−1♯X. |
Thus, Eq (5.7) is equivalent to the following equation:
(A♯X)♯1/2(B−1♯X)=Iα. |
Now, the desired solution follows from the case t=1/2 in Theorem 5.3.
In particular, when m=n and A=B, the equation A♯X=A♯X−1 has a unique solution X=A◊A−1=A0=I by Eq (3.3).
Theorem 5.5. Let (A,B)∈Pm×Pn, t∈R, and α=lcm(m,n), then the equation
(A♯X)◊t(B♯X)=Iα | (5.8) |
has a unique solution X=A−1◊tB−1∈Pα.
Proof. If t=0, the equation reduces to A♯X=Iα, which has a unique solution X=A−1⊗Iα/m=A−1◊0B−1. Now, consider t≠0, and let U=A♯X and V=B♯X, then
U−1⋉A⋉U−1=X−1=V−1⋉B⋉V−1. |
Since U◊tV=Iα, we have that U=(U−1♯V)−2t, i.e., U1/(2t)=U♯V−1. Applying Lemma 2.5, we get V−1=U1/(2t)⋉U−1⋉U1/(2t)=U(1−t)/t. Hence,
B⊗Iα/n=V⋉X−1⋉V=V⋉U−1⋉A⋉U−1⋉V=U−1/t⋉A⋉U−1/t. |
Using Lemma 2.5, we have U−1/t=A−1♯B, i.e., U=(A−1♯B)−t. Thus,
X−1=(A−1♯B)t⋉A⋉(A−1♯B)t=A◊tB. |
Hence, by the self-duality of the SGM ◊t, we have
X=(A◊tB)−1=A−1◊tB−1. |
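Theorems 5.3 and 5.5 admit the following joint numerical check (a sketch in Python with NumPy; helper names ours): the same X=A−1◊tB−1 solves both mean equations.

```python
import numpy as np

def powm(S, t):
    # Fractional power of a symmetric positive definite matrix.
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    # Semi-tensor product (A ⊗ I_{α/n})(B ⊗ I_{α/p}), α = lcm(n, p).
    n, p = A.shape[1], B.shape[0]
    a = int(np.lcm(n, p))
    return np.kron(A, np.eye(a // n)) @ np.kron(B, np.eye(a // p))

def mgm_stp(A, B, t=0.5):
    # Weighted MGM of Definition 2.4.
    inner = stp(stp(powm(A, -0.5), B), powm(A, -0.5))
    return stp(stp(powm(A, 0.5), powm(inner, t)), powm(A, 0.5))

def sgm_stp(A, B, t=0.5):
    # Weighted SGM of Definition 3.2.
    Yt = powm(mgm_stp(np.linalg.inv(A), B), t)
    return stp(stp(Yt, A), Yt)

rng = np.random.default_rng(8)
M1, M2 = rng.standard_normal((2, 2)), rng.standard_normal((3, 3))
A = M1 @ M1.T + 2 * np.eye(2)     # A in P_2
B = M2 @ M2.T + 3 * np.eye(3)     # B in P_3; α = 6
t = 0.6

# Theorems 5.3 and 5.5: X = A^{-1} <>_t B^{-1} solves both
# (A # X) #_t (B # X) = I_α  and  (A # X) <>_t (B # X) = I_α.
X = sgm_stp(np.linalg.inv(A), np.linalg.inv(B), t)
U = mgm_stp(A, X)
V = mgm_stp(B, X)
mgm_eq = np.allclose(mgm_stp(U, V, t), np.eye(6))
sgm_eq = np.allclose(sgm_stp(U, V, t), np.eye(6))
```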
The results in this section appear not to have been noticed before in the literature. In particular, from Theorems 5.3 and 5.5, when m=n and A=B, the equation A♯X=I has a unique solution X=A−1.
We characterize weighted SGMs of positive definite matrices in terms of certain matrix equations involving MGMs and STPs. Indeed, for each real number t, the unique positive definite solution of the matrix equation A−1♯X=(A−1♯B)t is defined to be the t-weighted SGM of A and B. We then establish several properties of weighted SGMs, such as permutation invariance, homogeneity, self-duality, unitary invariance, cancellability, and a determinantal identity. The most significant property is the fact that (A◊B)2 is positively similar to A⋉B, so the two matrices have the same spectrum. The results in Sections 3 and 5 include the classical weighted SGMs of matrices as special cases. Furthermore, we show that certain equations concerning weighted SGMs and weighted MGMs of positive definite matrices have a unique solution, written explicitly in terms of weighted SGMs of the associated matrices. In particular, the equation A◊tX=B can be expressed in terms of the famous Riccati equation. For future work, we may investigate SGMs from a differential-geometry viewpoint, e.g., geodesic properties.
The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.
This research project is supported by National Research Council of Thailand (NRCT): (N41A640234). The authors would like to thank the anonymous referees for comments and suggestions.
The authors declare there is no conflict of interest.