
An intelligent water drop algorithm with deep learning driven vehicle detection and classification

• Vehicle detection in Remote Sensing Images (RSI) is a specific application of object recognition in satellite or aerial imagery. This application is highly beneficial in fields such as defense, traffic monitoring, and urban planning. However, the complex details of vehicles and of the surrounding background delivered by RSIs call for sophisticated analysis techniques built on large data models, even though the amount of reliable, labelled training data remains a constraint. The challenges involved in vehicle detection from RSIs include variations in vehicle orientation, appearance, and size due to dissimilar imaging conditions, weather, and terrain. Both the architecture and the hyperparameters of a Deep Learning (DL) algorithm must be tailored to the features of RS data and to the nature of the vehicle detection task. Therefore, the current study proposes the Intelligent Water Drop Algorithm with Deep Learning-Driven Vehicle Detection and Classification (IWDADL-VDC) methodology for remote sensing images. The IWDADL-VDC technique exploits a hyperparameter-tuned DL model for both recognition and classification of vehicles. To accomplish this, the IWDADL-VDC technique follows two major stages, namely vehicle detection and classification. For the vehicle detection process, the IWDADL-VDC method uses an improved YOLO-v7 model. After the vehicles are detected, the classification stage is performed with the help of a Deep Long Short-Term Memory (DLSTM) approach. To enhance the classification outcomes of the DLSTM model, an IWDA-based hyperparameter tuning process is employed. The experimental validation of the model was conducted using a benchmark dataset, and the results attained by the IWDADL-VDC technique were promising compared to other recent approaches.

    Citation: Thavavel Vaiyapuri, M. Sivakumar, Shridevi S, Velmurugan Subbiah Parvathy, Janjhyam Venkata Naga Ramesh, Khasim Syed, Sachi Nandan Mohanty. An intelligent water drop algorithm with deep learning driven vehicle detection and classification[J]. AIMS Mathematics, 2024, 9(5): 11352-11371. doi: 10.3934/math.2024557




    In mathematics, we are familiar with the notion of geometric mean for positive real numbers. This notion was generalized to that for positive definite matrices of the same dimension in many ways. The metric geometric mean (MGM) of two positive definite matrices A and B is defined as

A ♯ B = A^{1/2}(A^{-1/2} B A^{-1/2})^{1/2} A^{1/2}. (1.1)

This mean was introduced by Pusz and Woronowicz [1] and studied in more detail by Ando [2]. Algebraically, A ♯ B is the unique positive solution of the algebraic Riccati equation X A^{-1} X = B; e.g., [3]. Geometrically, A ♯ B is the unique midpoint of the Riemannian geodesic interpolated from A to B, whose points are the weighted MGMs of A and B:

A ♯_t B = A^{1/2}(A^{-1/2} B A^{-1/2})^t A^{1/2},  0 ≤ t ≤ 1. (1.2)

Remarkable properties of the mean ♯_t, where t ∈ [0,1], are monotonicity, concavity, and upper semi-continuity (according to the famous Löwner-Heinz inequality); see, e.g., [2,4] and the survey [5, Sect. 3]. Moreover, MGMs play an important role in the Riemannian geometry of the positive definite matrices; see, e.g., [6, Ch. 4].
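To make the definitions concrete, the following self-contained NumPy sketch (the helper names powm and mgm are ours, hypothetical, not taken from the cited works) computes the weighted MGM of two same-size positive definite matrices and checks the Riccati characterization numerically.

```python
import numpy as np

def powm(S, t):
    # real power S^t of a symmetric positive definite matrix via eigendecomposition
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def mgm(A, B, t=0.5):
    # weighted metric geometric mean A #_t B of Eq (1.2)
    Ah, Aih = powm(A, 0.5), powm(A, -0.5)
    return Ah @ powm(Aih @ B @ Aih, t) @ Ah

rng = np.random.default_rng(0)
M1, M2 = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
A = M1 @ M1.T + 3 * np.eye(3)   # random symmetric positive definite matrices
B = M2 @ M2.T + 3 * np.eye(3)

X = mgm(A, B)                   # the midpoint A # B of the geodesic (1.2)
```

Here X @ inv(A) @ X reproduces B, which is exactly the Riccati characterization of A ♯ B stated above.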

Another kind of geometric mean of positive definite matrices is the spectral geometric mean (SGM), first introduced by Fiedler and Pták [7]:

A ♮ B = (A^{-1} ♯ B)^{1/2} A (A^{-1} ♯ B)^{1/2}. (1.3)

Note that the scalar consistency holds, i.e., if AB = BA, then

A ♯ B = A ♮ B = A^{1/2} B^{1/2}.

Since the SGM is built from the MGM, the SGM satisfies many of the same nice properties as the MGM, for example, idempotency, homogeneity, permutation invariance, unitary invariance, self duality, and a determinantal identity. However, the SGM does not possess monotonicity, concavity, or upper semi-continuity. A significant property of SGMs is that (A ♮ B)^2 is similar to AB, and hence they have the same spectrum; this is the origin of the name "spectral geometric mean". The work [7] also established a similarity relation between the MGM A ♯ B and the SGM A ♮ B when A and B are positive definite matrices of the same size. After that, Lee and Kim [8] investigated the t-weighted SGM, where t is an arbitrary real number:

A ♮_t B = (A^{-1} ♯ B)^t A (A^{-1} ♯ B)^t. (1.4)
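The weighted SGM (1.4) is equally easy to prototype. The following self-contained NumPy sketch (helper names powm, mgm, sgm are ours, hypothetical, not code from [7,8]) also verifies numerically that (A ♮ B)^2 and AB share a spectrum, and that scalar consistency holds for commuting arguments.

```python
import numpy as np

def powm(S, t):
    # real power of a symmetric positive definite matrix
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def mgm(A, B, t=0.5):
    # weighted metric geometric mean, Eq (1.2)
    Ah, Aih = powm(A, 0.5), powm(A, -0.5)
    return Ah @ powm(Aih @ B @ Aih, t) @ Ah

def sgm(A, B, t=0.5):
    # weighted spectral geometric mean, Eq (1.4)
    Yt = powm(mgm(np.linalg.inv(A), B), t)   # (A^{-1} # B)^t
    return Yt @ A @ Yt

rng = np.random.default_rng(1)
M1, M2 = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
A = M1 @ M1.T + 3 * np.eye(3)
B = M2 @ M2.T + 3 * np.eye(3)

S = sgm(A, B)
spec_S2 = np.sort(np.linalg.eigvals(S @ S).real)   # spectrum of (A natural B)^2
spec_AB = np.sort(np.linalg.eigvals(A @ B).real)   # spectrum of AB (real and positive)
```

With B a power of A (a commuting pair), sgm(A, B) collapses to A^{1/2}B^{1/2}, matching (1.3).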

Gan and Tam [9] extended certain results of [7] to the case of the t-weighted SGMs when t ∈ [0,1]. Many research topics on SGMs have been widely studied, e.g., [10,11]. Lim [12] introduced another (weighted) geometric mean of positive definite matrices, varying over Hermitian unitary matrices and including the MGM as a special case. Lim's mean has an explicit formula in terms of MGMs and SGMs.

There are several ways to extend the classical studies of MGMs and SGMs. The notion of MGMs can be defined on symmetric cones [8,13] and reflection quasigroups [14] via algebraic-geometric perspectives. In the framework of lineated symmetric spaces [14] and reflection quasigroups equipped with a compatible Hausdorff topology, one can define MGMs of arbitrary real weights. The SGMs were also investigated on symmetric cones in [8]. These geometric means can be extended to positive (invertible) operators on a Hilbert space; see, e.g., [15,16]. The cancellability of such means has significant applications in mean equations; see, e.g., [17,18].

Another way to generalize the means (1.2) and (1.4) is to replace the traditional matrix multiplication (TMM) by the semi-tensor product (STP) ⋉. Recall that the STP is a generalization of the TMM, introduced by Cheng [19]; see more information in [20]. To be more precise, consider a matrix pair (A,B) ∈ M_{m,n} × M_{p,q} and let α = lcm(n,p). The STP of A and B allows the two matrices to participate in the TMM through Kronecker multiplication (denoted by ⊗) with certain identity matrices:

A ⋉ B = (A ⊗ I_{α/n})(B ⊗ I_{α/p}) ∈ M_{αm/n, αq/p}.

Under the factor-dimension condition n = kp, we have

A ⋉ B = A(B ⊗ I_k).

Under the matching-dimension condition n = p, the product reduces to A ⋉ B = AB. The STP enjoys rich algebraic properties like those of the TMM, such as bilinearity and associativity. Moreover, STPs possess special properties that the TMM does not have, for example, pseudo-commutativity via swap matrices and algebraic formulations of logical functions. In the last decade, STPs were instrumental in developing algebraic state space theory, which integrates ideas and methods for finite state machines with those of control theory; see the survey [21].
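The STP is straightforward to implement from the displayed formula. The following self-contained sketch (our own hypothetical helper stp, assuming real matrices) checks the two reduction rules just stated, together with associativity.

```python
import numpy as np

def stp(A, B):
    # semi-tensor product A ltimes B = (A kron I_{a/n})(B kron I_{a/p}), a = lcm(n, p)
    n, p = A.shape[1], B.shape[0]
    a = np.lcm(n, p)
    return np.kron(A, np.eye(a // n)) @ np.kron(B, np.eye(a // p))

rng = np.random.default_rng(2)
A = rng.standard_normal((2, 4))   # n = 4
B = rng.standard_normal((2, 3))   # p = 2, so n = 2p: factor-dimension condition
C = rng.standard_normal((4, 5))   # p = 4 = n: matching-dimension condition
D = rng.standard_normal((3, 2))
```

For A (2×4) and B (2×3), the result lives in M_{2,6}, as predicted by the dimension formula.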

Recently, the work [22] extended the MGM notion (1.1) to any pair of positive definite matrices whose sizes satisfy a factor-dimension condition:

A ♯ B = A^{1/2} ⋉ (A^{-1/2} ⋉ B ⋉ A^{-1/2})^{1/2} ⋉ A^{1/2}. (1.5)

In fact, A ♯ B is the unique positive definite solution of the semi-tensor Riccati equation X ⋉ A^{-1} ⋉ X = B. After that, the MGMs of arbitrary weight t ∈ ℝ were studied in [23]. In particular, when t ∈ [0,1], the weighted MGMs have remarkable properties, namely monotonicity and upper semi-continuity. See Section 2 for more details.

The present paper is a continuation of the works [22,23]. Here we investigate SGMs involving STPs. We start with the matrix mean equation:

A^{-1} ♯ X = (A^{-1} ♯ B)^t,

where A and B are given positive definite matrices of different sizes, t ∈ ℝ, and X is an unknown square matrix. Here, ♯ is defined by the formula (1.5). We show that this equation has a unique positive definite solution, which is defined to be the t-weighted SGM of A and B. Another characterization of weighted SGMs is obtained in terms of certain matrix equations. It turns out that this mean satisfies various properties as in the classical case. We establish a similarity relation between the MGM and the SGM of two positive definite matrices of arbitrary dimensions. Our results generalize the work [7] and relate to the work [8]. Moreover, we investigate certain matrix equations involving weighted MGMs and SGMs.

    The paper is organized as follows. In Section 2, we set up basic notation and give basic results on STPs, Kronecker products, and weighted MGMs of positive definite matrices. In Section 3, we characterize the weighted SGM for positive definite matrices in terms of matrix equations, then we provide fundamental properties of weighted SGMs in Section 4. In Section 5, we investigate matrix equations involving weighted SGMs and MGMs. We conclude the whole work in Section 6.

Throughout, let M_{m,n} be the set of all m×n complex matrices, and abbreviate M_{n,n} to M_n. Define ℂ^n = M_{n,1}, the set of n-dimensional complex vectors. Denote by A^T and A^* the transpose and conjugate transpose of a matrix A, respectively. The n×n identity matrix is denoted by I_n. The general linear group of n×n complex matrices is denoted by GL_n, and the set of n×n positive definite matrices by P_n. A matrix pair (A,B) ∈ M_{m,n} × M_{p,q} is said to satisfy a factor-dimension condition if p ∣ n or n ∣ p. In this case, we write A ≻_k B when n = kp, and A ≺_k B when p = kn.

Recall that for any matrices A = [a_{ij}] ∈ M_{m,n} and B ∈ M_{p,q}, their Kronecker product is defined by

A ⊗ B = [a_{ij} B] ∈ M_{mp,nq}.

The Kronecker operation (A,B) ↦ A ⊗ B is bilinear and associative.

Lemma 2.1 (e.g., [5]). Let (A,B) ∈ M_{m,n} × M_{p,q}, (C,D) ∈ M_{n,r} × M_{q,s}, and (P,Q) ∈ M_m × M_n, then

(i) (A ⊗ B)^* = A^* ⊗ B^*.

(ii) (A ⊗ B)(C ⊗ D) = (AC) ⊗ (BD).

(iii) If (P,Q) ∈ GL_m × GL_n, then (P ⊗ Q)^{-1} = P^{-1} ⊗ Q^{-1}.

(iv) If (P,Q) ∈ P_m × P_n, then P ⊗ Q ∈ P_{mn} and (P ⊗ Q)^{1/2} = P^{1/2} ⊗ Q^{1/2}.

Lemma 2.2 (e.g., [20]). Let (A,B) ∈ M_{m,n} × M_{p,q} and (P,Q) ∈ M_m × M_n, then

(i) (A ⋉ B)^* = B^* ⋉ A^*.

(ii) If (P,Q) ∈ GL_m × GL_n, then (P ⋉ Q)^{-1} = Q^{-1} ⋉ P^{-1}.

(iii) det(P ⋉ Q) = (det P)^{α/m} (det Q)^{α/n}, where α = lcm(m,n).

Lemma 2.3 ([23]). For any S ∈ P_m and X ∈ GL_n, we have X^* ⋉ S ⋉ X ∈ P_α, where α = lcm(m,n).

Definition 2.4. Let (A,B) ∈ P_m × P_n and α = lcm(m,n). For any t ∈ ℝ, the t-weighted MGM of A and B is defined by

A ♯_t B = A^{1/2} ⋉ (A^{-1/2} ⋉ B ⋉ A^{-1/2})^t ⋉ A^{1/2} ∈ P_α. (2.1)

Note that A ♯_0 B = A ⊗ I_{α/m} and A ♯_1 B = B ⊗ I_{α/n}. We simply write A ♯ B = A ♯_{1/2} B. We clearly have A ♯_t B > 0 and A ♯_t A = A.
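Definition 2.4 can be prototyped directly. In the self-contained NumPy sketch below (helper names powm, stp, mgm_stp, rand_spd are ours, hypothetical, not code from [22,23]) we take m = 2 and n = 4, so that α = 4, and verify the endpoint cases A ♯_0 B = A ⊗ I_{α/m} and A ♯_1 B = B ⊗ I_{α/n}, together with the Riccati equation of Lemma 2.5 below.

```python
import numpy as np

def powm(S, t):
    # real power of a symmetric positive definite matrix
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    # semi-tensor product
    a = np.lcm(A.shape[1], B.shape[0])
    return np.kron(A, np.eye(a // A.shape[1])) @ np.kron(B, np.eye(a // B.shape[0]))

def mgm_stp(A, B, t=0.5):
    # A #_t B = A^{1/2} ltimes (A^{-1/2} ltimes B ltimes A^{-1/2})^t ltimes A^{1/2}, Eq (2.1)
    Ah, Aih = powm(A, 0.5), powm(A, -0.5)
    return stp(Ah, stp(powm(stp(Aih, stp(B, Aih)), t), Ah))

def rand_spd(n, seed):
    M = np.random.default_rng(seed).standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = rand_spd(2, 3), rand_spd(4, 4)   # m = 2, n = 4, alpha = 4
X = mgm_stp(A, B)                       # A # B, a 4x4 positive definite matrix
```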

Lemma 2.5 ([22]). Let (A,B) ∈ P_m × P_n be such that A ≺_k B, then the Riccati equation

X ⋉ A^{-1} ⋉ X = B

has a unique solution X = A ♯ B ∈ P_n.

Lemma 2.6 ([23]). Let (A,B) ∈ P_m × P_n and X, Y ∈ P_n. Let t ∈ ℝ and α = lcm(m,n), then

(i) Positive homogeneity: For any scalars a, b, c > 0, we have c(A ♯_t B) = (cA) ♯_t (cB) and, more generally,

(aA) ♯_t (bB) = a^{1-t} b^t (A ♯_t B). (2.2)

(ii) Self duality: (A ♯_t B)^{-1} = A^{-1} ♯_t B^{-1}.

(iii) Permutation invariance: A ♯_{1/2} B = B ♯_{1/2} A. More generally, A ♯_t B = B ♯_{1-t} A.

(iv) Consistency with scalars: If A ⋉ B = B ⋉ A, then A ♯_t B = A^{1-t} ⋉ B^t.

(v) Determinantal identity:

det(A ♯_t B) = (det A)^{(1-t)α/m} (det B)^{tα/n}.

(vi) Cancellability: If t ≠ 0, then the equation A ♯_t X = A ♯_t Y implies X = Y.
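Several items of Lemma 2.6 are easy to confirm numerically. The following self-contained sketch (the helpers powm, stp, mgm_stp, rand_spd are our own hypothetical names, not from [23]) uses m = 2, n = 4, hence α = 4, α/m = 2, and α/n = 1, so the determinantal identity reads det(A ♯_t B) = (det A)^{2(1-t)} (det B)^t.

```python
import numpy as np

def powm(S, t):
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    a = np.lcm(A.shape[1], B.shape[0])
    return np.kron(A, np.eye(a // A.shape[1])) @ np.kron(B, np.eye(a // B.shape[0]))

def mgm_stp(A, B, t=0.5):
    # A #_t B of Eq (2.1)
    Ah, Aih = powm(A, 0.5), powm(A, -0.5)
    return stp(Ah, stp(powm(stp(Aih, stp(B, Aih)), t), Ah))

def rand_spd(n, seed):
    M = np.random.default_rng(seed).standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B, t = rand_spd(2, 5), rand_spd(4, 6), 0.3
det_lhs = np.linalg.det(mgm_stp(A, B, t))
det_rhs = np.linalg.det(A)**((1 - t) * 2) * np.linalg.det(B)**t
```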

    In this section, we define and characterize weighted SGMs in terms of certain matrix equations involving MGMs and STPs.

Theorem 3.1. Let (A,B) ∈ P_m × P_n. Let t ∈ ℝ and α = lcm(m,n), then the mean equation

A^{-1} ♯ X = (A^{-1} ♯ B)^t (3.1)

has a unique solution X ∈ P_α.

Proof. Note that the matrix pair (A, X) satisfies the factor-dimension condition. Let Y = (A^{-1} ♯ B)^t and consider

X = Y ⋉ A ⋉ Y.

Using Lemma 2.5, we obtain that Y = A^{-1} ♯ X. Thus, A^{-1} ♯ X = (A^{-1} ♯ B)^t. For the uniqueness, let Z ∈ P_α be such that A^{-1} ♯ Z = Y. By Lemma 2.5, we get

Z = Y ⋉ A ⋉ Y = X.

    We call the matrix X in Theorem 3.1 the t-weighted SGM of A and B.

Definition 3.2. Let (A,B) ∈ P_m × P_n and α = lcm(m,n). For any t ∈ ℝ, the t-weighted SGM of A and B is defined by

A ♮_t B = (A^{-1} ♯ B)^t ⋉ A ⋉ (A^{-1} ♯ B)^t ∈ M_α. (3.2)

According to Lemma 2.3, we have A ♮_t B ∈ P_α. In particular, A ♮_0 B = A ⊗ I_{α/m} and A ♮_1 B = B ⊗ I_{α/n}. When t = 1/2, we simply write A ♮ B = A ♮_{1/2} B. The formula (3.2) implies that

A ♮_t A = A,  A ♮_t A^{-1} = A^{1-2t} (3.3)

for any t ∈ ℝ. Note that in the case n ∣ m, we have

A ♮_t B = (A^{-1} ♯ B)^t A (A^{-1} ♯ B)^t,

i.e., Eq (3.2) reduces to the same formula (1.4) as in the classical case m = n. By Theorem 3.1, we have

A^{-1} ♯ (A ♮_t B) = (A^{-1} ♯ B)^t = (B ♮_t A)^{-1} ♯ B.
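Definition 3.2 and the defining equation (3.1) can be cross-checked numerically. The self-contained sketch below (helpers powm, stp, mgm_stp, sgm_stp, rand_spd are our own hypothetical names) verifies, for m = 2 and n = 4, that X = A ♮_t B indeed satisfies A^{-1} ♯ X = (A^{-1} ♯ B)^t, along with the special values A ♮_0 B = A ⊗ I_{α/m} and A ♮_t A = A.

```python
import numpy as np

def powm(S, t):
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    a = np.lcm(A.shape[1], B.shape[0])
    return np.kron(A, np.eye(a // A.shape[1])) @ np.kron(B, np.eye(a // B.shape[0]))

def mgm_stp(A, B, t=0.5):
    # A #_t B of Eq (2.1)
    Ah, Aih = powm(A, 0.5), powm(A, -0.5)
    return stp(Ah, stp(powm(stp(Aih, stp(B, Aih)), t), Ah))

def sgm_stp(A, B, t=0.5):
    # A natural_t B = (A^{-1} # B)^t ltimes A ltimes (A^{-1} # B)^t, Eq (3.2)
    Yt = powm(mgm_stp(np.linalg.inv(A), B), t)
    return stp(Yt, stp(A, Yt))

def rand_spd(n, seed):
    M = np.random.default_rng(seed).standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B, t = rand_spd(2, 7), rand_spd(4, 8), 0.7
X = sgm_stp(A, B, t)                      # the t-weighted SGM
Y = mgm_stp(np.linalg.inv(A), B)          # Y = A^{-1} # B
```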

    The following theorem provides another characterization of the weighted SGMs.

Theorem 3.3. Let (A,B) ∈ P_m × P_n. Let t ∈ ℝ and α = lcm(m,n), then the following are equivalent:

(i) X = A ♮_t B.

(ii) There exists a positive definite matrix Y ∈ P_α such that

X = Y^t ⋉ A ⋉ Y^t = Y^{t-1} ⋉ B ⋉ Y^{t-1}. (3.4)

Moreover, the matrix Y satisfying (3.4) is uniquely determined by Y = A^{-1} ♯ B.

Proof. Let X = A ♮_t B. Set Y = A^{-1} ♯ B ∈ P_α. By Definition 3.2, we have X = Y^t ⋉ A ⋉ Y^t. By Lemma 2.5, we get Y ⋉ A ⋉ Y = B ⊗ I_{α/n}. Hence,

Y^{t-1} ⋉ B ⋉ Y^{t-1} = Y^t ⋉ (Y^{-1} ⋉ B ⋉ Y^{-1}) ⋉ Y^t = Y^t ⋉ A ⋉ Y^t = X.

To show the uniqueness, let Z ∈ P_α be such that

X = Z^t ⋉ A ⋉ Z^t = Z^{t-1} ⋉ B ⋉ Z^{t-1}.

We have Z ⋉ A ⋉ Z = B ⊗ I_{α/n}. Note that the pair (A, B ⊗ I_{α/n}) satisfies the factor-dimension condition. Now, Lemma 2.5 implies that Z = A^{-1} ♯ B = Y.

Conversely, suppose there exists a matrix Y ∈ P_α such that Eq (3.4) holds; then Y ⋉ A ⋉ Y = B ⊗ I_{α/n}. Applying Lemma 2.5, we have Y = A^{-1} ♯ B. Therefore,

X = (A^{-1} ♯ B)^t ⋉ A ⋉ (A^{-1} ♯ B)^t = A ♮_t B.

Fundamental properties of the weighted SGM (3.2) are as follows.

Theorem 4.1. Let (A,B) ∈ P_m × P_n, t ∈ ℝ, and α = lcm(m,n), then

(i) Permutation invariance: A ♮_t B = B ♮_{1-t} A. In particular, A ♮ B = B ♮ A.

(ii) Positive homogeneity: c(A ♮_t B) = (cA) ♮_t (cB) for all c > 0. More generally, for any scalars a, b > 0, we have

(aA) ♮_t (bB) = a^{1-t} b^t (A ♮_t B).

(iii) Self-duality: (A ♮_t B)^{-1} = A^{-1} ♮_t B^{-1}.

(iv) Unitary invariance: For any unitary U ∈ U_α, we have

U ⋉ (A ♮_t B) ⋉ U^* = (U ⋉ A ⋉ U^*) ♮_t (U ⋉ B ⋉ U^*). (4.1)

(v) Consistency with scalars: If A ⋉ B = B ⋉ A, then A ♮_t B = A^{1-t} ⋉ B^t.

(vi) Determinantal identity:

det(A ♮_t B) = (det A)^{(1-t)α/m} (det B)^{tα/n}.

(vii) Left and right cancellability: For any t ∈ ℝ∖{0} and Y_1, Y_2 ∈ P_n, the equation

A ♮_t Y_1 = A ♮_t Y_2

implies Y_1 = Y_2. For any t ∈ ℝ∖{1} and X_1, X_2 ∈ P_m, the equation X_1 ♮_t B = X_2 ♮_t B implies X_1 = X_2. In other words, the maps X ↦ A ♮_t X and X ↦ X ♮_t B are injective for any t ≠ 0, 1.

(viii) (A ♮ B)^2 is positively similar to A ⋉ B, i.e., there is a matrix P ∈ P_α such that

(A ♮ B)^2 = P ⋉ (A ⋉ B) ⋉ P^{-1}.

In particular, (A ♮ B)^2 and A ⋉ B have the same eigenvalues.

Proof. Throughout this proof, let X = A ♮_t B and Y = A^{-1} ♯ B. From Theorem 3.3, the characteristic equation (3.4) holds.

To prove (i), set Z = B ♮_{1-t} A and W = B^{-1} ♯ A. By Theorem 3.3, we get

Z = W^{1-t} ⋉ B ⋉ W^{1-t} = W^{-t} ⋉ A ⋉ W^{-t}.

It follows from Lemma 2.6(ii) and (iii) that

W^{-1} = B ♯ A^{-1} = A^{-1} ♯ B = Y.

Hence, X = Y^t ⋉ A ⋉ Y^t = W^{-t} ⋉ A ⋉ W^{-t} = Z, i.e., A ♮_t B = B ♮_{1-t} A.

The assertion (ii) follows directly from the formulas (3.2) and (2.2):

(aA) ♮_t (bB) = ((aA)^{-1} ♯ (bB))^t ⋉ (aA) ⋉ ((aA)^{-1} ♯ (bB))^t = (a^{-1}b)^{t/2} (A^{-1} ♯ B)^t ⋉ (aA) ⋉ (a^{-1}b)^{t/2} (A^{-1} ♯ B)^t = (a^{-1}b)^t a (A^{-1} ♯ B)^t ⋉ A ⋉ (A^{-1} ♯ B)^t = a^{1-t} b^t (A ♮_t B).

To prove the self-duality (iii), set W = Y^{-1} = A ♯ B^{-1}. Observe that

X^{-1} = (Y^t ⋉ A ⋉ Y^t)^{-1} = Y^{-t} ⋉ A^{-1} ⋉ Y^{-t} = W^t ⋉ A^{-1} ⋉ W^t,  X^{-1} = (Y^{t-1} ⋉ B ⋉ Y^{t-1})^{-1} = Y^{1-t} ⋉ B^{-1} ⋉ Y^{1-t} = W^{t-1} ⋉ B^{-1} ⋉ W^{t-1}.

Theorem 3.3 now implies that

(A ♮_t B)^{-1} = X^{-1} = A^{-1} ♮_t B^{-1}.

To prove (iv), let U ∈ U_α and consider W = U ⋉ Y ⋉ U^*. We have

W^t ⋉ (U ⋉ A ⋉ U^*) ⋉ W^t = U ⋉ Y^t ⋉ U^* ⋉ U ⋉ A ⋉ U^* ⋉ U ⋉ Y^t ⋉ U^* = U ⋉ Y^t ⋉ A ⋉ Y^t ⋉ U^* = U ⋉ X ⋉ U^*,

and, similarly,

W^{t-1} ⋉ (U ⋉ B ⋉ U^*) ⋉ W^{t-1} = U ⋉ Y^{t-1} ⋉ B ⋉ Y^{t-1} ⋉ U^* = U ⋉ X ⋉ U^*.

By Theorem 3.3, we arrive at (4.1).

For the assertion (v), the assumption A ⋉ B = B ⋉ A together with Lemma 2.6(iv) yields

Y = A^{-1} ♯ B = A^{-1/2} ⋉ B^{1/2}.

It follows that

Y^t ⋉ A ⋉ Y^t = A^{-t/2} ⋉ B^{t/2} ⋉ A ⋉ A^{-t/2} ⋉ B^{t/2} = A^{1-t} ⋉ B^t,  Y^{t-1} ⋉ B ⋉ Y^{t-1} = A^{(1-t)/2} ⋉ B^{(t-1)/2} ⋉ B ⋉ A^{(1-t)/2} ⋉ B^{(t-1)/2} = A^{1-t} ⋉ B^t.

Now, Theorem 3.3 implies that A ♮_t B = A^{1-t} ⋉ B^t. The determinantal identity (vi) follows directly from the formula (3.2), Lemma 2.2(iii), and Lemma 2.6(v):

det(A ♮_t B) = det(A^{-1} ♯ B)^{2t} (det A)^{α/m} = (det A)^{-tα/m} (det B)^{tα/n} (det A)^{α/m} = (det A)^{(1-t)α/m} (det B)^{tα/n}.

To prove the left cancellability, let t ∈ ℝ∖{0} and suppose that A ♮_t Y_1 = A ♮_t Y_2. We have

(A^{1/2} ⋉ (A^{-1} ♯ Y_1)^t ⋉ A^{1/2})^2 = A^{1/2} ⋉ (A ♮_t Y_1) ⋉ A^{1/2} = A^{1/2} ⋉ (A ♮_t Y_2) ⋉ A^{1/2} = (A^{1/2} ⋉ (A^{-1} ♯ Y_2)^t ⋉ A^{1/2})^2.

Taking the positive square root yields

A^{1/2} ⋉ (A^{-1} ♯ Y_1)^t ⋉ A^{1/2} = A^{1/2} ⋉ (A^{-1} ♯ Y_2)^t ⋉ A^{1/2},

and, thus, (A^{-1} ♯ Y_1)^t = (A^{-1} ♯ Y_2)^t. Since t ≠ 0, we get A^{-1} ♯ Y_1 = A^{-1} ♯ Y_2. Using the cancellability of the MGM (Lemma 2.6(vi)), we obtain Y_1 = Y_2. The right cancellability follows from the left cancellability together with the permutation invariance (i).

For the assertion (viii), since A ♮ B = Y^{1/2} ⋉ A ⋉ Y^{1/2} = Y^{-1/2} ⋉ B ⋉ Y^{-1/2}, we have

(A ♮ B)^2 = (Y^{1/2} ⋉ A ⋉ Y^{1/2}) ⋉ (Y^{-1/2} ⋉ B ⋉ Y^{-1/2}) = Y^{1/2} ⋉ (A ⋉ B) ⋉ Y^{-1/2}.

Note that the matrix Y^{1/2} is positive definite. Thus, (A ♮ B)^2 is positively similar to A ⋉ B, so they have the same eigenvalues.

Remark 4.2. Let (A,B) ∈ P_m × P_n. Instead of Definition 3.2, the permutation invariance (i) provides an alternative expression of A ♮_t B as follows:

A ♮_t B = (B^{-1} ♯ A)^{1-t} ⋉ B ⋉ (B^{-1} ♯ A)^{1-t} = (A ♯ B^{-1})^{1-t} ⋉ B ⋉ (A ♯ B^{-1})^{1-t}.

In particular, if m ∣ n, we have

A ♮_t B = (A ♯ B^{-1})^{1-t} B (A ♯ B^{-1})^{1-t}.

The assertion (viii) is the reason why A ♮ B is called the SGM.
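The permutation invariance (i) and the spectral property (viii) can be checked numerically. The self-contained sketch below (helpers powm, stp, mgm_stp, sgm_stp, rand_spd are our own hypothetical names) uses m = 2, n = 4, α = 4, and compares the sorted spectra of (A ♮ B)^2 and A ⋉ B.

```python
import numpy as np

def powm(S, t):
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    a = np.lcm(A.shape[1], B.shape[0])
    return np.kron(A, np.eye(a // A.shape[1])) @ np.kron(B, np.eye(a // B.shape[0]))

def mgm_stp(A, B, t=0.5):
    Ah, Aih = powm(A, 0.5), powm(A, -0.5)
    return stp(Ah, stp(powm(stp(Aih, stp(B, Aih)), t), Ah))

def sgm_stp(A, B, t=0.5):
    # A natural_t B of Eq (3.2)
    Yt = powm(mgm_stp(np.linalg.inv(A), B), t)
    return stp(Yt, stp(A, Yt))

def rand_spd(n, seed):
    M = np.random.default_rng(seed).standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = rand_spd(2, 9), rand_spd(4, 10)
S = sgm_stp(A, B)                                        # A natural B
spec_sq  = np.sort(np.linalg.eigvals(S @ S).real)        # spectrum of (A natural B)^2
spec_stp = np.sort(np.linalg.eigvals(stp(A, B)).real)    # spectrum of A ltimes B
```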

Now, we will show how the MGM A ♯ B and the weighted SGMs A ♮_t B are related through positive similarity when A and B are positive definite matrices of arbitrary sizes. Before that, we need the following lemma.

Lemma 4.3. Let (A,B) ∈ P_m × P_n. Let t ∈ ℝ and α = lcm(m,n), then there exists a unique Y_t ∈ P_α such that

A ♮_t B = Y_t ⋉ A ⋉ Y_t  and  B ♮_t A = Y_t^{-1} ⋉ B ⋉ Y_t^{-1}.

Proof. Set Y_t = (A^{-1} ♯ B)^t; then Y_t ⋉ A ⋉ Y_t = A ♮_t B. Using Lemma 2.6, we obtain that

Y_t^{-1} ⋉ B ⋉ Y_t^{-1} = (B^{-1} ♯ A)^t ⋉ B ⋉ (B^{-1} ♯ A)^t = B ♮_t A.

To prove the uniqueness, let Z_t ∈ P_α be such that Z_t ⋉ A ⋉ Z_t = A ♮_t B and Z_t^{-1} ⋉ B ⋉ Z_t^{-1} = B ♮_t A. By Lemma 2.5, we get Z_t = A^{-1} ♯ (A ♮_t B), but Theorem 3.1 says that

A^{-1} ♯ (A ♮_t B) = (A^{-1} ♯ B)^t.

Thus, Z_t = Y_t.

Theorem 4.4. Let (A,B) ∈ P_m × P_n. Let t ∈ ℝ and α = lcm(m,n), then A ♯ B is positively similar to (A ♮_{1-t} B)^{1/2} U (A ♮_t B)^{1/2} for some unitary U ∈ M_α.

Proof. By Lemma 4.3, there exists Y_t = (A^{-1} ♯ B)^t ∈ P_α such that A ♮_t B = Y_t ⋉ A ⋉ Y_t. Write Y = A^{-1} ♯ B, so that Y_t = Y^t and A ♮_{1-t} B = Y^{1-t} ⋉ A ⋉ Y^{1-t}. Using Lemmas 2.2 and 2.5, we have

Y^t ⋉ (A ♮_{1-t} B) ⋉ Y^t = B ⊗ I_{α/n} = (A ♯ B) ⋉ A^{-1} ⋉ (A ♯ B) = (A ♯ B) ⋉ Y^t ⋉ (A ♮_t B)^{-1} ⋉ Y^t ⋉ (A ♯ B);

then

((A ♮_t B)^{-1/2} ⋉ Y^t ⋉ (A ♯ B) ⋉ Y^t ⋉ (A ♮_t B)^{-1/2})^2 = (A ♮_t B)^{-1/2} ⋉ Y^{2t} ⋉ (A ♮_{1-t} B) ⋉ Y^{2t} ⋉ (A ♮_t B)^{-1/2}.

Thus,

A ♯ B = Y^{-t} ⋉ (A ♮_t B)^{1/2} ⋉ ((A ♮_t B)^{-1/2} ⋉ Y^{2t} ⋉ (A ♮_{1-t} B) ⋉ Y^{2t} ⋉ (A ♮_t B)^{-1/2})^{1/2} ⋉ (A ♮_t B)^{1/2} ⋉ Y^{-t}.

Set V = (A ♮_t B)^{-1/2} ⋉ Y^{2t} ⋉ (A ♮_{1-t} B)^{1/2} and U = V^{-1}(V V^*)^{1/2}. Obviously, U is a unitary matrix. We obtain

A ♯ B = Y^{-t} ⋉ (A ♮_t B)^{1/2} ⋉ (V V^*)^{1/2} ⋉ (A ♮_t B)^{1/2} ⋉ Y^{-t} = Y^t ⋉ (A ♮_{1-t} B)^{1/2} ⋉ V^{-1}(V V^*)^{1/2} ⋉ (A ♮_t B)^{1/2} ⋉ Y^{-t} = Y^t ⋉ (A ♮_{1-t} B)^{1/2} ⋉ U ⋉ (A ♮_t B)^{1/2} ⋉ Y^{-t}.

This implies that (A ♮_{1-t} B)^{1/2} U (A ♮_t B)^{1/2} is positively similar to A ♯ B, the similarity being implemented by the positive definite matrix Y^t.

In general, the MGM A ♯_t B and the SGM A ♮_t B are not comparable (in the Löwner partial order). We will show that A ♯_t B and A ♮_t B coincide in the case that A and B commute with respect to the STP. To do this, we need a lemma.

Lemma 4.5. Let (P,Q) ∈ P_m × P_n. If

P ⋉ Q ⋉ P ⋉ Q^{-1} = Q ⋉ P ⋉ Q^{-1} ⋉ P, (4.2)

then P ⋉ Q = Q ⋉ P.

Proof. From Eq (4.2), we have

(Q^{-1/2} ⋉ P ⋉ Q^{1/2})(Q^{1/2} ⋉ P ⋉ Q^{-1/2}) = (Q^{1/2} ⋉ P ⋉ Q^{-1/2})(Q^{-1/2} ⋉ P ⋉ Q^{1/2}).

This implies that Q^{-1/2} ⋉ P ⋉ Q^{1/2} is a normal matrix. Since Q^{-1/2} ⋉ P ⋉ Q^{1/2} and P ⊗ I_{α/m} are similar matrices, we conclude that the eigenvalues of Q^{-1/2} ⋉ P ⋉ Q^{1/2} are real, and a normal matrix with real eigenvalues is Hermitian. Hence,

Q^{-1/2} ⋉ P ⋉ Q^{1/2} = (Q^{-1/2} ⋉ P ⋉ Q^{1/2})^* = Q^{1/2} ⋉ P ⋉ Q^{-1/2}.

Therefore, P ⋉ Q = Q ⋉ P.

The next theorem generalizes [7, Theorem 5.1].

Theorem 4.6. Let (A,B) ∈ P_m × P_n and t ∈ ℝ. If A ⋉ B = B ⋉ A, then A ♯_t B = A ♮_t B. In particular, A ♯ B = A ♮ B if and only if A ⋉ B = B ⋉ A.

Proof. Suppose A ⋉ B = B ⋉ A. By Lemma 2.6 and Theorem 4.1, we have

A ♯_t B = A^{1-t} ⋉ B^t = A ♮_t B.

Next, assume that A ♯ B = A ♮ B = X. By Lemma 2.5, we have

X ⋉ A^{-1} ⋉ X = B ⊗ I_{α/n}.

Set Y = A^{-1} ♯ B. By Theorem 3.3, we get X = Y^{1/2} ⋉ A ⋉ Y^{1/2} = Y^{-1/2} ⋉ B ⋉ Y^{-1/2}. It follows that

Y^{1/2} ⋉ X ⋉ Y^{1/2} = B ⊗ I_{α/n} = X ⋉ A^{-1} ⋉ X = X ⋉ Y^{1/2} ⋉ X^{-1} ⋉ Y^{1/2} ⋉ X.

Thus,

Y^{1/2} ⋉ X ⋉ Y^{1/2} ⋉ X^{-1} = X ⋉ Y^{1/2} ⋉ X^{-1} ⋉ Y^{1/2}.

Lemma 4.5 implies that X ⋉ Y^{1/2} = Y^{1/2} ⋉ X. Hence,

A ⋉ B = A ⋉ Y ⋉ A ⋉ Y = Y^{-1/2} ⋉ X^2 ⋉ Y^{1/2} = X^2 = Y^{1/2} ⋉ X^2 ⋉ Y^{-1/2} = Y ⋉ A ⋉ Y ⋉ A = B ⋉ A.
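Theorem 4.6 is easy to probe numerically: a commuting pair of different sizes can be manufactured by taking B to be a function of A ⊗ I. The self-contained sketch below (helpers powm, stp, mgm_stp, sgm_stp, rand_spd are our own hypothetical names) checks that the two means agree for such a pair, including a weight outside [0,1].

```python
import numpy as np

def powm(S, t):
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    a = np.lcm(A.shape[1], B.shape[0])
    return np.kron(A, np.eye(a // A.shape[1])) @ np.kron(B, np.eye(a // B.shape[0]))

def mgm_stp(A, B, t=0.5):
    Ah, Aih = powm(A, 0.5), powm(A, -0.5)
    return stp(Ah, stp(powm(stp(Aih, stp(B, Aih)), t), Ah))

def sgm_stp(A, B, t=0.5):
    Yt = powm(mgm_stp(np.linalg.inv(A), B), t)
    return stp(Yt, stp(A, Yt))

def rand_spd(n, seed):
    M = np.random.default_rng(seed).standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A = rand_spd(2, 11)
B = np.kron(powm(A, 2.0) + np.eye(2), np.eye(2))   # B = (A^2 + I) kron I_2, so A ltimes B = B ltimes A
```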

Theorem 4.7. Let (A,B) ∈ P_m × P_n and α = lcm(m,n), then the following statements are equivalent:

(i) A ♮ B = I_α,

(ii) A ⊗ I_{α/m} = B^{-1} ⊗ I_{α/n},

(iii) A ♯ B = I_α.

Proof. First, we show the equivalence between the statements (i) and (ii). Suppose that A ♮ B = I_α. Letting Y = A^{-1} ♯ B, we have by Theorem 3.3 that

Y^{1/2} ⋉ A ⋉ Y^{1/2} = Y^{-1/2} ⋉ B ⋉ Y^{-1/2} = I_α.

Applying Lemma 2.1, we obtain

A ⊗ I_{α/m} = Y^{-1} = B^{-1} ⊗ I_{α/n}.

Now, suppose A ⊗ I_{α/m} = B^{-1} ⊗ I_{α/n}. By Lemma 2.1, we have

A ⋉ B = (A ⊗ I_{α/m})(B ⊗ I_{α/n}) = (B^{-1} ⊗ I_{α/n})(B ⊗ I_{α/n}) = I_n ⊗ I_{α/n} = I_α,

and similarly, B ⋉ A = I_α. Now, Theorem 4.1(v) implies that

A ♮ B = A^{1/2} ⋉ B^{1/2} = (B^{-1/2} ⊗ I_{α/n})(B^{1/2} ⊗ I_{α/n}) = I_α.

Next, we show the equivalence between (ii) and (iii). Suppose that A ♯ B = I_α; then we have

(A^{-1/2} ⋉ B ⋉ A^{-1/2})^{1/2} = A^{-1/2} ⋉ I_α ⋉ A^{-1/2} = A^{-1} ⊗ I_{α/m}.

This implies that

A^{-1/2} ⋉ B ⋉ A^{-1/2} = (A^{-1} ⊗ I_{α/m})^2 = A^{-2} ⊗ I_{α/m}.

Thus, B ⊗ I_{α/n} = A^{-1} ⊗ I_{α/m}, or A ⊗ I_{α/m} = B^{-1} ⊗ I_{α/n}.

Now, suppose (ii) holds; then we get A ⋉ B = I_α = B ⋉ A. It follows from Lemma 2.6(iv) that A ♯ B = A^{1/2} ⋉ B^{1/2} = I_α.

In particular, from Theorem 4.7, when m = n, we have that A ♮ B = I_n if and only if A = B^{-1}, if and only if A ♯ B = I_n. This result was included in [7] and is related to the work [8].
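Theorem 4.7 can be confirmed on an example: choosing B = A^{-1} ⊗ I_2 with m = 2, n = 4, α = 4 makes condition (ii) hold, and both means collapse to the identity. A self-contained sketch (helpers powm, stp, mgm_stp, sgm_stp, rand_spd are our own hypothetical names):

```python
import numpy as np

def powm(S, t):
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    a = np.lcm(A.shape[1], B.shape[0])
    return np.kron(A, np.eye(a // A.shape[1])) @ np.kron(B, np.eye(a // B.shape[0]))

def mgm_stp(A, B, t=0.5):
    Ah, Aih = powm(A, 0.5), powm(A, -0.5)
    return stp(Ah, stp(powm(stp(Aih, stp(B, Aih)), t), Ah))

def sgm_stp(A, B, t=0.5):
    Yt = powm(mgm_stp(np.linalg.inv(A), B), t)
    return stp(Yt, stp(A, Yt))

def rand_spd(n, seed):
    M = np.random.default_rng(seed).standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A = rand_spd(2, 12)
B = np.kron(np.linalg.inv(A), np.eye(2))   # then A kron I_{a/m} = B^{-1} kron I_{a/n}
```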

In this section, we investigate matrix equations involving MGMs and SGMs of positive definite matrices. In particular, recall that the work [23] investigated the matrix equation A ♯_t X = B. In the next theorem, we discuss this matrix equation when the MGM ♯_t is replaced by the SGM ♮_t.

Theorem 5.1. Let (A,B) ∈ P_m × P_n, where m ∣ n. Let t ∈ ℝ∖{0}, then the mean equation

A ♮_t X = B, (5.1)

in an unknown X ∈ P_n, is equivalent to the Riccati equation

W_t ⋉ A ⋉ W_t = B (5.2)

in an unknown W_t ∈ P_n. Moreover, Eq (5.1) has a unique solution, given by

X = A ♮_{1/t} B = (A ♯ B^{-1})^{1-1/t} ⋉ B ⋉ (A ♯ B^{-1})^{1-1/t}. (5.3)

Proof. Let us denote W_t = (A^{-1} ♯ X)^t for each t ∈ ℝ∖{0}. By Definition 3.2, we have

A ♮_t X = (A^{-1} ♯ X)^t ⋉ A ⋉ (A^{-1} ♯ X)^t = W_t ⋉ A ⋉ W_t.

Note that the map X ↦ W_t is injective due to the cancellability of the MGM ♯ (Lemma 2.6(vi)). Thus, Eq (5.1) is equivalent to the Riccati equation (5.2). Now, Lemma 2.5 implies that Eq (5.2) is equivalent to W_t = A^{-1} ♯ B. Thus, Eq (5.1) is equivalent to the equation

(A^{-1} ♯ X)^t = A^{-1} ♯ B. (5.4)

We now solve (5.4). Indeed, we have

A^{-1} ♯ X = (A^{-1} ♯ B)^{1/t}.

According to Theorem 3.1 and Definition 3.2, this equation has a unique solution, namely the SGM of A and B with weight 1/t. Now, Remark 4.2 provides the explicit formula (5.3) for A ♮_{1/t} B.

Remark 5.2. For the case n ∣ m in Theorem 5.1, we get a similar result. In particular, in the case m ∣ n, the mean equation

A ♮ X = B (5.5)

has a unique solution X = (A^{-1} ♯ B) ⋉ B ⋉ (A^{-1} ♯ B).
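Theorem 5.1 yields a constructive solver for the mean equation A ♮_t X = B: simply form X = A ♮_{1/t} B. A numerical check, in a self-contained sketch (helpers powm, stp, mgm_stp, sgm_stp, rand_spd are our own hypothetical names), with m = 2 dividing n = 4:

```python
import numpy as np

def powm(S, t):
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    a = np.lcm(A.shape[1], B.shape[0])
    return np.kron(A, np.eye(a // A.shape[1])) @ np.kron(B, np.eye(a // B.shape[0]))

def mgm_stp(A, B, t=0.5):
    Ah, Aih = powm(A, 0.5), powm(A, -0.5)
    return stp(Ah, stp(powm(stp(Aih, stp(B, Aih)), t), Ah))

def sgm_stp(A, B, t=0.5):
    Yt = powm(mgm_stp(np.linalg.inv(A), B), t)
    return stp(Yt, stp(A, Yt))

def rand_spd(n, seed):
    M = np.random.default_rng(seed).standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B, t = rand_spd(2, 13), rand_spd(4, 14), 2.0
X = sgm_stp(A, B, 1 / t)    # candidate solution A natural_{1/t} B of Eq (5.1)
lhs = sgm_stp(A, X, t)      # should reproduce B
```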

Theorem 5.3. Let (A,B) ∈ P_m × P_n. Let t ∈ ℝ and α = lcm(m,n), then the equation

(A ♯ X) ♯_t (B ♯ X) = I_α (5.6)

has a unique solution X = A^{-1} ♮_t B^{-1} ∈ P_α.

Proof. For the case t = 0, Lemma 2.5 tells us that the equation A ♯ X = I_α has a unique solution

X = A^{-1} ⊗ I_{α/m} = A^{-1} ♮_0 B^{-1}.

Now, assume that t ≠ 0. To prove the uniqueness, let U = A ♯ X and V = B ♯ X, then

U ⋉ A^{-1} ⋉ U = X = V ⋉ B^{-1} ⋉ V.

Since U ♯_t V = I_α, we obtain (U^{-1/2} V U^{-1/2})^t = U^{-1} and, thus, V = U^{(t-1)/t}. It follows that

B ⊗ I_{α/n} = V ⋉ X^{-1} ⋉ V = V ⋉ U^{-1} ⋉ A ⋉ U^{-1} ⋉ V = U^{-1/t} ⋉ A ⋉ U^{-1/t}.

Using Lemma 2.5, we have that U^{-1/t} = A^{-1} ♯ B and, thus, U = (A^{-1} ♯ B)^{-t}. Hence,

X = U ⋉ A^{-1} ⋉ U = (A^{-1} ♯ B)^{-t} ⋉ A^{-1} ⋉ (A^{-1} ♯ B)^{-t} = (A ♯ B^{-1})^t ⋉ A^{-1} ⋉ (A ♯ B^{-1})^t = A^{-1} ♮_t B^{-1}.

Corollary 5.4. Let (A,B) ∈ P_m × P_n and α = lcm(m,n), then the equation

A ♯ X = B ♯ X^{-1} (5.7)

has a unique solution X = A^{-1} ♮ B ∈ P_α.

Proof. Equation (5.7) and Lemma 2.6 imply that

(A ♯ X)^{-1} = (B ♯ X^{-1})^{-1} = B^{-1} ♯ X.

Thus, Eq (5.7) is equivalent to the following equation:

(A ♯ X) ♯_{1/2} (B^{-1} ♯ X) = I_α.

Now, the desired solution follows from the case t = 1/2 in Theorem 5.3, with B replaced by B^{-1}.

In particular, when m = n and A = B, the equation A ♯ X = A ♯ X^{-1} has a unique solution X = A^{-1} ♮ A = A^0 = I_n by Eq (3.3).
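Corollary 5.4 can also be verified numerically: with X = A^{-1} ♮ B, the two sides A ♯ X and B ♯ X^{-1} agree. A self-contained sketch (helpers powm, stp, mgm_stp, sgm_stp, rand_spd are our own hypothetical names), in the classical square case m = n = 3 where the STP reduces to the ordinary product:

```python
import numpy as np

def powm(S, t):
    w, V = np.linalg.eigh(S)
    return (V * w**t) @ V.T

def stp(A, B):
    a = np.lcm(A.shape[1], B.shape[0])
    return np.kron(A, np.eye(a // A.shape[1])) @ np.kron(B, np.eye(a // B.shape[0]))

def mgm_stp(A, B, t=0.5):
    Ah, Aih = powm(A, 0.5), powm(A, -0.5)
    return stp(Ah, stp(powm(stp(Aih, stp(B, Aih)), t), Ah))

def sgm_stp(A, B, t=0.5):
    Yt = powm(mgm_stp(np.linalg.inv(A), B), t)
    return stp(Yt, stp(A, Yt))

def rand_spd(n, seed):
    M = np.random.default_rng(seed).standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

A, B = rand_spd(3, 15), rand_spd(3, 16)
X = sgm_stp(np.linalg.inv(A), B)   # X = A^{-1} natural B, the solution of Eq (5.7)
```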

Theorem 5.5. Let (A,B) ∈ P_m × P_n, t ∈ ℝ, and α = lcm(m,n), then the equation

(A ♯ X) ♮_t (B ♯ X) = I_α (5.8)

has a unique solution X = A^{-1} ♮_t B^{-1} ∈ P_α.

Proof. If t = 0, the equation A ♯ X = I_α has a unique solution X = A^{-1} ⊗ I_{α/m} = A^{-1} ♮_0 B^{-1}. Now, consider t ≠ 0, and let U = A ♯ X and V = B ♯ X, then

U^{-1} ⋉ A ⋉ U^{-1} = X^{-1} = V^{-1} ⋉ B ⋉ V^{-1}.

Since U ♮_t V = I_α, we have that U = (U^{-1} ♯ V)^{-2t}, i.e., U^{1/(2t)} = U ♯ V^{-1}. Applying Lemma 2.5, we get V^{-1} = U^{1/(2t)} ⋉ U^{-1} ⋉ U^{1/(2t)} = U^{(1-t)/t}. Hence,

B ⊗ I_{α/n} = V ⋉ U^{-1} ⋉ A ⋉ U^{-1} ⋉ V = U^{-1/t} ⋉ A ⋉ U^{-1/t}.

Using Lemma 2.5, we have U^{-1/t} = A^{-1} ♯ B, i.e., U = (A^{-1} ♯ B)^{-t}. Thus,

X^{-1} = (A^{-1} ♯ B)^t ⋉ A ⋉ (A^{-1} ♯ B)^t = A ♮_t B.

Hence, by the self-duality of the SGM ♮_t, we have

X = (A ♮_t B)^{-1} = A^{-1} ♮_t B^{-1}.

All results in this section seem not to have been noticed before in the literature. In particular, from Theorems 5.3 and 5.5, when m = n and A = B, the equation A ♯ X = I_n has a unique solution X = A^{-1}.

We characterized the weighted SGMs of positive definite matrices in terms of certain matrix equations involving MGMs and STPs. Indeed, for each real number t, the unique positive definite solution of the matrix equation A^{-1} ♯ X = (A^{-1} ♯ B)^t is defined to be the t-weighted SGM of A and B. We then established several properties of the weighted SGMs, such as permutation invariance, homogeneity, self-duality, unitary invariance, cancellability, and a determinantal identity. The most significant property is the fact that (A ♮ B)^2 is positively similar to A ⋉ B, so that the two matrices have the same spectrum. The results in Sections 3 and 5 include the classical weighted SGMs of matrices as special cases. Furthermore, we showed that certain equations concerning weighted SGMs and weighted MGMs of positive definite matrices have unique solutions, written explicitly as weighted SGMs of the associated matrices. In particular, the equation A ♮_t X = B can be reduced to the well-known Riccati equation. For future work, we may investigate SGMs from a differential-geometric viewpoint, such as geodesic properties.

    The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.

This research project is supported by the National Research Council of Thailand (NRCT), Grant No. N41A640234. The authors would like to thank the anonymous referees for their comments and suggestions.

The authors declare there are no conflicts of interest.



    [1] F. Safarov, K. Temurbek, D. Jamoljon, O. Temur, J. C. Chedjou, A. B. Abdusalomov, et al, Improved agricultural field segmentation in satellite imagery using TL-ResUNet architecture, Sensors, 22 (2022), 9784. https://doi.org/10.3390/s22249784
    [2] M. A. Momin, M. H. Junos, A. S. M. Khairuddin, M. S. A. Talip, Lightweight CNN model: Automated vehicle detection in aerial images. Signal Image Video P., 17 (2023), 1209–1217. https://doi.org/10.1007/s11760-022-02328-7
    [3] Y. Wang, F. Peng, M. Lu, M. A. Ikbal, Information extraction of the vehicle from high-resolution remote sensing image based on convolution neural network, Recent Adv. Electr. El., 16 (2023), 168–177. https://doi.org/10.2174/2352096515666220820174654 doi: 10.2174/2352096515666220820174654
    [4] L. Wang, Y. Shoulin, H. Alyami, A. A. Laghari, M. Rashid, J. Almotiri, et al., A novel deep learning—based single shot multibox detector model for object detection in optical remote sensing images, Geosci. Data J., 2022, 1–15. https://doi.org/10.1002/gdj3.162
    [5] R. Ghali, M. A. Akhloufi, Deep learning approaches for wildland fires remote sensing: Classification, detection, and segmentation, Remote Sens., 15 (2023), 1821. https://doi.org/10.3390/rs15071821
    [6] C. Anusha, C. Rupa, G. Samhitha, Region-based detection of ships from remote sensing satellite imagery using deep learning. In: 2022 2nd International Conference on Innovative Practices in Technology and Management (ICIPTM), 2022, https://doi.org/10.1109/ICIPTM54933.2022.9754168
    [7] Y. Chen, R. Qin, G. Zhang, H. Albanwan, Spatial-temporal analysis of traffic patterns during the COVID-19 epidemic by vehicle detection using planet remote-sensing satellite images, Remote Sens., 13 (2021), 208. https://doi.org/10.3390/rs13020208 doi: 10.3390/rs13020208
    [8] L. K, S. Karnick, M. R. Ghalib, A. Shankar, S. Khapre, I. A. Tayubi, A novel method for vehicle detection in high-resolution aerial remote sensing images using YOLT approach, Multimed. Tools Appl., 81 (2022), 23551–23566. https://doi.org/10.1007/s11042-022-12613-9 doi: 10.1007/s11042-022-12613-9
    [9] B. Wang, B. Xu, A feature fusion deep-projection convolution neural network for vehicle detection in aerial images, PLoS One, 16 (2021), e0250782. https://doi.org/10.1371/journal.pone.0250782
    [10] J. Wang, X. Teng, Z. Li, Q. Yu, Y. Bian, J. Wei, VSAI: A multi-view dataset for vehicle detection in complex scenarios using aerial images, Drones, 6 (2022), 161. https://doi.org/10.3390/drones6070161 doi: 10.3390/drones6070161
    [11] M. Alajmi, H. Alamro, F. Al-Mutiri, M. Aljebreen, K. M. Othman, A. Sayed, Exploiting remote sensing imagery for vehicle detection and classification using an artificial intelligence technique, Remote Sens., 15 (2023), 4600. https://doi.org/10.3390/rs15184600 doi: 10.3390/rs15184600
    [12] S. Javadi, M. Dahl, M. I. Pettersson, Vehicle detection in aerial images based on 3D depth maps and deep neural networks, IEEE Access, 9 (2021), 8381–8391. https://doi.org/10.1109/ACCESS.2021.3049741 doi: 10.1109/ACCESS.2021.3049741
    [13] P. Gao, T. Tian, T. Zhao, L. Li, N. Zhang, J. Tian, Double FCOS: A two-stage model utilizing FCOS for vehicle detection in various remote sensing scenes, IEEE J. STARS, 15 (2022), 4730–4743. https://doi.org/10.1109/JSTARS.2022.3181594 doi: 10.1109/JSTARS.2022.3181594
    [14] M. Ragab, H. A. Abdushkour, A. O. Khadidos, A. M. Alshareef, K. H. Alyoubi, A. O. Khadidos, Improved deep learning-based vehicle detection for urban applications using remote sensing imagery, Remote Sens., 15 (2023), 4747. https://doi.org/10.3390/rs15194747 doi: 10.3390/rs15194747
    [15] C. H. Karadal, M. C. Kaya, T. Tuncer, S. Dogan, U. R. Acharya, Automated classification of remote sensing images using multileveled MobileNetV2 and DWT technique, Expert Syst. Appl., 185 (2021), 115659. https://doi.org/10.1016/j.eswa.2021.115659 doi: 10.1016/j.eswa.2021.115659
    [16] I. Ahmed, M. Ahmad, A. Chehri, M. M. Hassan, G. Jeon, IoT enabled deep learning based framework for multiple object detection in remote sensing images, Remote Sens., 14 (2022), 4107. https://doi.org/10.3390/rs14164107 doi: 10.3390/rs14164107
    [17] Y. Alotaibi, K. Nagappan, G. Rani, S. Rajendran, Vehicle detection and classification using optimal deep learning on high-resolution remote sensing imagery for urban traffic monitoring, 2023. Preprint. https://doi.org/10.21203/rs.3.rs-3272891/v1
    [18] S. Gadamsetty, R. Ch, A. Ch, C. Iwendi, T. R. Gadekallu, Hash-based deep learning approach for remote sensing satellite imagery detection, Water, 14 (2022), 707. https://doi.org/10.3390/w14050707 doi: 10.3390/w14050707
    [19] C. Xie, C. Lin, X. Zheng, B. Gong, H. Liu, Dense sequential fusion: Point cloud enhancement using foreground mask guidance for multimodal 3D object detection, IEEE T. Instrum. Meas., 73 (2024), 9501015, https://doi.org/10.1109/TIM.2023.3332935 doi: 10.1109/TIM.2023.3332935
    [20] S. M. Alshahrani, S. S. Alotaibi, S. Al-Otaibi, M. Mousa, A. M. Hilal, A. A. Abdelmageed, et al., Optimal deep convolutional neural network for vehicle detection in remote sensing images, CMC Comput. Mater. Con, 74 (2023), 3117–3131. https://doi.org/10.32604/cmc.2023.033038
    [21] M. A. Ahmed, S. A. Althubiti, V. H. C. de Albuquerque, M. C. dos Reis, C. Shashidhar, T. S. Murthy, et al., Fuzzy wavelet neural network driven vehicle detection on remote sensing imagery, Comput. Electr. Eng., 109 (2023), 108765. https://doi.org/10.1016/j.compeleceng.2023.108765
    [22] M. Aljebreen, B. Alabduallah, H. Mahgoub, R. Allafi, M. A. Hamza, S. S. Ibrahim, et al., Integrating IoT and honey badger algorithm based ensemble learning for accurate vehicle detection and classification, Ain Shams Eng. J., 14 (2023), 102547. https://doi.org/10.1016/j.asej.2023.102547
    [23] Y. Lai, R. Ma, Y. Chen, T. Wan, R. Jiao, H. He, A pineapple target detection method in a field environment based on improved YOLOv7, Appl. Sci., 13 (2023), 2691. https://doi.org/10.3390/app13042691 doi: 10.3390/app13042691
    [24] Y. F. Shi, C. Yang, J. Wang, Y. Zheng, F. Y. Meng, L. F. Chernogor, A hybrid deep learning‐based forecasting model for the peak height of ionospheric F2 layer, Space Weather, 21 (2023), e2023SW003581. https://doi.org/10.1029/2023SW003581
    [25] B. O. Alijla, C. P. Lim, L. P. Wong, A. T. Khader, M. A. Al-Betar, An ensemble of intelligent water drop algorithm for feature selection optimization problem, Appl. Soft Comput., 65 (2018), 531–541. https://doi.org/10.1016/j.asoc.2018.02.003
    [26] S. Razakarivony, F. Jurie, Vehicle detection in aerial imagery: A small target detection benchmark, J. Vis. Commun. Image R., 34 (2016), 187–203. https://doi.org/10.1016/j.jvcir.2015.11.002 doi: 10.1016/j.jvcir.2015.11.002
    [27] F. Rottensteiner, G. Sohn, J. Jung, M. Gerke, C. Baillard, S. Benitez, U. Breitkopf, The ISPRS benchmark on urban object classification and 3D building reconstruction, ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci., 1–3 (2012), 293–298.
  • © 2024 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
