Citation: Jun He. A SOR-like AVM for the maximal correlation problem[J]. AIMS Mathematics, 2018, 3(1): 253-262. doi: 10.3934/Math.2018.1.253
Canonical correlation analysis plays an important role in assessing the relationship between sets of variables. To find linear combinations of the variables in each set that are maximally correlated, Hotelling [1,2] proposed the maximal correlation problem (MCP). The optimal solution of the MCP corresponds to the global maximizer of the following equality constrained optimization problem:
$$\begin{cases} \text{maximize} & r(x) := x^T A x,\\ \text{subject to} & \|x_i\|_2 = 1,\ i = 1,2,\cdots,m, \end{cases} \tag{1.1}$$
where $x = [x_1^T, x_2^T, \cdots, x_m^T]^T \in \mathbb{R}^n$ is partitioned into blocks $x_i \in \mathbb{R}^{n_i}$ with $n = n_1 + n_2 + \cdots + n_m$, and
$$A = \begin{pmatrix} A_{11} & A_{12} & \cdots & A_{1m}\\ A_{21} & A_{22} & \cdots & A_{2m}\\ \vdots & \vdots & \ddots & \vdots\\ A_{m1} & A_{m2} & \cdots & A_{mm} \end{pmatrix} \in \mathbb{R}^{n\times n}$$
is a symmetric and positive definite matrix partitioned conformally into blocks $A_{ij} \in \mathbb{R}^{n_i \times n_j}$.
Using the method of Lagrange multipliers, we obtain the following system of equations:
$$\begin{cases} Ax = \Lambda x,\\ \|x_i\|_2 = 1,\ i = 1,2,\cdots,m, \end{cases} \tag{1.2}$$
where
$$\Lambda := \operatorname{diag}\{\lambda_1 I_{[n_1]}, \lambda_2 I_{[n_2]}, \cdots, \lambda_m I_{[n_m]}\}$$
with $\lambda_i \in \mathbb{R}$ the Lagrange multipliers and $I_{[n_i]}$ the $n_i \times n_i$ identity matrix. Problem (1.2) is the so-called multivariate eigenvalue problem (MEP) [3].
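To make the structure of (1.2) concrete, the following small helper (Python/NumPy; an illustration of ours, with hypothetical function and variable names) computes the blockwise multipliers $\lambda_i = x_i^T (Ax)_i$ implied by a feasible $x$ and the residual $\|Ax - \Lambda x\|_2$, which vanishes exactly at solutions of the MEP.

```python
import numpy as np

def mep_residual(A, x, sizes):
    """Blockwise multipliers and residual of the MEP (1.2).

    A     : n x n symmetric matrix, partitioned into blocks of the given sizes
    x     : length-n vector whose blocks x_i each have unit 2-norm
    sizes : list [n_1, ..., n_m] of block sizes
    Returns (lambdas, residual_norm) with lambda_i = x_i^T (A x)_i.
    """
    Ax = A @ x
    lambdas, offset, res = [], 0, np.zeros_like(x)
    for ni in sizes:
        xi, Axi = x[offset:offset + ni], Ax[offset:offset + ni]
        lam_i = xi @ Axi                      # multiplier implied by block i
        res[offset:offset + ni] = Axi - lam_i * xi
        lambdas.append(lam_i)
        offset += ni
    return np.array(lambdas), np.linalg.norm(res)
```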
The Horst-Jacobi algorithm, a generalization of the so-called power method, was introduced in [5] to solve the MEP; its convergence theory was established much later in [3]. A Gauss-Seidel-type algorithm was also suggested in [3], and its monotone convergence was recently established in [7]. Based on a core engine for seeking the global maximum of the MCP, the authors of [9] proposed an alternating variable method (AVM). This algorithm is proved to enjoy global and monotone convergence, and it is also shown that, for a nonnegative irreducible matrix A, the AVM converges to the global maximizer of the MCP.
In this paper, we propose a SOR-like AVM and prove its monotone convergence.
If we express the objective function $r(x)$ as a function of the block $x_i$, with the other blocks fixed, we can write
$$r_i(x) = \Big(x_i^T A_{ii} x_i + 2\,x_i^T \sum_{j\neq i} A_{ij} x_j\Big) + \sum_{k\neq i}\sum_{j\neq i} x_k^T A_{kj} x_j,$$
so that only the first group of terms depends on $x_i$. It is very clear that if $x^* = [(x_1^*)^T,\cdots,(x_m^*)^T]^T$ is a global maximizer of (1.1), then each block $x_i^*$ must solve
$$\max_{\|x_i\|_2 = 1}\ r_i\big(x_1^*,\cdots,x_{i-1}^*,x_i,x_{i+1}^*,\cdots,x_m^*\big). \tag{2.1}$$
This observation leads to the following AVM algorithm, which has been shown to be efficient in [9].
Algorithm 1 (The framework of the alternating variable method (AVM)).
Select an initial point $x^{(0)} = [(x_1^{(0)})^T,\cdots,(x_m^{(0)})^T]^T$ with $\|x_i^{(0)}\|_2 = 1$, $i = 1,2,\cdots,m$; set $k := 0$.
while the stopping criterion is not met do
    for $i = 1,2,\cdots,m$ do
        compute $x_i^{(k+1)} := \arg\max_{\|x_i\|_2 = 1} r_i\big(x_1^{(k+1)},\cdots,x_{i-1}^{(k+1)},x_i,x_{i+1}^{(k)},\cdots,x_m^{(k)}\big)$
    end for
    set $k := k+1$
end while
Denote
$$\begin{aligned} x^{(k+1)} &:= \big[(x_1^{(k+1)})^T,\cdots,(x_m^{(k+1)})^T\big]^T \in \mathbb{R}^n,\\ g_i^{(k)} &:= \sum_{j=1}^{i-1} A_{ij} x_j^{(k+1)} + \sum_{j=i+1}^{m} A_{ij} x_j^{(k)} \in \mathbb{R}^{n_i},\\ m^{(k)}(x_i) &:= r_i\big(x_1^{(k+1)},\cdots,x_{i-1}^{(k+1)},x_i,x_{i+1}^{(k)},\cdots,x_m^{(k)}\big) \in \mathbb{R},\\ c_i^{(k)} &:= m^{(k)}(x_i^{(k)}) - (x_i^{(k)})^T A_{ii} x_i^{(k)} - 2\,(x_i^{(k)})^T g_i^{(k)} \in \mathbb{R}. \end{aligned} \tag{2.2}$$
Then the $i$-th inner subproblem in Algorithm 1 can be written as
$$x_i^{(k+1)} := \arg\max_{\|x_i\|_2 = 1} m^{(k)}(x_i), \qquad m^{(k)}(x_i) = x_i^T A_{ii} x_i + 2\,x_i^T g_i^{(k)} + c_i^{(k)}.$$
Let $\lambda_i^{(k+1)}$ be the Lagrange multiplier of this subproblem. Then the global maximizer $x_i^{(k+1)}$ is characterized by the conditions
$$\big(-A_{ii} + \lambda_i^{(k+1)} I_{[n_i]}\big)\, x_i^{(k+1)} = g_i^{(k)}, \tag{2.3}$$
$$\|x_i^{(k+1)}\|_2 = 1, \tag{2.4}$$
$$\big(-A_{ii} + \lambda_i^{(k+1)} I_{[n_i]}\big) \ \text{is positive semidefinite.} \tag{2.5}$$
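The conditions (2.3)–(2.5) characterize $x_i^{(k+1)}$ through a secular equation: one seeks $\lambda_i^{(k+1)} \ge \lambda_{\max}(A_{ii})$ such that $x_i^{(k+1)} = (\lambda_i^{(k+1)} I_{[n_i]} - A_{ii})^{-1} g_i^{(k)}$ has unit norm. The following sketch (in Python/NumPy; the paper's experiments used MATLAB, and this is our own illustration rather than the exact core engine of [9]) solves the generic case of this subproblem by bisection; the hard case, where $g_i^{(k)}$ is orthogonal to the leading eigenspace of $A_{ii}$, is not handled.

```python
import numpy as np

def solve_subproblem(B, g, tol=1e-12, max_iter=200):
    """Solve max_{||x||_2 = 1} x^T B x + 2 x^T g for symmetric B.

    Returns (x, lam) satisfying the optimality conditions (2.3)-(2.5):
    (lam*I - B) x = g, lam*I - B positive semidefinite, ||x||_2 = 1.
    Only the generic ("easy") case is treated; the hard case (g orthogonal
    to the leading eigenspace of B) would need an eigenvector correction.
    """
    w, V = np.linalg.eigh(B)          # eigenvalues in ascending order
    lam_max = w[-1]
    if np.linalg.norm(g) < tol:
        # Degenerate case: the maximizer is a leading eigenvector of B.
        return V[:, -1], lam_max
    c = V.T @ g                       # coordinates of g in the eigenbasis

    def norm_x(lam):
        # ||(lam*I - B)^{-1} g||_2 for lam > lam_max, in the eigenbasis
        return np.linalg.norm(c / (lam - w))

    # Bisection on the secular equation norm_x(lam) = 1 over (lam_max, hi].
    lo, hi = lam_max, lam_max + np.linalg.norm(g) + 1.0   # norm_x(hi) < 1
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if norm_x(mid) > 1.0:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    lam = hi
    x = V @ (c / (lam - w))
    return x / np.linalg.norm(x), lam
```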
If we split the matrix $A$ as
$$A = D + U + U^T, \tag{2.6}$$
where
$$D = \operatorname{diag}\{A_{11},\cdots,A_{mm}\} \in \mathbb{R}^{n\times n} \tag{2.7}$$
is the block diagonal part of $A$, $U$ is the strictly upper block triangular part of $A$, and $\Lambda^{(k+1)} := \operatorname{diag}\{\lambda_1^{(k+1)} I_{[n_1]},\cdots,\lambda_m^{(k+1)} I_{[n_m]}\}$, then the AVM iteration (2.3) can be written in matrix form as
$$\Lambda^{(k+1)} x^{(k+1)} = (D + U^T)\, x^{(k+1)} + U x^{(k)}. \tag{2.8}$$
We give the following SOR-like AVM.
Algorithm 2 (SOR-like AVM).
Select an initial point $x^{(0)} = [(x_1^{(0)})^T,\cdots,(x_m^{(0)})^T]^T$ with $\|x_i^{(0)}\|_2 = 1$, $i = 1,2,\cdots,m$, and relaxation factors $\omega_i^{(k)} > 0$; set $k := 0$.
while the stopping criterion is not met do
    for $i = 1,2,\cdots,m$ do
        update the block $x_i^{(k+1)}$, together with the multiplier $\lambda_i^{(k+1)}$ and the normalizing factor $\xi_i^{(k)}$, by the SOR-like relaxation of the AVM step characterized by (2.9) and (2.10) below
    end for
    set $k := k+1$
end while
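The inner update of Algorithm 2 is characterized by the relations (2.9) and (2.10) below. As one concrete explicit realization (our own reading for illustration, not necessarily the author's exact procedure), one may take $y_i^{(k+1)}$ to be the AVM trial point obtained from the subproblem (2.3)–(2.5), mix it with $x_i^{(k)}$ as in (2.9), and normalize; with $\omega_i^{(k)} \equiv 1$ this reduces to Algorithm 1. A sketch of one sweep, reusing the `solve_subproblem` routine sketched earlier, is:

```python
import numpy as np

def sor_like_avm_sweep(A_blocks, x_blocks, omega):
    """One cycle (k -> k+1) of a SOR-like AVM sweep over all m blocks.

    A_blocks[i][j] holds A_ij, x_blocks[i] the current unit block x_i^(k),
    omega[i] the relaxation factor omega_i^(k).  Returns the updated blocks
    and the multipliers lambda_i^(k+1).  Illustrative sketch only.
    """
    m = len(x_blocks)
    x_new = [xi.copy() for xi in x_blocks]
    lam = np.zeros(m)
    for i in range(m):
        # g_i^(k) as in (2.2): already-updated blocks for j < i,
        # previous-sweep blocks for j > i.
        g = np.zeros_like(x_blocks[i])
        for j in range(m):
            if j < i:
                g += A_blocks[i][j] @ x_new[j]
            elif j > i:
                g += A_blocks[i][j] @ x_blocks[j]
        y, lam[i] = solve_subproblem(A_blocks[i][i], g)      # AVM trial point
        z = (1.0 - omega[i]) * x_blocks[i] + omega[i] * y    # SOR-like mix, cf. (2.9)
        x_new[i] = z / np.linalg.norm(z)                     # the norm plays the role of xi_i^(k)
    return x_new, lam
```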
From Algorithm 2, let $\lambda_i^{(k+1)}$ and $y_i^{(k+1)}$ satisfy
$$\lambda_i^{(k+1)} y_i^{(k+1)} = \sum_{j=1}^{i} A_{ij} x_j^{(k+1)} + \sum_{j=i+1}^{m} A_{ij} x_j^{(k)},$$
and let $\xi_i^{(k)} > 0$ be the normalizing factor such that
$$\xi_i^{(k)} x_i^{(k+1)} = (1-\omega_i^{(k)})\, x_i^{(k)} + \omega_i^{(k)} y_i^{(k+1)}. \tag{2.9}$$
Then, we get
$$\lambda_i^{(k+1)} \xi_i^{(k)} x_i^{(k+1)} = \lambda_i^{(k+1)}(1-\omega_i^{(k)})\, x_i^{(k)} + \omega_i^{(k)} \sum_{j=1}^{i} A_{ij} x_j^{(k+1)} + \omega_i^{(k)} \sum_{j=i+1}^{m} A_{ij} x_j^{(k)}. \tag{2.10}$$
If we define
$$\Xi^{(k)} := \operatorname{diag}\{\xi_1^{(k)} I_{[n_1]},\cdots,\xi_m^{(k)} I_{[n_m]}\}, \tag{2.11}$$
and
$$\Omega^{(k)} := \operatorname{diag}\{\omega_1^{(k)} I_{[n_1]},\cdots,\omega_m^{(k)} I_{[n_m]}\}, \tag{2.12}$$
then Algorithm 2 may be written in matrix form:
$$\big[\Lambda^{(k+1)}\Xi^{(k)} - \Omega^{(k)}(U^T + D)\big]\, x^{(k+1)} = \big[(I - \Omega^{(k)})\Lambda^{(k+1)} + \Omega^{(k)} U\big]\, x^{(k)}. \tag{2.13}$$
Then, we can get
$$\begin{aligned} r(x^{(k+1)}) &= (x^{(k+1)})^T A x^{(k+1)} = (x^{(k+1)})^T (D + U + U^T)\, x^{(k+1)}\\ &= (x^{(k+1)})^T \big[U + (\Omega^{(k)})^{-1}\Lambda^{(k+1)}\Xi^{(k)} - (\Omega^{(k)})^{-1}\big(\Lambda^{(k+1)}\Xi^{(k)} - \Omega^{(k)}(U^T + D)\big)\big]\, x^{(k+1)}\\ &= (x^{(k+1)})^T \big[U + (\Omega^{(k)})^{-1}\Lambda^{(k+1)}\Xi^{(k)}\big]\, x^{(k+1)} - (x^{(k+1)})^T (\Omega^{(k)})^{-1}\big[(I - \Omega^{(k)})\Lambda^{(k+1)} + \Omega^{(k)} U\big]\, x^{(k)}, \end{aligned} \tag{2.14}$$
and
$$\begin{aligned} r(x^{(k)}) &= (x^{(k)})^T A x^{(k)} = (x^{(k)})^T (D + U + U^T)\, x^{(k)}\\ &= (x^{(k)})^T \big[\big((I - \Omega^{(k)})\Lambda^{(k+1)} + \Omega^{(k)} U\big)(\Omega^{(k)})^{-1} - (I - \Omega^{(k)})\Lambda^{(k+1)}(\Omega^{(k)})^{-1} + U^T + D\big]\, x^{(k)}\\ &= (x^{(k+1)})^T \big[\Lambda^{(k+1)}\Xi^{(k)} - \Omega^{(k)}(U + D)\big](\Omega^{(k)})^{-1} x^{(k)} - (x^{(k)})^T \big[-(I - \Omega^{(k)})\Lambda^{(k+1)}(\Omega^{(k)})^{-1} + U^T + D\big]\, x^{(k)}. \end{aligned} \tag{2.15}$$
Hence, by (2.14) and (2.15),
$$\begin{aligned} \Delta r(x^{(k)}) &= r(x^{(k+1)}) - r(x^{(k)})\\ &= (x^{(k+1)})^T U x^{(k+1)} - (x^{(k)})^T U x^{(k)} + (x^{(k+1)})^T D x^{(k)} - (x^{(k)})^T D x^{(k)}\\ &\quad + \sum_{i=1}^{m} \frac{\lambda_i^{(k+1)}}{\omega_i^{(k)}}\big(\xi_i^{(k)} + 1 - \omega_i^{(k)}\big)\big(1 - (x_i^{(k+1)})^T x_i^{(k)}\big) \end{aligned} \tag{2.16}$$
$$= \tfrac{1}{2}\big[r(x^{(k+1)}) - r(x^{(k)})\big] - \tfrac{1}{2}(s^{(k)})^T D s^{(k)} + \tfrac{1}{2}(s^{(k)})^T \big[(\Omega^{(k)})^{-1}\Lambda^{(k+1)}\big(\Xi^{(k)} + I - \Omega^{(k)}\big)\big]\, s^{(k)}, \tag{2.17}$$
where $s^{(k)} := x^{(k+1)} - x^{(k)}$. It follows that
$$\Delta r(x^{(k)}) = (s^{(k)})^T \big[-D + (\Omega^{(k)})^{-1}\Lambda^{(k+1)}\big(\Xi^{(k)} + I - \Omega^{(k)}\big)\big]\, s^{(k)}. \tag{2.18}$$
Theorem 2.1. Let $\{x^{(k)}\}$ be the sequence generated by Algorithm 2, and suppose that the matrices $-D + (\Omega^{(k)})^{-1}\Lambda^{(k+1)}\big(\Xi^{(k)} + I - \Omega^{(k)}\big)$, $k = 0,1,2,\cdots$, are uniformly positive definite. Then $\{r(x^{(k)})\}$ is monotonically increasing and convergent, and
$$\lim_{k\to+\infty} s^{(k)} = 0. \tag{2.19}$$
Proof. Under these assumptions, and by (2.18), we know that $\Delta r(x^{(k)}) \ge 0$, so $\{r(x^{(k)})\}$ is monotonically increasing; since $r$ is bounded on the compact feasible set, $\{r(x^{(k)})\}$ converges, and therefore $\Delta r(x^{(k)}) \to 0$. Combined with the uniform positive definiteness, this yields (2.19).
Let $a_i$ denote the largest eigenvalue of $A_{ii}$. Since the sequence of matrices
$$\big\{-D + (\Omega^{(k)})^{-1}\Lambda^{(k+1)}\big(\Xi^{(k)} + I - \Omega^{(k)}\big)\big\}$$
is uniformly positive definite, we have
$$-a_i + \frac{\lambda_i^{(k+1)}\big(\xi_i^{(k)} - \omega_i^{(k)} + 1\big)}{\omega_i^{(k)}} > 0.$$
By direct computation, we have
$$\omega_i^{(k)} < \frac{\lambda_i^{(k+1)}\big(\xi_i^{(k)} + 1\big)}{\lambda_i^{(k+1)} + a_i} < \frac{\lambda_i^{(k+1)}\big(\xi_i^{(k)} + 1\big)}{\lambda_i^{(k+1)}} = \xi_i^{(k)} + 1. \tag{2.20}$$
By (2.9), if $\omega_i^{(k)} \ge 1$, then
$$\xi_i^{(k)}\big\|x_i^{(k+1)}\big\|_2 = \big\|(1-\omega_i^{(k)})\, x_i^{(k)} + \omega_i^{(k)} y_i^{(k+1)}\big\|_2 \ge \omega_i^{(k)} - (\omega_i^{(k)} - 1) = 1.$$
Then, we have
$$\omega_i^{(k)} < 2.$$
And we find that, if the monotone increase of $r(x^{(k)})$ is to be guaranteed, the relaxation factors should be restricted to $\omega_i^{(k)} < 2$, in analogy with the classical SOR method.
Lemma 2.2. ([1, Lemma 4.4]). Let $\{x^{(k)}\}$ be a bounded sequence in $\mathbb{R}^n$ with only finitely many accumulation points. If $\lim_{k\to+\infty}\|x^{(k+1)} - x^{(k)}\|_2 = 0$, then $\{x^{(k)}\}$ converges.
Theorem 2.3. Under the assumptions of Theorem 2.1, and assuming further that A has n distinct eigenvalues, the sequence $\{x^{(k)}\}$ generated by Algorithm 2 converges to a solution of the MEP (1.2).
Proof. From the compactness, we know that $\{x^{(k)}\}$ has convergent subsequences. Let $\bar{x}$ denote the limit of $\{x_i^{(k)}\}$ along such a subsequence, and let $\bar{\xi}$ and $\omega$ be the corresponding limits of $\xi_i^{(k)}$ and $\omega_i^{(k)}$ (passing to a further subsequence if necessary). Taking limits in (2.9) gives
$$\bar{\xi}\,\bar{x} = (1-\omega)\,\bar{x} + \omega \lim_{k\to+\infty} y_i^{(k+1)},$$
and
$$\Big\|\lim_{k\to+\infty} y_i^{(k+1)}\Big\|_2 = \|\bar{x}\|_2 = 1.$$
Then we get
$$\lim_{k\to+\infty} y_i^{(k+1)} = \bar{x}.$$
Therefore,
$$\begin{aligned} \lim_{k\to+\infty} \delta x^{(k)} &= \lim_{k\to+\infty}\big(A x^{(k)} - \Lambda^{(k)} x^{(k)}\big) = \lim_{k\to+\infty}\big(A x^{(k)} - \Lambda^{(k)} y^{(k)}\big)\\ &= \lim_{k\to+\infty}\big[(U + U^T + D)\, x^{(k)} - (U^T + D)\, x^{(k)} - U x^{(k-1)}\big] = \lim_{k\to+\infty} U s^{(k-1)} = 0. \end{aligned} \tag{2.21}$$
So, convergent subsequences of $\{x^{(k)}\}$ converge to solutions of the MEP (1.2).
From the compactness and the result that the MEP has only finitely many solutions [3], we know that $\{x^{(k)}\}$ has only finitely many accumulation points. Since $\lim_{k\to+\infty} s^{(k)} = 0$ by Theorem 2.1, Lemma 2.2 implies that $\{x^{(k)}\}$ converges. Moreover,
$$\big|\lambda_i^{(k+1)} - \lambda_i^{(k)}\big| \le \sum_{j=1}^{i}\|A_{ij}\|_2\,\big\|x_j^{(k+1)} - x_j^{(k)}\big\|_2 + \sum_{j=i+1}^{m}\|A_{ij}\|_2\,\big\|x_j^{(k)} - x_j^{(k-1)}\big\|_2.$$
By Theorem 2.1, we see that the right-hand side tends to zero as $k\to+\infty$; since $\{x^{(k)}\}$ converges, the multiplier sequences $\{\lambda_i^{(k)}\}$ converge as well.
We shall show that the SOR-like AVM converges globally to the global maximizer of the MCP when $A$ is a nonnegative irreducible matrix.
Theorem 2.4. Suppose that A is a nonnegative irreducible matrix. For any positive initial point $x^{(0)}$ with $\|x_i^{(0)}\|_2 = 1$, $i = 1,2,\cdots,m$, the sequence $\{x^{(k)}\}$ generated by Algorithm 2 converges to the global maximizer of the MCP (1.1).
Proof. The proof is similar to that of Theorem 3.6 in [9].
In this section, we present numerical experiments with the SOR-like AVM to show its efficiency and, most importantly, its effectiveness in seeking the global maximum of the MCP. All of our tests were conducted in MATLAB 7.0. We ran the algorithm starting from randomly generated initial points $x^{(0)}$ with $\|x_i^{(0)}\|_2 = 1$, $i = 1,\cdots,m$.
Example 1. We choose a symmetric and positive definite matrix $A$ with $m = 2$ blocks of sizes $n_1 = 50$ and $n_2 = 82$.
Example 2. The matrix $A \in \mathbb{R}^{9\times 9}$ is given by
$$A = \begin{pmatrix} I & A_{12} & A_{13}\\ A_{12}^T & I & A_{23}\\ A_{13}^T & A_{23}^T & I \end{pmatrix}, \qquad A_{12} = \begin{pmatrix} 0.636 & 0.126 & 0.059\\ -0.021 & 0.633 & 0.049\\ 0.016 & 0.157 & 0.521 \end{pmatrix},$$
$$A_{13} = \begin{pmatrix} 0.626 & 0.195 & 0.059\\ 0.035 & 0.459 & 0.129\\ 0.048 & 0.238 & 0.426 \end{pmatrix}, \qquad A_{23} = \begin{pmatrix} 0.709 & 0.050 & -0.002\\ 0.039 & 0.532 & 0.190\\ 0.067 & 0.258 & 0.299 \end{pmatrix},$$
with $m = 3$ and $n_1 = n_2 = n_3 = 3$.
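For reproducibility, the matrix of Example 2 can be assembled directly from these blocks. The snippet below (Python/NumPy, our own illustration; the paper's experiments used MATLAB 7.0) builds $A$ and runs a few sweeps of the `sor_like_avm_sweep` sketch given earlier from an arbitrary feasible starting point; the starting point and relaxation factors here are illustrative choices, not necessarily those used in the experiments.

```python
import numpy as np

# Blocks of the 9x9 matrix A of Example 2 (identity diagonal blocks).
I3 = np.eye(3)
A12 = np.array([[0.636, 0.126, 0.059],
                [-0.021, 0.633, 0.049],
                [0.016, 0.157, 0.521]])
A13 = np.array([[0.626, 0.195, 0.059],
                [0.035, 0.459, 0.129],
                [0.048, 0.238, 0.426]])
A23 = np.array([[0.709, 0.050, -0.002],
                [0.039, 0.532, 0.190],
                [0.067, 0.258, 0.299]])
A = np.block([[I3,     A12,   A13],
              [A12.T,  I3,    A23],
              [A13.T,  A23.T, I3]])

# Run a few SOR-like sweeps (illustrative parameters).
A_blocks = [[A[3 * i:3 * i + 3, 3 * j:3 * j + 3] for j in range(3)] for i in range(3)]
x = [np.array([1.0, 0.0, 0.0]) for _ in range(3)]   # feasible: each block has unit norm
omega = [1.0, 1.0, 1.0]                             # omega_i = 1 recovers the plain AVM step
for _ in range(50):
    x, lam = sor_like_avm_sweep(A_blocks, x, omega)
x_full = np.concatenate(x)
print("r(x) =", x_full @ A @ x_full)                # objective value of the MCP (1.1)
```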
In Table 1, the column "Avg. Iter. #" reports the average number of iterations needed to meet the stopping criterion, and the column "% to Global" reports the sample probability, over all random starting points, of reaching the global maximum of the MCP.
Table 1.

| Example |  | Avg. Iter. # | % to Global |
|---|---|---|---|
| Example 1 |  | 99.44 | 100.00 |
|  |  | 33.08 | 100.00 |
| Example 2 |  | 22.59 | 100.00 |
|  |  | 12.83 | 100.00 |
Furthermore, to demonstrate more clearly the performance of each algorithm, we plot the convergence history of the objective values $r(x^{(k)})$. For Example 1, the sequence $\{x^{(k)}\}$ is generated starting from
$$x^{(0)} = \Big[\tfrac{1}{\sqrt{50}},\ldots,\tfrac{1}{\sqrt{50}},\ \tfrac{1}{\sqrt{82}},\ldots,\tfrac{1}{\sqrt{82}}\Big]^T,$$
while for Example 2 the sequence $\{x^{(k)}\}$ starts from
$$x^{(0)} = [1,0,1,0,0,1,0,0,0]^T.$$
In this paper, we prove the monotone convergence of the SOR-like alternating variable method. The SOR-like AVM shows its computational advantage for different choices of the relaxation factors $\omega_i^{(k)}$.
This research is supported by the Science and Technology Foundation of Guizhou Province (Qian Ke He Ji Chu [2016]1161); the Guizhou Province Natural Science Foundation in China (Qian Jiao He KY [2016]255); the Doctoral Scientific Research Foundation of Zunyi Normal College (BS[2015]09); and the High-level Innovative Talents of Guizhou Province (Zun Ke He Ren Cai [2017]8).
The author declares no conflicts of interest in this paper.
[1] | H. Hotelling, The most predictable criterion, J. Educ. Psychol., 26 (1935), 139–142. |
[2] | H. Hotelling, Relations between two sets of variates, Biometrika, 28 (1936), 321–377. |
[3] | M. T. Chu, J. L. Watterson, On a multivariate eigenvalue problem, part I: Algebraic theory and a power method, SIAM J. Sci. Comput., 14 (1993), 1089–1106. |
[4] | M. Hanafi, J. M. F. Ten Berge, Global optimality of the successive Maxbet algorithm, Psychometrika, 68 (2003), 97–103. |
[5] | P. Horst, Relations among m sets of measures, Psychometrika, 26 (1961), 129–149. |
[6] | J. Nocedal, S. J. Wright, Numerical Optimization, 2nd edn. Springer, New York, 2006. |
[7] | L. H. Zhang, M. T. Chu, Computing absolute maximum correlation, IMA J. Numer. Anal., 32 (2012), 163–184. |
[8] | L. H. Zhang, L. Z. Liao, L. M. Sun, Towards the global solution of the maximal correlation problem, J. Global Optim., 49 (2011), 91–107. |
[9] | L. H. Zhang, L. Z. Liao, An alternating variable method for the maximal correlation problem, J. Global Optim., 54 (2012), 199–218. |