In this paper, we consider the regularized estimation of parameters in the multiplicative regression model, where the response variable is always positive. Hao, Lin, and Zhao [Comput. Stat. Data An., 103 (2016), 250–262] investigated an adaptive variable selection method via the least product relative error (LPRE) criterion and a lasso-type penalty with a fixed or diverging number of covariates, and showed that the resulting estimator achieves the oracle property. However, the alternating direction method of multipliers (ADMM) algorithm proposed by those authors is based on a least squares approximation of the LPRE loss function, so a well-behaved initial estimator must be determined in advance, and its convergence is not established. Through a careful introduction of auxiliary variables and a three-block reformulation, our ADMM algorithm eliminates sensitivity to initial values while ensuring convergence. In addition, by virtue of the symmetric Bregman (SB) divergence and natural extensions of the compatibility and weak cone invertibility factors, we establish nonasymptotic oracle inequalities for the $ \ell_1 $ estimation error and the prediction error, measured by the SB divergence, of the lasso penalized LPRE estimator. The proposed method is very efficient because almost every subproblem admits a closed-form solution. Extensive simulation studies are conducted to evaluate the finite-sample performance of the proposal. Finally, a real data set is analyzed to illustrate the practical utility of the proposed method.
Citation: Mingzhen Wan, Wei Chen. Nonasymptotic oracle inequalities and alternating direction method of multipliers algorithm for adaptive lasso penalized multiplicative regression[J]. AIMS Mathematics, 2026, 11(3): 6866-6909. doi: 10.3934/math.2026283
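To make the objective concrete, the sketch below implements the LPRE loss for the multiplicative model $y_i = \exp(x_i^\top\beta)\,\varepsilon_i$ in the form used by Chen et al. [4], namely $\sum_i \{y_i e^{-x_i^\top\beta} + y_i^{-1} e^{x_i^\top\beta} - 2\}$, together with an illustrative lasso-penalized solver. Note the solver is a plain proximal-gradient stand-in with hypothetical step-size and iteration parameters, not the paper's three-block ADMM.

```python
import numpy as np

def lpre_loss(beta, X, y):
    """LPRE loss for the multiplicative model y_i = exp(x_i'beta) * eps_i,
    y_i > 0: sum of y_i*exp(-x_i'b) + exp(x_i'b)/y_i - 2 over observations."""
    eta = X @ beta
    return np.sum(y * np.exp(-eta) + np.exp(eta) / y - 2.0)

def soft_threshold(v, t):
    """Elementwise soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_lpre_proxgrad(X, y, lam, step=1e-3, iters=5000):
    """Illustrative proximal-gradient solver for the lasso-penalized LPRE
    objective (a simple stand-in, NOT the paper's three-block ADMM)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        eta = X @ beta
        # Gradient of the LPRE loss with respect to beta.
        grad = X.T @ (-y * np.exp(-eta) + np.exp(eta) / y)
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta
```

Because the LPRE loss is smooth and convex in $\beta$, this simple scheme converges for a small enough step size; the paper's ADMM instead splits the problem so that each subproblem has a closed-form update.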
[1] S. Boyd, N. Parikh, E. Chu, B. Peleato, J. Eckstein, Distributed optimization and statistical learning via the alternating direction method of multipliers, Found. Trends Mach. Learn., 3 (2011), 1–122. https://doi.org/10.1561/2200000016

[2] C. Chen, B. He, Y. Ye, X. Yuan, The direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent, Math. Program., 155 (2016), 57–79. https://doi.org/10.1007/s10107-014-0826-5

[3] K. Chen, S. Guo, Y. Lin, Z. Ying, Least absolute relative error estimation, J. Am. Stat. Assoc., 105 (2010), 1104–1112. https://doi.org/10.1198/jasa.2010.tm09307

[4] K. Chen, Y. Lin, Z. Wang, Z. Ying, Least product relative error estimation, J. Multivariate Anal., 144 (2016), 91–98. https://doi.org/10.1016/j.jmva.2015.10.017

[5] Y. Chen, H. Liu, J. Ma, Local least product relative error estimation for single-index varying-coefficient multiplicative model with positive responses, J. Comput. Appl. Math., 415 (2022), 114478. https://doi.org/10.1016/j.cam.2022.114478

[6] Y. Chen, H. Ming, H. Yang, Efficient variable selection for high-dimensional multiplicative models: a novel LPRE-based approach, Stat. Papers, 65 (2024), 3713–3737. https://doi.org/10.1007/s00362-024-01545-1

[7] B. Efron, T. Hastie, I. Johnstone, R. Tibshirani, Least angle regression, Ann. Statist., 32 (2004), 407–451. https://doi.org/10.1214/009053604000000067

[8] A. T. Hammad, I. Elbatal, E. M. Almetwally, M. M. Abd El-Raouf, M. A. El-Qurashi, A. M. Gemeay, A novel robust estimator for addressing multicollinearity and outliers in Beta regression: simulation and application, AIMS Math., 10 (2025), 21549–21580. https://doi.org/10.3934/math.2025958

[9] M. Hao, Y. Lin, X. Zhao, A relative error-based approach for variable selection, Comput. Stat. Data An., 103 (2016), 250–262. https://doi.org/10.1016/j.csda.2016.05.013

[10] D. Hu, Local least product relative error estimation for varying coefficient multiplicative regression model, Acta Math. Appl. Sin. Engl. Ser., 35 (2019), 274–286. https://doi.org/10.1007/s10255-018-0794-2

[11] J. Huang, T. Sun, Z. Ying, Y. Yu, C. H. Zhang, Oracle inequalities for the lasso in the Cox model, Ann. Statist., 41 (2013), 1142–1165. https://doi.org/10.1214/13-AOS1098

[12] T. M. Khoshgoftaar, B. B. Bhattacharyya, G. D. Richardson, Predicting software errors, during development, using nonlinear regression models: a comparative study, IEEE T. Reliab., 41 (1992), 390–395. https://doi.org/10.1109/24.159804

[13] X. Li, L. Mo, X. Yuan, J. Zhang, Linearized alternating direction method of multipliers for sparse group and fused lasso models, Comput. Stat. Data An., 79 (2014), 203–221. https://doi.org/10.1016/j.csda.2014.05.017

[14] X. Liu, Y. Lin, Z. Wang, Group variable selection for relative error regression, J. Stat. Plan. Infer., 175 (2016), 40–50. https://doi.org/10.1016/j.jspi.2016.02.006

[15] P. Liu, L. Chen, M. Bai, An accelerated semi-proximal ADMM with applications to multi-block sparse optimization problems, J. Sci. Comput., 104 (2025), 30. https://doi.org/10.1007/s10915-025-02951-9

[16] S. C. Narula, J. F. Wellington, Prediction, linear regression and the minimum sum of relative errors, Technometrics, 19 (1977), 185–190. https://doi.org/10.1080/00401706.1977.10489526

[17] H. Park, L. A. Stefanski, Relative-error prediction, Stat. Probabil. Lett., 40 (1998), 227–236. https://doi.org/10.1016/S0167-7152(98)00088-1

[18] O. F. Salih Al-Rawi, Z. Y. Algamal, Geospatial modeling of under-five mortality in Iraq based on geographic weighted regression model, Netw. Model. Anal. Health Inform. Bioinform., 15 (2026), 5. https://doi.org/10.1007/s13721-025-00694-z

[19] P. J. Bickel, Y. Ritov, A. B. Tsybakov, Simultaneous analysis of Lasso and Dantzig selector, Ann. Statist., 37 (2009), 1705–1732. https://doi.org/10.1214/08-AOS620

[20] S. A. van de Geer, High-dimensional generalized linear models and the lasso, Ann. Statist., 36 (2008), 614–645. https://doi.org/10.1214/009053607000000929

[21] S. A. van de Geer, P. Bühlmann, On the conditions used to prove oracle results for the lasso, Electron. J. Statist., 3 (2009), 1360–1392. https://doi.org/10.1214/09-EJS506

[22] H. Wang, C. Leng, Unified lasso estimation by least squares approximation, J. Am. Stat. Assoc., 102 (2007), 1039–1048. https://doi.org/10.1198/016214507000000509

[23] X. Xia, Z. Liu, H. Yang, Regularized estimation for the least absolute relative error models with a diverging number of covariates, Comput. Stat. Data An., 96 (2016), 104–119. https://doi.org/10.1016/j.csda.2015.10.012

[24] F. Ye, C. H. Zhang, Rate minimaxity of the Lasso and Dantzig selector for the $\ell_q$ loss in $\ell_r$ balls, J. Mach. Learn. Res., 11 (2010), 3519–3540.

[25] H. Zhang, L. Sun, Y. Zhou, J. Huang, Oracle inequalities and selection consistency for weighted lasso in high-dimensional additive hazards model, Stat. Sinica, 27 (2017), 1903–1920. https://doi.org/10.5705/ss.202015.0075

[26] J. Zhang, Z. Feng, H. Peng, Estimation and hypothesis test for partial linear multiplicative models, Comput. Stat. Data An., 128 (2018), 87–103. https://doi.org/10.1016/j.csda.2018.06.017

[27] J. Zhang, J. Zhu, Z. Feng, Estimation and hypothesis test for single-index multiplicative models, Test, 28 (2019), 242–268. https://doi.org/10.1007/s11749-018-0586-2

[28] Q. Zhang, Q. Wang, Local least absolute relative error estimating approach for partially linear multiplicative model, Stat. Sinica, 23 (2013), 1091–1116. https://doi.org/10.5705/ss.2012.133