Regularized Nonlinear Least Trimmed Squares Estimator in the Presence of Multicollinearity and Outliers
American Journal of Theoretical and Applied Statistics
Volume 7, Issue 4, July 2018, Pages: 156-162
Received: Apr. 17, 2018; Accepted: May 29, 2018; Published: Jun. 29, 2018
Authors
George Kemboi Kirui Keitany, Department of Statistics and Actuarial Science, Kenyatta University (KU), Nairobi, Kenya
Ananda Omutokoh Kube, Department of Statistics and Actuarial Science, Kenyatta University (KU), Nairobi, Kenya
Joseph Mutua Mutisya, Department of Statistics and Actuarial Science, Kenyatta University (KU), Nairobi, Kenya
Fundi Daniel Muriithi, Department of Statistics and Actuarial Science, Kenyatta University (KU), Nairobi, Kenya
Abstract
This study proposes a regularized robust Nonlinear Least Trimmed Squares (NLTS) estimator that relies on an elastic net penalty in nonlinear regression. The regularization parameter was selected using a robust cross-validation criterion, and the optimal model coefficients were estimated via a Newton-Raphson iteration algorithm. A Monte Carlo simulation was conducted to verify the theoretical properties outlined in the methodology under scenarios with and without multicollinearity and in the presence of outliers. The proposed procedure performed well compared to the nonlinear least squares (NLS) and NLTS estimators, yielding relatively lower values of MSE and bias. Furthermore, a real data analysis demonstrated satisfactory performance of the suggested technique.
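The abstract describes the estimator only at a high level; the following Python snippet is a minimal sketch of the underlying idea, minimizing a trimmed sum of the h smallest squared residuals plus an elastic net penalty. It uses a general-purpose optimizer in place of the authors' Newton-Raphson iteration and robust cross-validation, and the exponential model form, trimming fraction, and penalty values are illustrative assumptions rather than the paper's settings.

import numpy as np
from scipy.optimize import minimize

def nlts_elastic_net_objective(beta, x, y, h, lam, alpha, model):
    """Trimmed sum of the h smallest squared residuals plus an elastic net penalty."""
    residuals = y - model(x, beta)
    trimmed = np.sort(residuals**2)[:h]          # keep only the h smallest squared residuals
    penalty = lam * (alpha * np.sum(np.abs(beta)) +
                     (1 - alpha) * 0.5 * np.sum(beta**2))
    return np.sum(trimmed) + penalty

# Hypothetical nonlinear model y = b0 * exp(b1 * x), for illustration only
def exp_model(x, beta):
    return beta[0] * np.exp(beta[1] * x)

# Simulated data with a few gross outliers injected
rng = np.random.default_rng(0)
x = np.linspace(0, 2, 100)
y = 2.0 * np.exp(0.8 * x) + rng.normal(0, 0.2, size=x.size)
y[:5] += 10.0

n = y.size
h = int(0.75 * n)                                # trimming constant: keep 75% of observations
fit = minimize(nlts_elastic_net_objective, x0=np.array([1.0, 0.5]),
               args=(x, y, h, 0.1, 0.5, exp_model), method="Nelder-Mead")
print("Estimated coefficients:", fit.x)

Because the trimmed, L1-penalized objective is not smooth, the sketch uses a derivative-free Nelder-Mead search; the paper's Newton-Raphson scheme would instead work with a smooth approximation or the active (untrimmed) subset at each iteration.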
Keywords
Elastic Net, Multicollinearity, Regularization, Nonlinear Least Trimmed Squares, Outliers
To cite this article
George Kemboi Kirui Keitany, Ananda Omutokoh Kube, Joseph Mutua Mutisya, Fundi Daniel Muriithi, Regularized Nonlinear Least Trimmed Squares Estimator in the Presence of Multicollinearity and Outliers, American Journal of Theoretical and Applied Statistics. Vol. 7, No. 4, 2018, pp. 156-162. doi: 10.11648/j.ajtas.20180704.14
Copyright
Copyright © 2018. The authors retain the copyright of this article.
This article is an open access article distributed under the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/) which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.