Peer-Reviewed

Identifying the Limitation of Stepwise Selection for Variable Selection in Regression Analysis

Received: 25 July 2015    Accepted: 6 August 2015    Published: 18 September 2015
Abstract

In practice, one major difficulty a researcher faces in fitting a multiple regression is selecting the significant, relevant variables, especially when there are many independent variables to choose from and the principle of parsimony must be kept in mind. A comparative study of the limitations of stepwise selection for variable selection in multiple regression analysis was therefore carried out. Regression analysis in its bivariate and multiple forms, together with stepwise procedures (forward selection, backward elimination, and stepwise selection), was employed, and the zero-order correlations were compared with the beta (β) weights to give a clearer picture of the limitations of stepwise selection. From these comparisons it was evident that including the suspected (suppressor) predictor variable, which was not significant in the bivariate case, as suggested by the stepwise selection, improved the beta weights of the other predictors in the model and the overall predictability of the model, as argued.
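To make the suppression effect described in the abstract concrete, the following is a minimal Python sketch on simulated data (the variable names, the data-generating process, and the use of statsmodels are illustrative assumptions, not the study's dataset or code). It constructs a predictor whose zero-order correlation with the response is near zero, yet whose inclusion raises the other predictor's beta weight and the model's R-squared.

import numpy as np
import statsmodels.api as sm

# Hypothetical illustration of a suppressor variable (simulated data,
# not the study's dataset).
rng = np.random.default_rng(0)
n = 500
x2 = rng.normal(size=n)                 # suspected suppressor
e = rng.normal(size=n)
x1 = 0.7 * x2 + e                       # x1 shares irrelevant variance with x2
y = e + rng.normal(scale=0.5, size=n)   # y depends only on the part of x1 unrelated to x2

# Zero-order (bivariate) correlations: x2 looks useless on its own
print("corr(y, x1):", round(np.corrcoef(y, x1)[0, 1], 3))
print("corr(y, x2):", round(np.corrcoef(y, x2)[0, 1], 3))

# Standardize so the fitted coefficients are beta weights
def z(v):
    return (v - v.mean()) / v.std()

zy, zx1, zx2 = z(y), z(x1), z(x2)

# Model with x1 alone versus model that also includes the suppressor x2
m1 = sm.OLS(zy, sm.add_constant(zx1)).fit()
m2 = sm.OLS(zy, sm.add_constant(np.column_stack([zx1, zx2]))).fit()
print("beta(x1) alone:", round(m1.params[1], 3), "R^2:", round(m1.rsquared, 3))
print("beta(x1) with x2:", round(m2.params[1], 3),
      "beta(x2):", round(m2.params[2], 3), "R^2:", round(m2.rsquared, 3))

On such data the zero-order correlation between x2 and y is close to zero, so a purely bivariate screen would discard it, yet adding it increases both the beta weight of x1 and the overall R-squared, which is the pattern the abstract attributes to a suppressor variable.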

Published in American Journal of Theoretical and Applied Statistics (Volume 4, Issue 5)
DOI 10.11648/j.ajtas.20150405.22
Page(s) 414-419
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2015. Published by Science Publishing Group

Keywords

Stepwise Selection, Suppression Effect, Regressor Weights, Correlation

Cite This Article
  • APA Style

    Akinwande Michael Olusegun, Hussaini Garba Dikko, Shehu Usman Gulumbe. (2015). Identifying the Limitation of Stepwise Selection for Variable Selection in Regression Analysis. American Journal of Theoretical and Applied Statistics, 4(5), 414-419. https://doi.org/10.11648/j.ajtas.20150405.22

  • ACS Style

    Akinwande Michael Olusegun; Hussaini Garba Dikko; Shehu Usman Gulumbe. Identifying the Limitation of Stepwise Selection for Variable Selection in Regression Analysis. Am. J. Theor. Appl. Stat. 2015, 4(5), 414-419. doi: 10.11648/j.ajtas.20150405.22

  • AMA Style

    Akinwande Michael Olusegun, Hussaini Garba Dikko, Shehu Usman Gulumbe. Identifying the Limitation of Stepwise Selection for Variable Selection in Regression Analysis. Am J Theor Appl Stat. 2015;4(5):414-419. doi: 10.11648/j.ajtas.20150405.22

  • BibTeX

    @article{10.11648/j.ajtas.20150405.22,
      author = {Akinwande Michael Olusegun and Hussaini Garba Dikko and Shehu Usman Gulumbe},
      title = {Identifying the Limitation of Stepwise Selection for Variable Selection in Regression Analysis},
      journal = {American Journal of Theoretical and Applied Statistics},
      volume = {4},
      number = {5},
      pages = {414-419},
      doi = {10.11648/j.ajtas.20150405.22},
      url = {https://doi.org/10.11648/j.ajtas.20150405.22},
      eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.ajtas.20150405.22},
      abstract = {In practice, one major difficulty a researcher faces in fitting a multiple regression is selecting the significant, relevant variables, especially when there are many independent variables to choose from and the principle of parsimony must be kept in mind. A comparative study of the limitations of stepwise selection for variable selection in multiple regression analysis was therefore carried out. Regression analysis in its bivariate and multiple forms, together with stepwise procedures (forward selection, backward elimination, and stepwise selection), was employed, and the zero-order correlations were compared with the beta (β) weights to give a clearer picture of the limitations of stepwise selection. From these comparisons it was evident that including the suspected (suppressor) predictor variable, which was not significant in the bivariate case, as suggested by the stepwise selection, improved the beta weights of the other predictors in the model and the overall predictability of the model, as argued.},
      year = {2015}
    }
    

  • RIS

    TY  - JOUR
    T1  - Identifying the Limitation of Stepwise Selection for Variable Selection in Regression Analysis
    AU  - Akinwande Michael Olusegun
    AU  - Hussaini Garba Dikko
    AU  - Shehu Usman Gulumbe
    Y1  - 2015/09/18
    PY  - 2015
    N1  - https://doi.org/10.11648/j.ajtas.20150405.22
    DO  - 10.11648/j.ajtas.20150405.22
    T2  - American Journal of Theoretical and Applied Statistics
    JF  - American Journal of Theoretical and Applied Statistics
    JO  - American Journal of Theoretical and Applied Statistics
    SP  - 414
    EP  - 419
    PB  - Science Publishing Group
    SN  - 2326-9006
    UR  - https://doi.org/10.11648/j.ajtas.20150405.22
    AB  - In practice, one major difficulty a researcher faces in fitting a multiple regression is selecting the significant, relevant variables, especially when there are many independent variables to choose from and the principle of parsimony must be kept in mind. A comparative study of the limitations of stepwise selection for variable selection in multiple regression analysis was therefore carried out. Regression analysis in its bivariate and multiple forms, together with stepwise procedures (forward selection, backward elimination, and stepwise selection), was employed, and the zero-order correlations were compared with the beta (β) weights to give a clearer picture of the limitations of stepwise selection. From these comparisons it was evident that including the suspected (suppressor) predictor variable, which was not significant in the bivariate case, as suggested by the stepwise selection, improved the beta weights of the other predictors in the model and the overall predictability of the model, as argued.
    VL  - 4
    IS  - 5
    ER  - 

Author Information
  • Akinwande Michael Olusegun, Department of Mathematics, Ahmadu Bello University, Zaria, Nigeria

  • Hussaini Garba Dikko, Department of Mathematics, Ahmadu Bello University, Zaria, Nigeria

  • Shehu Usman Gulumbe, Department of Mathematics, Usman Danfodiyo University, Sokoto, Nigeria
