SELECTING THE “TRUE” REGRESSION MODEL: A NEW RANKING METHOD
Keywords: multiple linear regression, information criteria, bootstrapping, ranking method
DOI: https://doi.org/10.17654/0972361723025
Abstract
Statistical regression modeling is widely used across many areas of scientific research, and an optimal regression model, containing the most suitable predictors and their combinations, is the goal of every analysis. This paper proposes a new method for identifying the “true” regression model for an available data set. The method bootstraps and averages several established information criteria and then, assuming multivariate normality of the averages, applies the method of Zhu [1] to rank the candidate models. An extensive simulation study supports the results: the proposed method performed best at selecting the “true” regression model both in the simulations and in a real-data application. We therefore suggest the proposed method as a practical tool for choosing a suitable regression model.
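The core bootstrap-and-average step described above can be illustrated with a minimal sketch. This is not the authors' implementation (the paper uses SAS, and its final ranking relies on Zhu's multivariate method); here the ranking is simplified to choosing the model with the smallest bootstrap-averaged criterion, and all function names are hypothetical.

```python
# Hypothetical sketch of bootstrap-averaged information criteria for model
# selection. The paper's final step (Zhu's ranking method under multivariate
# normality) is replaced by simply minimizing the averaged criterion.
import numpy as np

def rss_ols(X, y):
    """Residual sum of squares from an ordinary least squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def criteria(rss, n, k):
    """AIC and BIC for a Gaussian linear model with k fitted parameters."""
    loglik = -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)
    return {"AIC": -2 * loglik + 2 * k, "BIC": -2 * loglik + k * np.log(n)}

def bootstrap_average_criteria(X, y, models, B=200, seed=0):
    """Average AIC/BIC over B bootstrap resamples for each candidate model.

    `models` maps a model label to a tuple of column indices of X.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    avg = {name: {"AIC": 0.0, "BIC": 0.0} for name in models}
    for _ in range(B):
        idx = rng.integers(0, n, size=n)      # resample rows with replacement
        Xb, yb = X[idx], y[idx]
        for name, cols in models.items():
            Xm = np.column_stack([np.ones(n), Xb[:, cols]])  # add intercept
            rss = rss_ols(Xm, yb)
            for c, v in criteria(rss, n, Xm.shape[1]).items():
                avg[name][c] += v / B
    return avg

# Toy example: y depends on x0 only, so the smaller "true" model
# should achieve the lower averaged BIC.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
y = 2.0 * X[:, 0] + rng.normal(size=100)
avg = bootstrap_average_criteria(X, y, {"x0": (0,), "x0+x1+x2": (0, 1, 2)})
best = min(avg, key=lambda m: avg[m]["BIC"])
```

Averaging over bootstrap resamples is what stabilizes the criteria: a single-sample AIC or BIC comparison can flip on noisy data, while the averaged values reflect how each model ranks across many perturbations of the data set.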
Received: February 13, 2023; Accepted: March 14, 2023; Published: April 12, 2023
References
J. Zhu, Data envelopment analysis vs principal component analysis: an illustrative study of economic performance of Chinese cities, European J. Oper. Res. 111 (1998), 50-61.
SAS Institute, Inc., SAS/STAT User’s Guide, SAS OnlineDoc 9.1.2, SAS Institute Inc., Cary, NC, 2004.
J. Neter, M. H. Kutner, C. J. Nachtsheim and W. Wasserman, Applied Linear Regression Models, 3rd ed., Richard D. Irwin, Inc., Chicago, 1996.
H. Akaike, Fitting autoregressive models for prediction, Ann. Inst. Statist. Math. 21 (1969), 243-247.
G. G. Judge, W. E. Griffiths, R. C. Hill and T. Lee, Theory and Practice of Econometrics, Wiley, New York, 1980.
T. Sawa, Information criteria for discriminating among alternative regression models, Econometrica 46 (1978), 1273-1291.
G. Schwarz, Estimating the dimension of a model, Ann. Statist. 6 (1978), 461-464.
T. Amemiya, Estimation in nonlinear simultaneous equation models, Paper presented at Institut National de la Statistique et des Études Économiques, Paris, E. Malinvaud, ed., Cahiers du Séminaire d’Économétrie, No. 19, 1976.
T. Amemiya, Advanced Econometrics, Harvard University Press, Cambridge, 1985.
R. R. Hocking, The analysis and selection of variables in linear regression, Biometrics 32 (1976), 1-49.
A. H. Al-Marshadi, New weighted information criteria to select the true regression model, Australian Journal of Basic and Applied Sciences 5(3) (2011), 317-321.
A. H. Al-Marshadi, Selecting the covariance structure in mixed model using statistical methods calibration, J. Math. Stat. 10(3) (2014), 309-315.
B. Efron, Estimating the error rate of a prediction rule: improvement on cross-validation, J. Amer. Statist. Assoc. 78 (1983), 316-331.
B. Efron, How biased is the apparent error rate of a prediction rule? J. Amer. Statist. Assoc. 81 (1986), 416-470.
B. Efron and R. J. Tibshirani, An Introduction to the Bootstrap, Chapman and Hall, New York, 1993.
X. Yan and X. Su, Linear Regression Analysis: Theory and Computing, World Scientific Publishing Co. Pte. Ltd., 2009.
M. Schmidt, D. P. Schneider and J. E. Gunn, Spectroscopic CCD surveys for quasars at large redshift, The Astronomical Journal 110(1) (1995), 70.
W. Mendenhall and T. Sincich, Regression Analysis: A Second Course in Statistics, 6th ed., Prentice Hall, Upper Saddle River, NJ, 2003.
R. Khattree and D. N. Naik, Multivariate Data Reduction and Discrimination with SAS Software, SAS Institute Inc., Cary, NC, USA, 2000.
Copyright (c) 2023 Pushpa Publishing House, Prayagraj, India

This work is licensed under a Creative Commons Attribution 4.0 International License.