Selection of Variables in Quantile Regression (Linear Lasso - Goal Programming)

Authors

  • Neveen Sayed Ahmed
  • Elham Abdul-Razik Ismail

Keywords

Quantile Regression - Linear Lasso - Selection of Variables - Goal Programming - Estimated Risk - Relative Estimated Risk

Abstract

Quantile regression is a statistical technique intended to estimate, and conduct inference about, conditional quantile functions. Koenker and Bassett (1978) introduced quantile regression, which models conditional quantiles as functions of predictors. Quantile regression provides complete information about the relationship between the response variable and the covariates over the entire conditional distribution, and it makes no distributional assumption about the error term in the model. The study evaluates the performance of three methods: two linear lasso methods solved by linear programming (L1-Lasso and L2-Lasso) and one goal programming method. The three methods are used to select the best subset of variables and to estimate the parameters of the quantile regression equation under four error distributions, with two sample sizes and two parameter settings for each error distribution. The study found that the estimated risk and relative estimated risk produced by the goal programming method are smaller than those of the L1-Lasso and L2-Lasso methods.
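To make the L1-Lasso idea concrete, the Python sketch below casts L1-penalized quantile regression as a linear program and solves it with scipy.optimize.linprog. This is an illustrative minimal formulation, not the authors' exact procedure; the function name, the penalty weight lam, and the synthetic data are assumptions introduced here.

```python
import numpy as np
from scipy.optimize import linprog

def l1_lasso_quantile_regression(X, y, tau=0.5, lam=1.0):
    """Illustrative L1-penalized quantile regression as a linear program.

    Minimizes  sum_i rho_tau(y_i - b0 - x_i'b) + lam * ||b||_1,
    where rho_tau(e) = tau*max(e, 0) + (1 - tau)*max(-e, 0).
    Returns (intercept b0, coefficient vector b).
    """
    n, p = X.shape
    # Decision vector: [b0+, b0-, b+ (p), b- (p), u (n), v (n)], all >= 0.
    # u and v are the positive and negative parts of the residuals.
    c = np.concatenate([[0.0, 0.0],
                        lam * np.ones(p), lam * np.ones(p),
                        tau * np.ones(n), (1.0 - tau) * np.ones(n)])
    # Equality constraints: b0 + x_i'b + u_i - v_i = y_i for each observation.
    A_eq = np.hstack([np.ones((n, 1)), -np.ones((n, 1)),
                      X, -X, np.eye(n), -np.eye(n)])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    z = res.x
    b0 = z[0] - z[1]
    beta = z[2:2 + p] - z[2 + p:2 + 2 * p]
    return b0, beta

# Hypothetical usage: only the first two coefficients are truly nonzero,
# so a suitable lam should shrink the remaining ones to (near) zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.standard_t(df=3, size=100)
b0, beta = l1_lasso_quantile_regression(X, y, tau=0.5, lam=5.0)
print("intercept:", round(b0, 3), "coefficients:", np.round(beta, 3))
```

Variables whose penalized coefficients are driven to (near) zero are effectively dropped from the model, which is how the lasso performs variable selection; the goal programming formulation compared in the paper is not sketched here.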

References

Alhamzawi, R., Yu, K. and Benoit, D. F. (2012). Bayesian adaptive Lasso quantile regression. Statistical Modelling, 12(3), 279-297.

Efron, B., Hastie, T., Johnstone, I. and Tibshirani, R. (2004). Least angle regression. The Annals of Statistics, 32, 407-499.

Fan, J. and Lv, J. (2010). A selective overview of variable selection in high dimensional feature space. Statistica Sinica, 20, 101-148.

Ismail, A. R. (2003). Curve fitting for data with outliers using fuzzy linear programming. Unpublished Ph.D. thesis, Faculty of Commerce, Al-Azhar University-Girls' Branch.

Koenker, R. and Bassett, G. (1978). Regression quantiles. Econometrica, 46, 33-50.

Li, Q., Xi, R. and Lin, N. (2010). Bayesian regularized quantile regression. Bayesian Analysis, 5, 1-24.

Li, Y. and Zhu, J. (2008). L1-Norm quantile regression. Journal of Computational and Graphical Statistics, 17, 163-185.

Miller, A. (1990). Subset selection in regression. Chapman and Hall.

Osborne, M., Presnell, B. and Turlach, B. (2000). On the Lasso and its dual. Journal of Computational and Graphical Statistics, 9, 319-337.

Schmidt, E., Berg, M., Fried, L. and Murphy, K. (2007). Group sparsity via linear time projection. Technical Report TR, Department of Computer Science, University of British Columbia, Vancouver, July.

Tibshirani, R. (1996). Regression shrinkage and selection via the Lasso. Journal of the Royal Statistical Society, Series B, 58, 267-288.

Zou, H. and Yuan, M. (2008). Regularized simultaneous model selection in multiple quantiles regression. Computational Statistics and Data Analysis, 52, 5296-5304.

Published

2014-12-15

How to Cite

Ahmed, N. S., & Ismail, E. A.-R. (2014). Selection of Variables in Quantile Regression (Linear Lasso - Goal Programming). Asian Journal of Applied Sciences, 2(6). Retrieved from https://ajouronline.com/index.php/AJAS/article/view/2011