
On Model Selection In Robust Linear Regression






On Model Selection In Robust Linear Regression


Author: Guoqi Qian
Language: English
Publisher:
Release Date: 1996

On Model Selection In Robust Linear Regression, written by Guoqi Qian, was released in 1996. It is available in PDF, TXT, EPUB, Kindle, and other formats.




The Robustness Of Model Selection Rules


Author: Jochen A. Jungeilges
Language: English
Publisher: LIT Verlag Münster
Release Date: 1992

The Robustness Of Model Selection Rules, written by Jochen A. Jungeilges and published by LIT Verlag Münster, was released in 1992 in the Business & Economics category. It is available in PDF, TXT, EPUB, Kindle, and other formats.




Regression And Time Series Model Selection


Author: Allan D. R. McQuarrie
Language: English
Publisher: World Scientific
Release Date: 1998

Regression And Time Series Model Selection, written by Allan D. R. McQuarrie and published by World Scientific, was released in 1998 in the Mathematics category. It is available in PDF, TXT, EPUB, Kindle, and other formats.


This book describes procedures for selecting a model from a large set of competing statistical models. It covers model selection techniques for univariate and multivariate regression models, univariate and multivariate autoregressive models, nonparametric (including wavelet) and semiparametric regression models, and quasi-likelihood and robust regression models. Information-based model selection criteria are discussed, and their small-sample and asymptotic properties are presented. The book also provides examples and large-scale simulation studies comparing the performance of information-based criteria, bootstrapping, and cross-validation selection methods over a wide range of models.
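To make the idea of an information-based criterion concrete, here is a minimal sketch (not taken from the book; the `aic_linear` helper and the simulated data are illustrative assumptions): a Gaussian-likelihood AIC, n·log(RSS/n) + 2(k+1), scored over a small set of competing linear models, with the smallest value winning.

```python
import numpy as np

def aic_linear(y, X):
    """AIC for a Gaussian linear model fitted by least squares:
    n*log(RSS/n) + 2*(k+1), dropping additive constants."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    return n * np.log(rss / n) + 2 * (k + 1)

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(scale=0.5, size=n)  # x2 is irrelevant

candidates = {
    "intercept only": np.ones((n, 1)),
    "x1": np.column_stack([np.ones(n), x1]),
    "x1 + x2": np.column_stack([np.ones(n), x1, x2]),
}
scores = {name: aic_linear(y, X) for name, X in candidates.items()}
best = min(scores, key=scores.get)  # criterion trades fit against model size
```

Smaller is better: the x1 model beats the intercept-only model by a wide margin here, while the extra parameter penalty discourages the needlessly larger model.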



Computing Stochastic Complexity For Robust Linear Regression Model Selection


Author: Guoqi Qian
Language: English
Publisher:
Release Date: 1997

Computing Stochastic Complexity For Robust Linear Regression Model Selection, written by Guoqi Qian, was released in 1997. It is available in PDF, TXT, EPUB, Kindle, and other formats.




Essays On Robust Model Selection And Model Averaging For Linear Models


Author: Le Chang
Language: English
Publisher:
Release Date: 2017

Essays On Robust Model Selection And Model Averaging For Linear Models, written by Le Chang, was released in 2017. It is available in PDF, TXT, EPUB, Kindle, and other formats.


Model selection is central to all applied statistical work, and selecting the variables to use in a regression model is one important example. This thesis is a collection of essays on robust model selection procedures and model averaging for linear regression models.

In the first essay, we propose robust Akaike information criteria (AIC) for MM-estimation and an adjusted robust-scale-based AIC for M- and MM-estimation. The proposed criteria retain their robustness in the presence of a high proportion of outliers, including outliers in the covariates. We compare them with other robust model selection criteria from the literature; our simulation studies show that robust AIC based on MM-estimation performs significantly better when the covariates contain outliers, and a real data example confirms this advantage.

The second essay focuses on robust versions of the "Least Absolute Shrinkage and Selection Operator" (lasso). The adaptive lasso performs simultaneous parameter estimation and variable selection, and the adaptive weights in its penalty term allow it to achieve the oracle property. We propose an extension named the Tukey-lasso: by replacing squared loss with Tukey's biweight criterion, the Tukey-lasso is resistant to outliers in both the response and the covariates. Importantly, we demonstrate that the Tukey-lasso also enjoys the oracle property. A fast accelerated proximal gradient (APG) algorithm is proposed and implemented for computing it. Extensive simulations show that the Tukey-lasso achieves very reliable results, including for high-dimensional data where p > n; in the presence of outliers, it offers substantial improvements over the adaptive lasso and other robust implementations of the lasso. Real data examples further demonstrate its utility.

In many statistical analyses, a single model is used for inference, ignoring the process that led to that model being selected. Many model averaging procedures have been proposed to account for this model uncertainty. In the last essay, we propose an extension of a bootstrap model averaging approach, called bootstrap lasso averaging (BLA), which uses the lasso for model selection rather than AIC or the Bayesian information criterion (BIC). The lasso improves computation speed and allows BLA to be applied even when the number of variables p is larger than the sample size n. Extensive simulations confirm that BLA has outstanding finite-sample performance, in terms of both variable selection and prediction accuracy, compared with traditional model selection and model averaging methods. Several real data examples further demonstrate improved out-of-sample predictive performance.
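The Tukey-lasso idea described above can be sketched in a few lines. This is not the thesis's APG implementation; it is a plain ISTA-style proximal-gradient loop under assumed, illustrative tuning values (`lam`, `c`, the simulated data), combining a gradient step on a Tukey-biweight loss with soft-thresholding for the L1 penalty:

```python
import numpy as np

def tukey_psi(r, c=4.685):
    """Derivative of Tukey's biweight loss; it redescends to 0 beyond c,
    so gross outliers contribute no gradient at all."""
    return np.where(np.abs(r) <= c, r * (1 - (r / c) ** 2) ** 2, 0.0)

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def tukey_lasso(X, y, lam=10.0, c=4.685, n_iter=500):
    """Proximal gradient for  sum_i rho_c(y_i - x_i'beta) + lam*||beta||_1."""
    n, p = X.shape
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # |psi'| <= 1 gives this Lipschitz bound
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = -X.T @ tukey_psi(y - X @ beta, c)
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

rng = np.random.default_rng(1)
n, p = 100, 5
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.0, 0.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(scale=0.3, size=n)
y[:10] += 10.0                       # gross outliers in the response
beta_hat = tukey_lasso(X, y)         # outliers are ignored, zeros recovered
```

Because the biweight score is zero for large residuals, the ten shifted observations simply drop out of the gradient, while the soft-threshold step drives the irrelevant coefficients toward zero.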



The Quest For Robust Model Selection Methods In Linear Regression


Author: Prakash Borpatra Gohain
Language: English
Publisher:
Release Date: 2022

The Quest For Robust Model Selection Methods In Linear Regression, written by Prakash Borpatra Gohain, was released in 2022. It is available in PDF, TXT, EPUB, Kindle, and other formats.




Robust And Misspecification Resistant Model Selection In Regression Models With Information Complexity And Genetic Algorithms


Author:
Language: English
Publisher:
Release Date: 2007

Robust And Misspecification Resistant Model Selection In Regression Models With Information Complexity And Genetic Algorithms was released in 2007. It is available in PDF, TXT, EPUB, Kindle, and other formats.


In this dissertation, we develop computationally efficient, robust, and misspecification-resistant model subset selection methods for multiple and multivariate linear regression models. Our approach is a three-way hybrid that uses the information-theoretic measure of complexity (ICOMP), computed on robust M-estimators, as the model selection criterion, integrated with genetic algorithms (GA) as the subset-search engine. Despite the rich literature on robust estimation techniques, bridging the theoretical and applied aspects of robust model subset selection has been somewhat neglected: a few information criteria in the multiple regression literature are robust, but none of them is misspecification resistant, and none generalizes to misspecified multivariate regression. In this dissertation, we introduce for the first time an ICOMP criterion that is both robust and misspecification resistant, filling this gap in the literature. More specifically, in multiple linear regression we combine robust M-estimators with misspecification-resistant ICOMP and use the new criterion as the fitness function in a GA to carry out model subset selection. For multivariate linear regression, we derive a two-stage robust Mahalanobis distance (RMD) estimator and use it in the computation of the information criteria, which again serve as the GA fitness function. Comparative studies on simulated data for both multiple and multivariate regression show that the robust, misspecification-resistant ICOMP outperforms other robust information criteria and the non-robust ICOMP computed using OLS (or MLE) when the data contain outliers and the error terms deviate from a normal distribution. Compared with all-possible-subsets selection, the GA combined with the robust, misspecification-resistant criteria proves to be an effective method that quickly finds a near-optimal subset, if not the best one, without searching the whole subset model space.
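The GA-driven subset search can be illustrated with a toy sketch. This is not the dissertation's method: a simple BIC-style score with a robust MAD-based residual scale stands in for ICOMP on M-estimators, and the GA (tournament selection, one-point crossover, bit-flip mutation) is a bare-bones assumption rather than the authors' implementation.

```python
import numpy as np

def robust_criterion(y, X, mask, pen):
    """BIC-style score with a robust (MAD-based) residual scale,
    standing in for ICOMP computed on robust estimators."""
    cols = np.flatnonzero(mask)
    if cols.size == 0:
        resid = y - np.median(y)
    else:
        beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        resid = y - X[:, cols] @ beta
    scale = 1.4826 * np.median(np.abs(resid - np.median(resid))) + 1e-12
    return len(y) * np.log(scale ** 2) + pen * cols.size

def ga_select(y, X, pop=20, gens=30, seed=0):
    """Tiny genetic algorithm over binary inclusion masks."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    pen = np.log(n)
    pop_masks = rng.random((pop, p)) < 0.5
    best_mask, best_score = None, np.inf
    for _ in range(gens):
        scores = np.array([robust_criterion(y, X, m, pen) for m in pop_masks])
        i = int(np.argmin(scores))
        if scores[i] < best_score:
            best_score, best_mask = scores[i], pop_masks[i].copy()
        nxt = []
        for _ in range(pop):
            # tournament selection of two parents
            a, b = rng.integers(pop, size=2)
            p1 = pop_masks[a] if scores[a] < scores[b] else pop_masks[b]
            a, b = rng.integers(pop, size=2)
            p2 = pop_masks[a] if scores[a] < scores[b] else pop_masks[b]
            cut = rng.integers(1, p)          # one-point crossover
            child = np.concatenate([p1[:cut], p2[cut:]])
            nxt.append(child ^ (rng.random(p) < 0.1))  # bit-flip mutation
        pop_masks = np.array(nxt)
    return best_mask, best_score

rng = np.random.default_rng(2)
n, p = 100, 6
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.4, size=n)
y[:5] += 8.0                         # a few outliers the MAD scale shrugs off
mask, score = ga_select(y, X)        # should include variables 0 and 1
```

Here the GA evaluates only a few hundred masks out of the 2^p possibilities, yet any mask that omits a true variable pays a large robust-scale penalty, so the search converges quickly.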



Robust Variable Selection In Linear Regression Models


Author: Shokrya Saleha A. Alshqaq
Language: English
Publisher:
Release Date: 2015

Robust Variable Selection In Linear Regression Models, written by Shokrya Saleha A. Alshqaq, was released in 2015 in the Linear Models (Statistics) category. It is available in PDF, TXT, EPUB, Kindle, and other formats.




Mixed Integer Linear Programming Robust Regression With Feature Selection


Author: Oleksii Omelchenko
Language: English
Publisher:
Release Date: 2014

Mixed Integer Linear Programming Robust Regression With Feature Selection, written by Oleksii Omelchenko, was released in 2014. It is available in PDF, TXT, EPUB, Kindle, and other formats.


We introduce a mixed-integer linear programming (MILP) approach for building regression models. These models can detect potential outliers and have a built-in feature selection technique. We demonstrate how to build a linear regression model, as well as a multidimensional piecewise-linear regression model that can approximate non-linear models. We compare our techniques with existing statistical approaches to building regression models, using different feature selection algorithms, on three real-world data sets. All experiments show that our approach is useful when the number of training instances is smaller than the number of predictors, and that it is more stable and gives better results than stepwise regression, the most widely used linear regression technique when a model has many features but few observations.
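A MILP regression with built-in feature selection can be sketched as follows. This is not the thesis's formulation, only an illustration in the same spirit: a least-absolute-deviations fit where binary indicators z_j gate each coefficient through a big-M constraint (|beta_j| <= M·z_j) and at most k features may be active, solved with SciPy's HiGHS-backed `milp`. The function name and all constants are assumptions.

```python
import numpy as np
from scipy.optimize import Bounds, LinearConstraint, milp

def milp_lad_select(X, y, k, M=10.0):
    """Minimise sum_i |y_i - x_i'beta| with at most k active features,
    using binaries z_j and the big-M link |beta_j| <= M * z_j."""
    n, p = X.shape
    # variable order: beta (p), t (n, residual magnitudes), z (p, binary)
    c = np.concatenate([np.zeros(p), np.ones(n), np.zeros(p)])
    I_n, I_p, Z = np.eye(n), np.eye(p), np.zeros
    A = np.block([
        [X, -I_n, Z((n, p))],         #  X beta - t <= y
        [-X, -I_n, Z((n, p))],        # -X beta - t <= -y
        [I_p, Z((p, n)), -M * I_p],   #  beta_j <= M z_j
        [-I_p, Z((p, n)), -M * I_p],  # -beta_j <= M z_j
        [Z((1, p)), Z((1, n)), np.ones((1, p))],  # sum_j z_j <= k
    ])
    ub = np.concatenate([y, -y, np.zeros(2 * p), [k]])
    integrality = np.concatenate([np.zeros(p + n), np.ones(p)])
    bounds = Bounds(
        lb=np.concatenate([-M * np.ones(p), np.zeros(n), np.zeros(p)]),
        ub=np.concatenate([M * np.ones(p), np.full(n, np.inf), np.ones(p)]),
    )
    res = milp(c=c, constraints=LinearConstraint(A, -np.inf, ub),
               integrality=integrality, bounds=bounds)
    beta = res.x[:p]
    z = np.round(res.x[p + n:]).astype(int)
    return beta, z

rng = np.random.default_rng(3)
n, p = 40, 4
X = rng.normal(size=(n, p))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.2, size=n)
beta, z = milp_lad_select(X, y, k=2)  # should activate exactly features 0 and 1
```

The L1 objective already gives some robustness to response outliers, and the cardinality constraint makes feature selection explicit rather than an afterthought of the fitting procedure.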



Robust Model Selection And Outlier Detection In Linear Regressions


Author: Lauren McCann (Ph.D.)
Language: English
Publisher:
Release Date: 2006

Robust Model Selection And Outlier Detection In Linear Regressions, written by Lauren McCann (Ph.D.), was released in 2006. It is available in PDF, TXT, EPUB, Kindle, and other formats.


Finally, we discuss the problem of outlier detection. In addition to affecting model selection, outliers can adversely influence many other outcomes of regression-based data analysis. We describe a new outlier diagnostic tool, which we call diagnostic data traces, that can be used to detect outliers and study their influence on a variety of regression statistics. We demonstrate the tool on several data sets that are considered benchmarks in the field of outlier detection.