Essays On Robust Model Selection And Model Averaging For Linear Models - eBooks Review




Essays On Robust Model Selection And Model Averaging For Linear Models


Author : Le Chang
language : en
Publisher:
Release Date : 2017

Essays On Robust Model Selection And Model Averaging For Linear Models was written by Le Chang and released in 2017. It is available in PDF, TXT, EPUB, Kindle and other formats.


Model selection is central to all applied statistical work. Selecting the variables for use in a regression model is one important example of model selection. This thesis is a collection of essays on robust model selection procedures and model averaging for linear regression models. In the first essay, we propose a robust Akaike information criterion (AIC) for MM-estimation and an adjusted robust scale-based AIC for M- and MM-estimation. The proposed criteria maintain their robustness in the presence of a high proportion of outliers, including outliers in the covariates. We compare them with other robust model selection criteria from the literature. Our simulation studies show that the robust AIC based on MM-estimation significantly outperforms the alternatives when the covariates contain outliers, and a real-data example supports the same conclusion. The second essay focuses on robust versions of the "Least Absolute Shrinkage and Selection Operator" (lasso). The adaptive lasso is a method for performing simultaneous parameter estimation and variable selection, and the adaptive weights in its penalty term allow it to achieve the oracle property. In this essay, we propose an extension of the adaptive lasso named the Tukey-lasso. By using Tukey's biweight criterion instead of squared loss, the Tukey-lasso is resistant to outliers in both the response and the covariates. Importantly, we demonstrate that the Tukey-lasso also enjoys the oracle property. A fast accelerated proximal gradient (APG) algorithm is proposed and implemented for computing the Tukey-lasso. Our extensive simulations show that the Tukey-lasso, implemented with the APG algorithm, achieves very reliable results, including for high-dimensional data where p > n.
In the presence of outliers, the Tukey-lasso offers substantial improvements in performance over the adaptive lasso and other robust implementations of the lasso, and real-data examples further demonstrate its utility. In many statistical analyses, a single model is used for inference, ignoring the process that led to that model being selected. To account for this model uncertainty, many model averaging procedures have been proposed. In the last essay, we propose an extension of a bootstrap model averaging approach, called bootstrap lasso averaging (BLA). BLA uses the lasso for model selection, in contrast to other forms of bootstrap model averaging that use AIC or the Bayesian information criterion (BIC). The lasso improves computation speed and allows BLA to be applied even when the number of variables p is larger than the sample size n. Extensive simulations confirm that BLA has outstanding finite-sample performance, in terms of both variable selection and prediction accuracy, compared with traditional model selection and model averaging methods. Several real-data examples further demonstrate the improved out-of-sample predictive performance of BLA.
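The two computational ingredients of the Tukey-lasso described above, the bounded biweight loss and the lasso shrinkage step used inside APG, can be sketched as follows. This is only a minimal illustration assuming the common tuning constant c = 4.685; the function names are mine, not the thesis's code:

```python
import numpy as np

def tukey_biweight(r, c=4.685):
    """Tukey's biweight loss: approximately quadratic near zero, but
    constant (= c**2 / 6) once |r| >= c, so gross outliers in the
    residuals contribute only a bounded amount to the objective."""
    r = np.atleast_1d(np.asarray(r, dtype=float))
    out = np.full(r.shape, c ** 2 / 6.0)
    inside = np.abs(r) <= c
    out[inside] = (c ** 2 / 6.0) * (1.0 - (1.0 - (r[inside] / c) ** 2) ** 3)
    return out

def soft_threshold(z, t):
    """Proximal operator of the l1 penalty: the shrinkage step applied
    to the coefficients in each (accelerated) proximal gradient update."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# Squared loss grows without bound, while the biweight loss saturates:
print(tukey_biweight([0.0, 1.0, 100.0]))   # last entry equals c**2/6
print(soft_threshold(np.array([-3.0, 0.5, 3.0]), 1.0))
```

In the full APG iteration, one would take a gradient step on the biweight loss of the residuals and then apply `soft_threshold` with coefficient-specific adaptive weights, which is what gives the Tukey-lasso its oracle property.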



Essays On Model Averaging


Author :
language : en
Publisher:
Release Date : 2012

Essays On Model Averaging was released in 2012. It is available in PDF, TXT, EPUB, Kindle and other formats.


This dissertation is a collection of three essays on model averaging, organized in the form of three chapters. The first chapter proposes a new model averaging estimator for the linear regression model with heteroskedastic errors. We address the issues of how to assign the weights for candidate models optimally and how to make inference based on the averaging estimator. We first derive the asymptotic distribution of the averaging estimator with fixed weights in a local asymptotic framework, which allows us to characterize the optimal weights. The optimal weights are obtained by minimizing the asymptotic mean squared error. Second, we propose a plug-in estimator of the optimal weights and use these estimated weights to construct a plug-in averaging estimator of the parameter of interest. We derive the asymptotic distribution of the proposed estimator. Third, we show that confidence intervals based on normal approximations lead to distorted inference in this context. We suggest a plug-in method to construct confidence intervals, which have good finite-sample coverage probabilities. The second chapter investigates model combination in a predictive regression. We derive the mean squared forecast error (MSFE) of the model averaging estimator in a local asymptotic framework. We show that the optimal model weights which minimize the MSFE depend on the local parameters and the covariance matrix of the predictive regression. We propose a plug-in estimator of the optimal weights and use these estimated weights to construct the forecast combination. The third chapter proposes a model averaging approach to reduce the mean squared error (MSE) and weighted integrated mean squared error (WIMSE) of kernel estimators of regression functions. At each point of estimation, we construct a weighted average of the local constant and local linear estimators. 
The optimal local and global weights for averaging are chosen to minimize the MSE and WIMSE of the averaging estimator, respectively. We propose two data-driven approaches for bandwidth and weight selection and derive the rate of convergence of the cross-validated weights to their optimal benchmark values.
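The weight-choice problem running through these chapters can be illustrated with a much cruder stand-in: average the predictions of a restricted and a full OLS model, picking the weight on a grid to minimize squared error on a hold-out split. This is only a sketch of the idea (the chapters derive plug-in weights that minimize the asymptotic MSE/MSFE instead), and the data and split below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y depends strongly on x1 and only weakly on x2 --
# the setting where averaging a restricted and a full model can beat
# either one alone.
n = 200
X = rng.normal(size=(n, 2))
y = 1.0 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(size=n)

def ols_fit_predict(X_tr, y_tr, X_new):
    beta, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)
    return X_new @ beta

# Candidate models: x1 only (restricted) vs. x1 and x2 (full).
train, val = np.arange(0, 100), np.arange(100, 200)
pred_small = ols_fit_predict(X[train][:, :1], y[train], X[val][:, :1])
pred_full = ols_fit_predict(X[train], y[train], X[val])

# Grid search for the averaging weight w on the full model,
# minimizing hold-out squared error.
grid = np.linspace(0.0, 1.0, 101)
errs = [np.mean((y[val] - (w * pred_full + (1.0 - w) * pred_small)) ** 2)
        for w in grid]
w_hat = grid[int(np.argmin(errs))]
print(f"estimated weight on the full model: {w_hat:.2f}")
```

The chapters replace this grid-plus-hold-out heuristic with plug-in weights estimated from the local parameters and the covariance matrix, together with corrected confidence intervals for the averaging estimator.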



The Robustness Of Model Selection Rules


Author : Jochen A. Jungeilges
language : en
Publisher: LIT Verlag Münster
Release Date : 1992

The Robustness Of Model Selection Rules was written by Jochen A. Jungeilges and published by LIT Verlag Münster in 1992 in the Business & Economics category. It is available in PDF, TXT, EPUB, Kindle and other formats.




Regression And Time Series Model Selection


Author : Allan D. R. McQuarrie
language : en
Publisher: World Scientific
Release Date : 1998

Regression And Time Series Model Selection was written by Allan D. R. McQuarrie and published by World Scientific in 1998 in the Mathematics category. It is available in PDF, TXT, EPUB, Kindle and other formats.


This important book describes procedures for selecting a model from a large set of competing statistical models. It includes model selection techniques for univariate and multivariate regression models, univariate and multivariate autoregressive models, nonparametric (including wavelets) and semiparametric regression models, and quasi-likelihood and robust regression models. Information-based model selection criteria are discussed, and small-sample and asymptotic properties are presented. The book also provides examples and large-scale simulation studies comparing the performance of information-based model selection criteria, bootstrapping, and cross-validation selection methods over a wide range of models.



Essays In Honor Of Subal Kumbhakar


Author : Christopher F. Parmeter
language : en
Publisher: Emerald Group Publishing
Release Date : 2024-04-05

Essays In Honor Of Subal Kumbhakar was edited by Christopher F. Parmeter and published by Emerald Group Publishing on 2024-04-05 in the Business & Economics category. It is available in PDF, TXT, EPUB, Kindle and other formats.


It is the editor’s distinct privilege to gather this collection of papers honoring Subal Kumbhakar’s many accomplishments, drawing further attention to the various areas of scholarship he has touched.



Model Selection And Inference


Author : Kenneth P. Burnham
language : en
Publisher: Springer Science & Business Media
Release Date : 2013-11-11

Model Selection And Inference was written by Kenneth P. Burnham and published by Springer Science & Business Media on 2013-11-11 in the Mathematics category. It is available in PDF, TXT, EPUB, Kindle and other formats.


Statisticians and applied scientists must often select a model to fit empirical data. This book discusses the philosophy and strategy of selecting such a model using the information theory approach pioneered by Hirotugu Akaike. This approach focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. The book includes practical applications in biology and environmental science.



Model Selection And Multimodel Inference


Author : Kenneth P. Burnham
language : en
Publisher: Springer Science & Business Media
Release Date : 2007-05-28

Model Selection And Multimodel Inference was written by Kenneth P. Burnham and published by Springer Science & Business Media on 2007-05-28 in the Mathematics category. It is available in PDF, TXT, EPUB, Kindle and other formats.


A unique and comprehensive text on the philosophy of model-based data analysis and strategy for the analysis of empirical data. The book introduces information theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. It contains several new approaches to estimating model selection uncertainty and incorporating selection uncertainty into estimates of precision. An array of examples is given to illustrate various technical issues. The text has been written for biologists and statisticians using models for making inferences from empirical data.



The Quest For Robust Model Selection Methods In Linear Regression


Author : Prakash Borpatra Gohain
language : en
Publisher:
Release Date : 2022

The Quest For Robust Model Selection Methods In Linear Regression was written by Prakash Borpatra Gohain and released in 2022. It is available in PDF, TXT, EPUB, Kindle and other formats.




On Model Selection In Robust Linear Regression


Author : Guoqi Qian
language : en
Publisher:
Release Date : 1996

On Model Selection In Robust Linear Regression was written by Guoqi Qian and released in 1996. It is available in PDF, TXT, EPUB, Kindle and other formats.




Robust And Misspecification Resistant Model Selection In Regression Models With Information Complexity And Genetic Algorithms


Author :
language : en
Publisher:
Release Date : 2007

Robust And Misspecification Resistant Model Selection In Regression Models With Information Complexity And Genetic Algorithms was released in 2007. It is available in PDF, TXT, EPUB, Kindle and other formats.


In this dissertation, we develop computationally efficient model subset selection methods for multiple and multivariate linear regression models that are both robust and misspecification resistant. Our approach is a three-way hybrid: the information-theoretic measure of complexity (ICOMP), computed on robust M-estimators, serves as the model subset selection criterion, and genetic algorithms (GA) serve as the subset-searching engine. Despite the rich literature on robust estimation techniques, bridging the theoretical and applied aspects of robust model subset selection has been somewhat neglected. A few information criteria in the multiple regression literature are robust, but none of them is misspecification resistant, and none generalizes to misspecified multivariate regression. In this dissertation, we introduce, for the first time, an ICOMP criterion that is both robust and misspecification resistant, filling this gap in the literature. More specifically, in multiple linear regression we combine robust M-estimators with misspecification-resistant ICOMP and use the new criterion as the fitness function in a GA to carry out model subset selection. For multivariate linear regression, we derive a two-stage robust Mahalanobis distance (RMD) estimator and use it in the computation of the information criteria, which again serve as the GA fitness function. Comparative studies on simulated data for both multiple and multivariate regression show that the robust and misspecification-resistant ICOMP outperforms other robust information criteria, as well as the non-robust ICOMP computed using OLS (or MLE), when the data contain outliers and the error terms deviate from a normal distribution.
Compared with all-possible-subsets selection, a GA combined with the robust and misspecification-resistant information criteria proves to be an effective method that can quickly find a near-optimal subset, if not the best one, without searching the whole subset model space.
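As a toy illustration of the GA-plus-criterion idea, the sketch below evolves binary inclusion vectors with tournament selection, uniform crossover, and bit-flip mutation, using ordinary BIC on OLS fits as the fitness function. BIC is only a simple stand-in for the robust, misspecification-resistant ICOMP; the data, tuning values, and names are all illustrative, not the dissertation's:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: 10 candidate predictors, only the first 3 matter.
n, p = 150, 10
X = rng.normal(size=(n, p))
y = X[:, 0] + 0.8 * X[:, 1] - 0.6 * X[:, 2] + rng.normal(size=n)

def bic(subset):
    """Fitness of a binary inclusion vector: BIC of the OLS fit
    (a stand-in for the robust ICOMP criterion)."""
    k = int(subset.sum())
    if k == 0:
        rss = float(np.sum(y ** 2))  # no-intercept null model
    else:
        Xs = X[:, subset.astype(bool)]
        beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        rss = float(np.sum((y - Xs @ beta) ** 2))
    return n * np.log(rss / n) + (k + 1) * np.log(n)

def ga_select(pop_size=40, n_gen=30, p_mut=0.05):
    """Minimal GA over inclusion vectors: tournament selection,
    uniform crossover, bit-flip mutation, and elitism."""
    pop = rng.integers(0, 2, size=(pop_size, p))
    for _ in range(n_gen):
        fit = np.array([bic(ind) for ind in pop])
        new = [pop[int(np.argmin(fit))].copy()]  # keep the best (elitism)
        while len(new) < pop_size:
            # Two binary tournaments pick the parents.
            a, b = rng.integers(0, pop_size, size=2)
            p1 = pop[a] if fit[a] < fit[b] else pop[b]
            a, b = rng.integers(0, pop_size, size=2)
            p2 = pop[a] if fit[a] < fit[b] else pop[b]
            # Uniform crossover, then independent bit flips.
            mask = rng.integers(0, 2, size=p).astype(bool)
            child = np.where(mask, p1, p2)
            flip = rng.random(p) < p_mut
            new.append(np.where(flip, 1 - child, child))
        pop = np.array(new)
    fit = np.array([bic(ind) for ind in pop])
    return pop[int(np.argmin(fit))]

best = ga_select()
print("selected predictors:", np.flatnonzero(best))
```

With roughly pop_size × n_gen fitness evaluations, the GA covers only a fraction of the 2^p subsets yet, thanks to elitism and selection pressure, typically settles on a subset containing the strong predictors, which is the point made above about avoiding the all-possible-subsets search.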