Confidence Intervals for the Slope in a Linear Errors-in-Variables Regression Model

Author(s):  
Leon Jay Gleser
Filomat ◽  
2017 ◽  
Vol 31 (15) ◽  
pp. 4845-4856
Author(s):  
Konrad Furmańczyk

We study consistency and asymptotic normality of LS estimators in the EV (errors-in-variables) regression model under weakly dependent errors, covering a wide range of linear and nonlinear time series. Our investigation uses the functional dependence measure of Wu [16]. Our results, which require no mixing conditions, complement the known asymptotic results for independent and dependent data obtained by Miao et al. [7]-[10].
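
As a rough illustration only (not taken from the paper, which works in a more general dependence framework), the sketch below simulates a simple errors-in-variables regression with AR(1) errors in both the equation and the measurement of the regressor, and computes the least-squares slope from the contaminated data; all model and parameter choices here are assumptions made for the example.

```python
# Minimal sketch (assumptions mine): LS slope in a simple EV model with
# AR(1) (weakly dependent) errors.
# Assumed model: y_i = theta + beta*xi_i + eps_i,  x_i = xi_i + delta_i.
import numpy as np

rng = np.random.default_rng(0)

def ar1(n, phi, sigma, rng):
    """Generate a stationary AR(1) series e_t = phi*e_{t-1} + sigma*z_t."""
    z = rng.standard_normal(n)
    e = np.empty(n)
    e[0] = sigma * z[0] / np.sqrt(1 - phi**2)
    for t in range(1, n):
        e[t] = phi * e[t - 1] + sigma * z[t]
    return e

n, theta, beta = 5000, 1.0, 2.0
xi = np.linspace(0.0, 50.0, n)               # latent regressor with growing spread
eps = ar1(n, phi=0.4, sigma=1.0, rng=rng)    # dependent equation errors
delta = ar1(n, phi=0.4, sigma=1.0, rng=rng)  # dependent measurement errors

x = xi + delta                 # observed regressor
y = theta + beta * xi + eps    # observed response

# Ordinary least-squares slope computed from the error-contaminated data
beta_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean())**2)
print(beta_hat)  # close to beta when the spread of xi dominates the error variance
```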


2006 ◽  
Vol 31 (3) ◽  
pp. 311-325 ◽  
Author(s):  
Gregory Camilli

A simple errors-in-variables regression model is presented in this article to illustrate the method of marginal maximum likelihood (MML). Given suitable estimates of reliability, the error variables, treated as nuisance variables, can be integrated out of the likelihood equations. Because the resulting marginal likelihood has a closed-form expression, the effects of measurement error can be demonstrated more clearly. Derivations are worked out in full to provide a concrete example of the marginalization strategy and to prepare students for more advanced applications of MML.
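
To make the effect of measurement error concrete, here is a minimal numerical sketch of the classical reliability-based disattenuation of a regression slope; it is an illustration under assumed normal errors and a known reliability, not the MML derivation given in the article.

```python
# Minimal sketch (assumptions mine, not the article's derivation):
# with reliability r = var(true score)/var(observed score), the naive slope
# computed from error-contaminated x is attenuated by roughly the factor r,
# and dividing by a reliability estimate disattenuates it.
import numpy as np

rng = np.random.default_rng(1)
n, beta = 20000, 1.5
true_x = rng.normal(0.0, 1.0, n)             # latent predictor
obs_x = true_x + rng.normal(0.0, 0.5, n)     # observed with measurement error
y = beta * true_x + rng.normal(0.0, 1.0, n)

reliability = np.var(true_x) / np.var(obs_x)  # about 1/(1 + 0.25) = 0.8 here
naive_slope = (np.sum((obs_x - obs_x.mean()) * (y - y.mean()))
               / np.sum((obs_x - obs_x.mean())**2))
corrected = naive_slope / reliability         # classical disattenuation

print(round(naive_slope, 3), round(corrected, 3), round(reliability, 3))
```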


1996 ◽  
Vol 12 (3) ◽  
pp. 569-580 ◽  
Author(s):  
Paul Rilstone ◽  
Michael Veall

The usual standard errors for the regression coefficients in a seemingly unrelated regression model have a substantial downward bias. Bootstrapping the standard errors does not seem to improve inferences. In this paper, Monte Carlo evidence is reported indicating that bootstrapping can yield substantially better inferences when applied to t-ratios rather than to standard errors.
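
As a rough illustration of the percentile-t idea, the sketch below bootstraps the studentized slope statistic in a single-equation OLS model rather than resampling the standard error itself; it is a simplified stand-in for the seemingly-unrelated-regressions setting studied in the paper, with all simulation settings assumed.

```python
# Minimal sketch (assumptions mine): percentile-t (bootstrap-t) interval for
# an OLS slope, bootstrapping the t-ratio rather than the standard error.
import numpy as np

rng = np.random.default_rng(2)
n, B = 50, 999
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.standard_t(df=5, size=n)   # heavy-tailed errors

def ols_slope_and_se(x, y):
    """Return the OLS slope and its conventional standard error."""
    X = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    sigma2 = resid @ resid / (len(y) - 2)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return coef[1], np.sqrt(cov[1, 1])

b_hat, se_hat = ols_slope_and_se(x, y)

# Bootstrap the studentized statistic t* = (b* - b_hat)/se*, not se itself.
t_stars = []
for _ in range(B):
    idx = rng.integers(0, n, n)                    # resample (x, y) pairs
    b_star, se_star = ols_slope_and_se(x[idx], y[idx])
    t_stars.append((b_star - b_hat) / se_star)

lo, hi = np.percentile(t_stars, [2.5, 97.5])
# Percentile-t confidence interval: note the reversed bootstrap quantiles.
print(b_hat - hi * se_hat, b_hat - lo * se_hat)
```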


1987 ◽  
Vol 15 (1) ◽  
pp. 220-233 ◽  
Author(s):  
Leon Jay Gleser ◽  
Raymond J. Carroll ◽  
Paul P. Gallo
