Identification Of Outliers In Time Series Data

Bibliographic Details
Main Author: Adewale Asiata Omotoyosi
Format: Thesis
Language: en_US
Subject:
Description
Summary: In regression analysis, data sets usually contain unusual observations that produce undesirable effects on least squares estimates; these unusual observations are referred to as outliers. Detecting them prior to data analysis is an important aspect of model building, and many regression diagnostic techniques have been introduced to detect these outliers. This research compares the performance of five regression diagnostic techniques based on Ordinary Least Squares (OLS) estimators, namely standardized residuals, studentized residuals, Hadi's influence measure, the Welsch-Kuh distance and Cook's distance, in detecting and identifying outliers. It is known that OLS is not robust in the presence of multiple outliers and high leverage points. Therefore, several robust regression models are used as alternatives, as their approach is a more reliable and appropriate way of handling this problem. The robust regressions are M-estimation, Least Absolute Deviation (L1), Least Median of Squares (LMS) and Least Trimmed Squares (LTS). The comparisons are made via simulation studies and real data. This research also studies the critical values of each technique, and our own critical values are computed for this research. Our results show that in some cases diagnostics based on OLS and some robust estimators give similar outcomes, detecting the same percentage of correct outlier detections. The results also show that Least Trimmed Squares is the best among all its counterparts, followed by LMS and M-estimation, while L1 performs worst.
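
As an informal illustration of the OLS-based diagnostics named in the summary, the sketch below computes standardized residuals, studentized residuals, Cook's distance and the Welsch-Kuh distance (DFFITS) with Python's statsmodels on a small simulated data set with a few planted outliers. The simulated data, the Huber M-estimator choice and the textbook cut-off values are assumptions made for this example only; they are not the thesis's simulation design or its computed critical values. Hadi's influence measure, LMS and LTS are not available in statsmodels and are only noted in comments.

# Illustrative sketch (not the thesis code): OLS-based outlier diagnostics
# and one robust alternative, using statsmodels on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 2.0 + 3.0 * x + rng.normal(scale=1.0, size=n)
y[:5] += 10.0                  # plant a few vertical outliers
X = sm.add_constant(x)

ols_fit = sm.OLS(y, X).fit()
infl = ols_fit.get_influence()

standardized = infl.resid_studentized_internal   # standardized (internally studentized) residuals
studentized = infl.resid_studentized_external    # externally studentized residuals
cooks_d, _ = infl.cooks_distance                 # Cook's distance
dffits, dffits_cut = infl.dffits                 # Welsch-Kuh distance (DFFITS) and its threshold
leverage = infl.hat_matrix_diag                  # leverages, the ingredient of measures such as Hadi's

# Common textbook cut-offs, used here purely for illustration.
flagged = {
    "standardized": np.where(np.abs(standardized) > 2.5)[0],
    "studentized": np.where(np.abs(studentized) > 2.5)[0],
    "cooks": np.where(cooks_d > 4.0 / n)[0],
    "dffits": np.where(np.abs(dffits) > dffits_cut)[0],
}
for name, idx in flagged.items():
    print(f"{name:12s} flags observations {idx.tolist()}")

# A robust alternative: M-estimation with a Huber norm via RLM. LMS and LTS
# estimators are not provided by statsmodels and would need another package.
rlm_fit = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()
print("M-estimate coefficients:", rlm_fit.params)

On data like this, the planted observations tend to be flagged by all four diagnostics, while the M-estimate's coefficients stay close to the true values; the thesis's comparison of detection rates and of its own critical values goes well beyond this toy setup.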