Modification Of Regression Models To Solve Heterogeneity Problem Using Seaweed Drying Data

Bibliographic Details
Main Author: Joshua, Ibidoja Olayemi
Format: Thesis
Language: English
Published: 2023
Subjects:
Online Access: http://eprints.usm.my/60406/1/IBIDOJA%20OLAYEMI%20JOSHUA%20-%20TESIS24.pdf
Description
Summary: Many drying parameters are involved in the seaweed drying process, and one of the problems in regression analysis is the impact of heterogeneity among these parameters. The seaweed data were collected using sensor-based smart farming technology attached to the v-Groove Hybrid Solar Drier. The proposed method used the variance inflation factor to identify the heterogeneity parameters. To determine the 15, 25, 35, and 45 highest-ranking important parameters for the seaweed, models such as ridge, random forest, support vector machine, bagging, boosting, LASSO, and elastic net were applied before removing heterogeneity, after removing heterogeneity, and for the modified model. To reduce the influence of outliers, robust regressions such as the M Huber, M Hampel, M Bisquare, MM, and S estimators were used. Before the heterogeneity parameters were excluded from the model, the hybrid of ridge regression with the M Hampel estimator performed best, with 2.14% outliers. After the heterogeneity parameters were excluded, the support vector machine with the MM estimator performed best, with 2.09% outliers. For the modified model, LASSO with the M Bisquare estimator performed best, with 1.31% outliers. Future studies could investigate the impact of heterogeneity using hybrid models with imbalanced data or missing values, and could apply ensemble machine learning algorithms such as stacking, XGBoost, and AdaBoost.
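The abstract combines two building blocks: a variance-inflation-factor (VIF) screen to flag problematic parameters, followed by a robust M-estimator fit. The sketch below is not the thesis code; it is a minimal Python illustration of those two steps, assuming hypothetical drying-parameter names and a conventional VIF cutoff of 10 rather than values taken from the study.

```python
# Minimal sketch: VIF screening of drying parameters, then a robust
# Huber M-estimator fit on the retained parameters. Column names, the
# simulated data, and the VIF cutoff of 10 are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)

# Hypothetical sensor readings; the real study uses data from the
# v-Groove Hybrid Solar Drier.
X = pd.DataFrame({
    "ambient_temp": rng.normal(30, 3, 200),
    "chamber_temp": rng.normal(55, 5, 200),
    "humidity": rng.normal(60, 10, 200),
})
# A nearly collinear column so the VIF screen has something to flag.
X["heat_index"] = (0.7 * X["ambient_temp"] + 0.3 * X["chamber_temp"]
                   + rng.normal(0, 0.1, 200))
y = 2.0 * X["chamber_temp"] - 0.5 * X["humidity"] + rng.normal(0, 1, 200)

# Step 1: compute VIF per parameter (with an intercept) and keep those
# below the cutoff.
Xc = sm.add_constant(X)
vif = pd.Series(
    [variance_inflation_factor(Xc.values, i) for i in range(1, Xc.shape[1])],
    index=X.columns,
)
keep = vif[vif < 10].index
print("VIF per parameter:\n", vif.round(2))

# Step 2: robust regression with a Huber M-estimator on the retained
# parameters; other norms (Hampel, Tukey bisquare) can be swapped in.
huber_fit = sm.RLM(y, sm.add_constant(X[keep]),
                   M=sm.robust.norms.HuberT()).fit()
print(huber_fit.summary())
```

Swapping `sm.robust.norms.HuberT()` for `sm.robust.norms.Hampel()` or `sm.robust.norms.TukeyBiweight()` reproduces the other M-estimators named in the abstract; the ranking models (ridge, LASSO, random forest, and so on) would be fitted separately on the screened parameters.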