Obstacles mapping based on 3-D perception for mobile robot navigation


Saved in:
Bibliographic Details
Main Author: Achmad, M. S. Hendriyawan
Format: Thesis
Language:English
Published: 2020
Subjects:
Online Access:http://umpir.ump.edu.my/id/eprint/30397/1/Obstacles%20mapping%20based%20on%203-D%20perception%20for%20mobile.pdf
id my-ump-ir.30397
record_format uketd_dc
spelling my-ump-ir.30397 2021-01-06T08:04:54Z Obstacles mapping based on 3-D perception for mobile robot navigation 2020-07 Achmad, M. S. Hendriyawan TK Electrical engineering. Electronics Nuclear engineering Many previous researchers have offered two-dimensional mapping for robot navigation. However, since two-dimensional mapping can only detect obstacles in a planar field, researchers are looking for better ways to discover obstacles in the surrounding three-dimensional space. The disadvantage of two-dimensional mapping for robot navigation is that it cannot detect obstacles at different elevations. This research offers several steps to build a three-dimensional map. The first step is to develop a mobile robot as a test-bed platform. The robot maps obstacles by measuring distance with a depth camera, obtaining obstacle geometry as a point cloud that gives the positions of landmarks in X, Y, and Z coordinates. The second step offers a method for accurately estimating robot translation and rotation using a sensor fusion technique that combines wheel odometry, visual odometry, and inertial odometry. Wheel odometry estimates the robot's position from wheel rotation speed without being affected by light, magnetism, or the gravity vector, but it suffers from error accumulation. Visual odometry estimates motion from visual images by combining Features from Accelerated Segment Test (FAST) and singular value decomposition (SVD); however, it depends strongly on lighting and object texture: the less light and texture, the higher the position estimation error. Inertial odometry uses Magnetic, Angular Rate, and Gravity (MARG) measurements, combining the three through the Madgwick method to produce accurate orientation estimates.
However, inertial odometry can only estimate rotational motion. This study offers a fusion method based on the Extended Kalman Filter (EKF) to produce a new estimate that eliminates the weaknesses of each individual estimate (wheel odometry, visual odometry, inertial odometry). The third step is the registration of the three-dimensional map based on the robot pose estimate and depth measurements. All these issues are examined from an estimation-theoretic perspective through mathematical analysis, and the theory is validated through experimental investigation. Position estimation tests using the EKF-based multi-sensor fusion technique, run for 120 seconds in a 10 m x 10 m area, show an average X-axis translation error of 7.6 cm, Y-axis translation error of 8.5 cm, roll rotation error of 0.678°, pitch rotation error of 0.491°, and yaw rotation error of 0.483°. The visual results show that the successfully reconstructed 3-D map has minimal fractures or overlaps and represents the real environment. 2020-07 Thesis http://umpir.ump.edu.my/id/eprint/30397/ http://umpir.ump.edu.my/id/eprint/30397/1/Obstacles%20mapping%20based%20on%203-D%20perception%20for%20mobile.pdf pdf en public phd doctoral Universiti Malaysia Pahang Faculty of Electrical and Electronics Engineering
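The first step described above — turning depth-camera distance measurements into a point cloud of X, Y, Z landmark positions — can be sketched with a standard pinhole back-projection. This is a minimal illustration, not the thesis's implementation; the intrinsics (`fx`, `fy`, `cx`, `cy`) are illustrative values, not the actual camera calibration.

```python
# Sketch: back-projecting a depth image to a point cloud with a pinhole
# camera model. Intrinsic values below are illustrative assumptions.
import numpy as np

def depth_to_pointcloud(depth, fx, fy, cx, cy):
    """Convert a depth image (metres) to an N x 3 array of X, Y, Z points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx           # X to the right of the optical axis
    y = (v - cy) * z / fy           # Y downward in image coordinates
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]       # drop pixels with no depth reading

# Toy 2x2 depth image with every pixel 1 m away
cloud = depth_to_pointcloud(np.ones((2, 2)), fx=525.0, fy=525.0, cx=0.5, cy=0.5)
```

Each valid pixel becomes one landmark in the robot's camera frame; registering successive clouds with the estimated pose yields the 3-D obstacle map.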
institution Universiti Malaysia Pahang Al-Sultan Abdullah
collection UMPSA Institutional Repository
language English
topic TK Electrical engineering
Electronics Nuclear engineering
spellingShingle TK Electrical engineering
Electronics Nuclear engineering
Achmad, M. S. Hendriyawan
Obstacles mapping based on 3-D perception for mobile robot navigation
description Many previous researchers have offered two-dimensional mapping for robot navigation. However, since two-dimensional mapping can only detect obstacles in a planar field, researchers are looking for better ways to discover obstacles in the surrounding three-dimensional space. The disadvantage of two-dimensional mapping for robot navigation is that it cannot detect obstacles at different elevations. This research offers several steps to build a three-dimensional map. The first step is to develop a mobile robot as a test-bed platform. The robot maps obstacles by measuring distance with a depth camera, obtaining obstacle geometry as a point cloud that gives the positions of landmarks in X, Y, and Z coordinates. The second step offers a method for accurately estimating robot translation and rotation using a sensor fusion technique that combines wheel odometry, visual odometry, and inertial odometry. Wheel odometry estimates the robot's position from wheel rotation speed without being affected by light, magnetism, or the gravity vector, but it suffers from error accumulation. Visual odometry estimates motion from visual images by combining Features from Accelerated Segment Test (FAST) and singular value decomposition (SVD); however, it depends strongly on lighting and object texture: the less light and texture, the higher the position estimation error. Inertial odometry uses Magnetic, Angular Rate, and Gravity (MARG) measurements, combining the three through the Madgwick method to produce accurate orientation estimates. However, inertial odometry can only estimate rotational motion.
This study offers a fusion method based on the Extended Kalman Filter (EKF) to produce a new estimate that eliminates the weaknesses of each individual estimate (wheel odometry, visual odometry, inertial odometry). The third step is the registration of the three-dimensional map based on the robot pose estimate and depth measurements. All these issues are examined from an estimation-theoretic perspective through mathematical analysis, and the theory is validated through experimental investigation. Position estimation tests using the EKF-based multi-sensor fusion technique, run for 120 seconds in a 10 m x 10 m area, show an average X-axis translation error of 7.6 cm, Y-axis translation error of 8.5 cm, roll rotation error of 0.678°, pitch rotation error of 0.491°, and yaw rotation error of 0.483°. The visual results show that the successfully reconstructed 3-D map has minimal fractures or overlaps and represents the real environment.
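The EKF fusion idea in the description — predicting pose from one odometry source and correcting it with another — can be illustrated with a minimal planar-pose filter. This is a generic sketch of the technique, not the thesis's actual filter: the state is [x, y, yaw], the prediction uses a wheel-odometry motion model, and the correction uses a direct pose measurement standing in for the visual/inertial estimates; all noise values are illustrative assumptions.

```python
# Minimal EKF sketch: wheel odometry drives the prediction step, an
# absolute pose measurement (a stand-in for visual/inertial odometry)
# drives the update step. Noise covariances are illustrative only.
import numpy as np

class PoseEKF:
    def __init__(self):
        self.x = np.zeros(3)          # state: [x, y, yaw]
        self.P = np.eye(3) * 0.1      # state covariance
        self.Q = np.eye(3) * 0.01     # process (wheel-odometry) noise
        self.R = np.eye(3) * 0.05     # measurement noise

    def predict(self, v, w, dt):
        """Propagate pose with wheel odometry (v: speed, w: yaw rate)."""
        x, y, th = self.x
        self.x = np.array([x + v * dt * np.cos(th),
                           y + v * dt * np.sin(th),
                           th + w * dt])
        # Jacobian of the nonlinear motion model w.r.t. the state
        F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                      [0.0, 1.0,  v * dt * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z):
        """Correct with a direct pose measurement z = [x, y, yaw]."""
        H = np.eye(3)                 # measurement observes the state directly
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z - H @ self.x)
        self.P = (np.eye(3) - K @ H) @ self.P

ekf = PoseEKF()
ekf.predict(v=1.0, w=0.0, dt=0.1)     # drive straight for 0.1 s at 1 m/s
ekf.update(np.array([0.1, 0.0, 0.0])) # measurement agrees with odometry
```

The Kalman gain weighs each source by its covariance, which is how the fused estimate suppresses wheel-odometry drift when the visual or inertial estimate is more certain, and vice versa.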
format Thesis
qualification_name Doctor of Philosophy (PhD.)
qualification_level Doctorate
author Achmad, M. S. Hendriyawan
author_facet Achmad, M. S. Hendriyawan
author_sort Achmad, M. S. Hendriyawan
title Obstacles mapping based on 3-D perception for mobile robot navigation
title_short Obstacles mapping based on 3-D perception for mobile robot navigation
title_full Obstacles mapping based on 3-D perception for mobile robot navigation
title_fullStr Obstacles mapping based on 3-D perception for mobile robot navigation
title_full_unstemmed Obstacles mapping based on 3-D perception for mobile robot navigation
title_sort obstacles mapping based on 3-d perception for mobile robot navigation
granting_institution Universiti Malaysia Pahang
granting_department Faculty of Electrical and Electronics Engineering
publishDate 2020
url http://umpir.ump.edu.my/id/eprint/30397/1/Obstacles%20mapping%20based%20on%203-D%20perception%20for%20mobile.pdf
_version_ 1783732145257185280