Development Of Eye Gaze Estimation System Using Two Cameras

Bibliographic Details
Main Author: Neoh, Yu Zun
Format: Thesis
Language: English
Published: 2017
Subjects:
Online Access:http://eprints.usm.my/39588/1/NEOH_YU_ZUN_24_Pages.pdf
Description
Summary: Eye gaze is the direction in which a person is looking, and it is well suited to serving as a natural Human Computer Interface (HCI). Current research uses infrared or LED illumination to locate the user's iris, achieving better gaze estimation accuracy than approaches that do not. However, infrared and LED light sources are intrusive to the human eye and may damage the cornea and the retina. This research proposes a non-intrusive approach to locating the user's iris: by using two remote cameras to capture images of the user, a more accurate gaze estimation system can be achieved. The system uses Haar cascade classifiers to detect the face and eye regions, and the Hough Circle Transform to locate the position of the iris, which is critical for the gaze estimation calculation. To track the user's eye and iris in real time, the system uses CAMShift (Continuously Adaptive Mean Shift). The eye and iris parameters are then collected and used to calculate the user's gaze direction. The left and right cameras achieve 70.00% and 74.67% accuracy respectively; when both cameras are used to estimate the gaze direction, 88.67% accuracy is achieved. This shows that using two cameras improves the accuracy of gaze estimation.
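The abstract describes a pipeline that is commonly built with OpenCV: Haar cascade detection of the face and eye regions followed by a Hough Circle Transform on the eye region to find the iris. The sketch below is a rough illustration of such a detection stage only; the cascade files, thresholds, and the helper function `locate_irises` are assumptions for illustration, not the thesis's actual implementation or parameters.

```python
import cv2
import numpy as np

# Illustrative detection stage: Haar cascades for face/eye regions,
# then a Hough Circle Transform on each eye region to locate the iris.
# The cascade XML files ship with OpenCV; all thresholds are guesses.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def locate_irises(frame):
    """Return (x, y, r) iris-circle candidates found in one camera frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    irises = []
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.1, 5):
        face_roi = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face_roi, 1.1, 5):
            eye_roi = face_roi[ey:ey + eh, ex:ex + ew]
            eye_roi = cv2.medianBlur(eye_roi, 5)  # reduce noise before Hough
            circles = cv2.HoughCircles(
                eye_roi, cv2.HOUGH_GRADIENT, dp=1, minDist=eh,
                param1=100, param2=20,
                minRadius=eh // 8, maxRadius=eh // 3)
            if circles is not None:
                cx, cy, r = circles[0][0]  # strongest circle = iris candidate
                irises.append((fx + ex + cx, fy + ey + cy, r))
    return irises
```

Per the abstract, the detected eye and iris regions are then tracked across frames with CAMShift (available in OpenCV as `cv2.CamShift`), and the iris positions observed by the left and right cameras are combined to produce the final gaze direction estimate.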