Human Spontaneous Emotion Detection System
Building a smart computerized system that can understand humans and respond appropriately in real time is a central goal of the human-computer interaction (HCI) field; HCI can hardly be considered advanced if people cannot interact with machines as naturally and comfortably as they do with other people...
Saved in:
Main Author: | Radin Monawir, Radin Puteri Hazimah |
---|---|
Format: | Thesis |
Language: | English |
Published: | 2018 |
Subjects: | T Technology (General) |
Online Access: | http://eprints.utem.edu.my/id/eprint/23332/1/Human%20Spontaneous%20Emotion%20Detection%20System.pdf http://eprints.utem.edu.my/id/eprint/23332/2/Human%20Spontaneous%20Emotion%20Detection%20System.pdf |
id | my-utem-ep.23332 |
---|---|
record_format | uketd_dc |
institution | Universiti Teknikal Malaysia Melaka |
collection | UTeM Repository |
language | English |
topic | T Technology (General) |
spellingShingle | T Technology (General); Radin Monawir, Radin Puteri Hazimah; Human Spontaneous Emotion Detection System |
description | Building a smart computerized system that can understand humans and respond appropriately in real time is a central goal of the human-computer interaction (HCI) field; HCI can hardly be considered advanced if people cannot interact with machines as naturally and comfortably as they do with other people. Despite several studies on emotion detection, most existing systems have been tested only in laboratory environments using posed (mimicked) emotions. Recognizing this lack of real-life, genuine emotional input, this research develops a system able to recognize human emotion from facial expressions. The aims of the study are threefold: to enhance the algorithm to detect spontaneous emotion, to develop a spontaneous facial expression database, and to verify the algorithm's performance. The system is implemented in MATLAB, using the Viola-Jones method for feature tracking and extraction followed by pattern matching for emotion classification, with the mouth region as the main feature for identifying the expressed emotion. For verification, both posed and spontaneous databases are used, obtained from the internet, from open-source databases, and from a newly developed (own) database. System performance is measured by emotion detection rate and average execution time. The results show that the system is better suited to recognizing spontaneous facial expressions (63.28% detection rate) than posed expressions (51.46%), and performs better on positive emotions (71.02%) than on negative emotions (48.09%). The overall detection rate of 61.20% is considered good, given that the system produces a result within 3 s and operates on spontaneous input data, which is known to be highly susceptible to noise. (An illustrative MATLAB sketch of this pipeline and its evaluation follows the record fields below.) |
format | Thesis |
qualification_name | Master of Philosophy (M.Phil.) |
qualification_level | Master's degree |
author | Radin Monawir, Radin Puteri Hazimah |
author_facet | Radin Monawir, Radin Puteri Hazimah |
author_sort | Radin Monawir, Radin Puteri Hazimah |
title | Human Spontaneous Emotion Detection System |
title_short | Human Spontaneous Emotion Detection System |
title_full | Human Spontaneous Emotion Detection System |
title_fullStr | Human Spontaneous Emotion Detection System |
title_full_unstemmed | Human Spontaneous Emotion Detection System |
title_sort | human spontaneous emotion detection system |
granting_institution | UTeM |
granting_department | Faculty of Manufacturing Engineering |
publishDate | 2018 |
url | http://eprints.utem.edu.my/id/eprint/23332/1/Human%20Spontaneous%20Emotion%20Detection%20System.pdf http://eprints.utem.edu.my/id/eprint/23332/2/Human%20Spontaneous%20Emotion%20Detection%20System.pdf |
_version_ | 1747834036971110400 |
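The abstract describes Viola-Jones feature tracking and extraction in MATLAB, with the mouth as the main feature and pattern matching for classification. The sketch below is a minimal illustration of that kind of pipeline using MATLAB's Computer Vision and Image Processing Toolboxes; it is not the thesis's actual code, and the file names, resize dimensions, and threshold are placeholder assumptions.

```matlab
% Hedged sketch: Viola-Jones mouth localization followed by a simple
% template-matching score. File names and parameters are illustrative only.
faceDetector  = vision.CascadeObjectDetector();                  % default frontal-face cascade
mouthDetector = vision.CascadeObjectDetector('Mouth', 'MergeThreshold', 16);

frame = imread('frame.jpg');                                     % hypothetical input frame
faceBoxes = step(faceDetector, frame);                           % each row: [x y w h]

if ~isempty(faceBoxes)
    % Restrict the mouth search to the lower half of the first detected face
    % to suppress false detections around the eyes and nose.
    fb = faceBoxes(1, :);
    lowerImg = imcrop(frame, [fb(1), fb(2) + fb(4)/2, fb(3), fb(4)/2]);
    mouthBoxes = step(mouthDetector, lowerImg);

    if ~isempty(mouthBoxes)
        mouthGray = imresize(rgb2gray(imcrop(lowerImg, mouthBoxes(1, :))), [32 64]);

        % Pattern matching against a stored expression template (placeholder file):
        % the peak of the normalized cross-correlation acts as a similarity score.
        template = imresize(rgb2gray(imread('smile_template.jpg')), [16 32]);
        c = normxcorr2(template, mouthGray);
        score = max(c(:));                                       % higher = closer match
    end
end
```

In practice the classification stage would compare the extracted mouth region against one template per target emotion and pick the best-scoring class; the single-template score above only illustrates the matching step.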
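The abstract also states that performance is reported as an emotion detection rate and an average execution time (e.g., 63.28% on spontaneous data, results within 3 s). A minimal sketch of how such metrics might be tallied over a labelled test set is shown below; `classifyEmotion`, the folder layout, and the label file are hypothetical placeholders, not the thesis's actual interface.

```matlab
% Hedged evaluation sketch: detection rate and average execution time over a
% labelled image set. classifyEmotion and the data layout are placeholders.
files  = dir(fullfile('spontaneous_db', '*.jpg'));                % hypothetical database folder
labels = readcell(fullfile('spontaneous_db', 'labels.csv'));      % one ground-truth label per image

nCorrect = 0;
times = zeros(numel(files), 1);
for k = 1:numel(files)
    frame = imread(fullfile(files(k).folder, files(k).name));
    tic;
    predicted = classifyEmotion(frame);                           % placeholder for the full pipeline
    times(k) = toc;
    nCorrect = nCorrect + strcmp(predicted, labels{k});
end

detectionRate = 100 * nCorrect / numel(files);                    % detection rate in percent
avgExecTime   = mean(times);                                      % average execution time in seconds
fprintf('Detection rate: %.2f%%, average time: %.2f s\n', detectionRate, avgExecTime);
```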