Academic Perspective Procedia publishes papers from Academic Platform symposia in three volumes per year. A DOI is assigned to every paper.
Publisher : Academic Perspective
Journal DOI : 10.33793/acperpro
Journal eISSN : 2667-5862
Nowadays, with computer usage widespread, innovative studies on human-computer interaction have gained momentum. One such innovation is the determination of human emotional states by computerized systems. This study aims to develop a method that, unlike much of the literature, does not rely on learning algorithms. The method proceeds in order through human face detection, detection of facial landmarks, construction of features from those landmarks, and various calculations on the features. The study was implemented on Raspberry Pi 3 hardware using the Python programming language. The proposed method was tested on an original dataset of 27 individuals (15 female, 12 male), and the accuracy of the system was measured as 96.3%.
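The abstract does not publish the feature formulas themselves, but the described pipeline (landmarks, then geometric features, then a threshold decision instead of a learned classifier) can be illustrated with a minimal sketch. The example below assumes the common 68-point facial landmark layout (points 48 and 54 as the mouth corners, 51 and 57 as the top and bottom of the lips); the feature definitions and thresholds are illustrative assumptions, not the values used in the paper.

```python
import math

def euclidean(p, q):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def smile_features(landmarks):
    """Compute two simple geometric features from (x, y) facial landmarks.

    Assumes the 68-point landmark indexing: 48/54 are the mouth
    corners, 51/57 the top/bottom of the lips (an assumption here,
    not confirmed by the paper's abstract).
    """
    left_corner = landmarks[48]
    right_corner = landmarks[54]
    top_lip = landmarks[51]
    bottom_lip = landmarks[57]

    # Feature 1: mouth aspect ratio (smiles tend to widen the mouth).
    width = euclidean(left_corner, right_corner)
    height = euclidean(top_lip, bottom_lip)
    ratio = width / height

    # Feature 2: corner lift, i.e. how far the mouth corners sit above
    # the vertical centre of the lips (image y grows downward, so a
    # smaller y coordinate means a higher point).
    centre_y = (top_lip[1] + bottom_lip[1]) / 2
    lift = centre_y - (left_corner[1] + right_corner[1]) / 2
    return ratio, lift

def is_happy(landmarks, ratio_threshold=3.0, lift_threshold=2.0):
    """Rule-based decision with no learning: both thresholds are
    illustrative placeholders, not the paper's calibrated values."""
    ratio, lift = smile_features(landmarks)
    return ratio > ratio_threshold and lift > lift_threshold
```

In a full system the landmark coordinates would come from a detector such as the one in dlib, run on each face region found in the camera frame; the decision itself stays a fixed geometric rule rather than a trained model, which matches the paper's stated goal.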
Cite
@article{acperproISITES2019ID3,
  author  = {Akça, Beyza Nur and Çubukçu, Burakhan and Yüzgeç, Uğur},
  title   = {Detection of Happiness Emotion on Images},
  journal = {Academic Perspective Procedia},
  eissn   = {2667-5862},
  volume  = {2},
  number  = {3},
  pages   = {324-333},
  year    = {2019},
  doi     = {10.33793/acperpro.02.03.3}
}