Real Time Pain Detection Using Facial Action Units in Telehealth System
During the COVID-19 pandemic, the World Health Organization recommended changing the way health care is delivered in order to reduce staff exposure to ill people, minimize the impact of patient surges on facilities, and preserve personal protective equipment. Several telehealth systems are in use, including live audio-video interaction and real-time telephone calls, typically with the patient using a computer, smartphone, or tablet. During these appointments, doctors need to know the patient's pain level in order to prescribe the correct medication and diagnose the disease reliably. In this paper, a real-time system for recognizing four pain levels from facial expressions during telehealth sessions is proposed. Pain is generally measured through verbal communication, normally the patient's self-report. However, self-reporting may not be a reliable way to measure pain if the patient is a child, or is unable to communicate because of a mental impairment or breathing problems. The proposed system consists of two methods for detecting pain from a patient's facial expressions. The AAM-based method detects the face and facial landmarks in each video frame using an Active Appearance Model (AAM); these landmarks are then used to compute facial features. The AU-based method uses Facial Action Units (AUs), which objectively describe facial muscle activations and are treated as regions of interest. A Support Vector Machine (SVM) classifier is used to detect the pain level. The labeled BioVid dataset is used to train and test the AAM-based method, while the UNBC-McMaster dataset is used for the AU-based method. The findings show that pain levels 1 and 4 can be detected from facial expressions very accurately, while levels 2 and 3 are much harder to distinguish because their AUs are similar for most patients.
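The classification stage described above (landmark- or AU-derived feature vectors mapped to one of four pain levels by an SVM) can be sketched as follows. This is an illustrative toy, not the paper's implementation: the features, SVM hyperparameters, and data here are synthetic stand-ins, whereas the actual system computes features from AAM landmarks (BioVid) or AU activations (UNBC-McMaster).

```python
# Illustrative sketch of the final classification stage: an RBF-kernel SVM
# mapping per-frame facial-feature vectors (e.g., landmark distances or AU
# intensities) to one of four pain levels. Data is synthetic; feature
# semantics and hyperparameters are assumptions for illustration only.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in: 100 frames per class, 10 features per frame
# (e.g., brow-lowering distance, eye-closure ratio, AU4/AU6/AU7 intensity, ...).
n_per_class, n_features = 100, 10
X_parts, y_parts = [], []
for level in range(1, 5):  # pain levels 1..4
    centre = rng.normal(loc=level, scale=0.2, size=n_features)
    X_parts.append(centre + rng.normal(scale=0.8, size=(n_per_class, n_features)))
    y_parts.append(np.full(n_per_class, level))
X, y = np.vstack(X_parts), np.concatenate(y_parts)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

clf = SVC(kernel="rbf", C=1.0, gamma="scale")  # SVM classifier, as in the paper
clf.fit(X_tr, y_tr)

accuracy = clf.score(X_te, y_te)       # fraction of test frames labeled correctly
pred = clf.predict(X_te[:1])           # predicted pain level for one frame
```

Note that with overlapping feature distributions the classifier confuses adjacent classes most often, which mirrors the paper's finding that levels 2 and 3 are hard to separate while the extreme levels 1 and 4 are not.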
Al-Eidan, M. R., Al-Khalifa, H. and Al-Salman, A. (2020) Deep-Learning-Based Models for Pain Recognition: A Systematic Review, Appl. Sci., 10, 5984.
Anwar, S., Milanova, M., Bigazzi, A., Bocchi, L. and Guazzini, A. (2016) Real time intention recognition, IECON 2016 - 42nd Annual Conference of the IEEE Industrial Electronics Society, Florence, pp. 1021-1024, DOI: 10.1109/IECON.2016.7794016.
Ayzenberg, Y., Hernandez, J. and Picard, R. (2012) FEEL: Frequent EDA and Event Logging - A Mobile Social Interaction Stress Monitoring System, CHI.
Cootes, T. and Taylor, C. (1994) Modeling object appearance using the grey-level surface. In E. Hancock, editor, 5th British Machine Vision Conference, pages 479–488, York, England. BMVA Press.
Craig, K. D. (1992) The facial expression of pain: Better than a thousand words?, APS Journal, vol. 1, no. 3, pp. 153–162.
Craig, K. D., Prkachin, K. M. and Grunau, R. E. (2011) The facial expression of pain, in Handbook of Pain Assessment, D. C. Turk and R. Melzack, Eds. Guilford Press.
Egede, J. O. et al. (2020) EMOPAIN Challenge 2020: Multimodal Pain Evaluation from Facial and Bodily Expressions, 2020 15th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2020), pp. 849-856, DOI: 10.1109/FG47880.2020.00078.
Ekman, P. and Friesen, W. (1978) Manual for Facial Action Coding System, Palo Alto: Consulting Psychologists Press.
Friesen, W., Ekman, P. and Hager, J. (2002) Facial action coding system: Research nexus. Salt Lake City, UT, USA: Network Research Information.
Bargshady, G., Zhou, X., Deo, R. C., Soar, J., Whittaker, F. and Wang, H. (2020) Enhanced deep learning algorithm development to detect pain intensity from facial expression images, Expert Systems with Applications, vol. 149, 113305, ISSN 0957-4174, DOI: 10.1016/j.eswa.2020.113305.
Hardy, Q. (2020) COVID-19 And Our Surprising Digital Transformation.
Hasan, M. K., Ahsan, G. M. T., Ahamed, S. I., Love, R. and Salim, R. (2016) Pain Level Detection from Facial Image Captured by Smartphone, J. Inf. Process., vol. 24, no. 4, pp. 598-608.
Isaac, M., Yaffe-Bellany, D. and Weise, K. (2020) Workplace vs. Coronavirus: 'No One Has a Playbook for This.'
Kächele, M., Thiam, P., Amirian, M., Werner, P., Walter, S., Schwenker, F. and Palm, G. (2015) Multimodal Data Fusion for Person-Independent, Continuous Estimation of Pain Intensity, in Engineering Applications of Neural Networks, Springer, pp. 275–285, DOI: 10.1007/978-3-319-23983-5_26.
Kaltwang, S., Rudovic, O. and Pantic, M. (2012) Continuous Pain Intensity Estimation from Facial Expressions, in Advances in Visual Computing, Springer, pp. 368–377.
Kong, Y., Posada-Quintero, H. F. and Chon, K. H. (2020) Pain Detection using a Smartphone in Real Time, 2020 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Montreal, QC, Canada, pp. 4526-4529, DOI: 10.1109/EMBC44109.2020.9176077.
Kunz, M. and Lautenbacher, S. (2014) The faces of pain: a cluster analysis of individual differences in facial activity patterns of pain, European Journal of Pain, vol. 18, no. 6, pp. 813–823.
Kunz, M., Meixner, D. and Lautenbacher, S. (2019) Facial muscle movements encoding pain - A systematic review, Pain, vol. 160, no. 3, pp. 535–549.
Lucey, P., Cohn, J. F., Prkachin, K. M., Solomon, P. E., Chew, S. and Matthews, I. (2012) Painful monitoring: Automatic pain monitoring using the UNBC-McMaster shoulder pain expression archive database, Image and Vision Computing, vol. 30, no. 3, pp. 197–205.
Mervosh, S., Lu, D. and Swales, V. (2020) See Which States and Cities Have Told Residents to Stay at Home.
Pikulkaew, K. and Chouvatut, V. (2021) Enhanced Pain Detection and Movement of Motion with Data Augmentation based on Deep Learning, 2021 13th International Conference on Knowledge and Smart Technology (KST), pp. 197-201, DOI: 10.1109/KST51265.2021.9415827.
Prkachin, K. and Solomon, P. (2008) The structure, reliability and validity of pain expression: Evidence from patients with shoulder pain, Pain, vol. 139, pp. 267-274.
Prkachin, K. M. and Craig, K. D. (1995) Expressing pain: The communication and interpretation of facial pain signals, Journal of Nonverbal Behavior, vol. 19, no. 4, pp. 191–205.
Prkachin, K. M., Solomon, P., Lucey, P., Cohn, J. F. and Matthews, I. (2011) Painful data: The UNBC-McMaster shoulder pain expression archive database, Proc. IEEE Int'l Conf. Automatic Face and Gesture Recognition.
Rosser, B. A. and Eccleston, C. (2012) Smartphone applications for pain management, J. Telemed. Telecare, vol. 17, no. 6, pp. 308-312.
Rudovic, O., Pavlovic, V. and Pantic, M. (2015) Context-Sensitive Dynamic Ordinal Regression for Intensity Estimation of Facial Action Units, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 37, no. 5, pp. 944–958.
Saha, P., Roy, S. D., Bhowmik, M. K. and Ghosh, A. K. (2016) An approach for automatic pain detection through facial expression, Procedia Comput. Sci., vol. 84, pp. 99-106.
Semwal, A. and Londhe, N. D. (2020) Automated Facial Expression based Pain Assessment Using Deep Convolutional Neural Network, 2020 3rd International Conference on Intelligent Sustainable Systems (ICISS), pp. 366-370, DOI: 10.1109/ICISS49785.2020.9316099.
Sikka, K., Dhall, A. and Bartlett, M. S. (2014) Classification and weakly supervised pain localization using multiple segment representation, Image and Vision Computing, vol. 32, no. 10, pp. 659–670.
The BioVid Heat Pain Database, [online] Available: http://www.iikt.ovgu.de/BioVid.html [Accessed 8 Feb. 2021].
Werner, P., Al-Hamadi, A. and Niese, R. (2012) Pain Recognition and Intensity Rating based on Comparative Learning, in IEEE International Conference on Image Processing (ICIP), pp. 2313–2316.
Xu, X. and de Sa, V. R. (2020) Exploring Multidimensional Measurements for Pain Evaluation using Facial Action Units.
Copyright (c) 2021 Dalya Abdullah Anwar
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.