Machine learning model of facial expression outperforms models using analgesia nociception index and vital signs to predict postoperative pain intensity: a pilot study

Insun Park, Jae Hyon Park, Jongjin Yoon, Hyo Seok Na, Ah Young Oh, Junghee Ryu, Bon Wook Koo

Research output: Contribution to journal › Article › peer-review


Abstract

Background: Few studies have evaluated the use of automated artificial intelligence (AI)-based pain recognition in postoperative settings or its correlation with pain intensity. In this study, various machine learning (ML)-based models using facial expressions, the analgesia nociception index (ANI), and vital signs were developed to predict postoperative pain intensity, and their performances for predicting severe postoperative pain were compared.

Methods: In total, 155 facial expressions from patients who underwent gastrectomy were recorded postoperatively; one blinded anesthesiologist simultaneously recorded the ANI score, vital signs, and patient self-assessed pain intensity based on the 11-point numerical rating scale (NRS). The ML models’ areas under the receiver operating characteristic curve (AUROCs) were calculated and compared using DeLong’s test.

Results: ML models were constructed using facial expressions, ANI, vital signs, and different combinations of the three datasets. The ML model constructed using facial expressions best predicted an NRS ≥ 7 (AUROC 0.93), followed by the ML model combining facial expressions and vital signs (AUROC 0.84) in the test set. ML models constructed using combined physiological signals (vital signs, ANI) performed better than models based on individual parameters for predicting NRS ≥ 7, although their AUROCs were inferior to those of the ML model based on facial expressions (all P < 0.05). Among these parameters, absolute and relative ANI had the worst AUROCs (0.69 and 0.68, respectively) for predicting NRS ≥ 7.

Conclusions: The ML model constructed using facial expressions best predicted severe postoperative pain (NRS ≥ 7) and outperformed models constructed from physiological signals.
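The abstract states that model AUROCs were compared with DeLong’s test. As a minimal illustrative sketch only (not the authors’ code), the Python below shows one common way to compute paired AUROCs and a DeLong two-sided p-value; the labels, scores, and variable names (facial_scores, ani_scores) are synthetic stand-ins, and the naive O(m·n) structural-component computation is an assumption made for brevity.

# Hedged sketch: comparing two correlated AUROCs with DeLong's test.
# Data and model scores below are synthetic; nothing here reproduces the study.
import numpy as np
from scipy.stats import norm
from sklearn.metrics import roc_auc_score

def delong_components(pos_scores, neg_scores):
    """Structural components V10 (one per positive) and V01 (one per negative)."""
    # psi = 1 if the positive case scores above the negative, 0.5 on ties, 0 otherwise
    diff = pos_scores[:, None] - neg_scores[None, :]
    psi = (diff > 0).astype(float) + 0.5 * (diff == 0)
    return psi.mean(axis=1), psi.mean(axis=0)

def delong_test(y_true, scores_a, scores_b):
    """AUROCs of two models on the same cases and a two-sided p-value for their difference."""
    pos_a, neg_a = scores_a[y_true == 1], scores_a[y_true == 0]
    pos_b, neg_b = scores_b[y_true == 1], scores_b[y_true == 0]
    v10_a, v01_a = delong_components(pos_a, neg_a)
    v10_b, v01_b = delong_components(pos_b, neg_b)
    auc_a, auc_b = v10_a.mean(), v10_b.mean()
    m, n = len(v10_a), len(v01_a)
    s10 = np.cov(np.vstack([v10_a, v10_b]))   # covariance over positive cases
    s01 = np.cov(np.vstack([v01_a, v01_b]))   # covariance over negative cases
    var = (s10[0, 0] + s10[1, 1] - 2 * s10[0, 1]) / m \
        + (s01[0, 0] + s01[1, 1] - 2 * s01[0, 1]) / n
    z = (auc_a - auc_b) / np.sqrt(var)
    return auc_a, auc_b, 2 * norm.sf(abs(z))

# Toy usage: 155 synthetic cases, outcome 1 = severe pain (NRS >= 7)
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=155)
facial_scores = y * 0.6 + rng.normal(0, 0.3, size=155)  # stand-in for a facial-expression model
ani_scores = y * 0.2 + rng.normal(0, 0.3, size=155)     # stand-in for an ANI-based model
auc_a, auc_b, p = delong_test(y, facial_scores, ani_scores)
print(f"AUROC A={auc_a:.2f} (check={roc_auc_score(y, facial_scores):.2f}), "
      f"AUROC B={auc_b:.2f}, DeLong p={p:.3f}")

In this sketch, per-case structural components are reused across both models so that the correlation induced by scoring the same patients is reflected in the variance of the AUROC difference, which is the point of DeLong’s paired comparison.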

Original language: English
Pages (from-to): 195-204
Number of pages: 10
Journal: Korean journal of anesthesiology
Volume: 77
Issue number: 2
DOIs
State: Published - 1 Apr 2024
Externally published: Yes

Bibliographical note

Publisher Copyright:
© The Korean Society of Anesthesiologists, 2024.

Keywords

  • Artificial intelligence
  • Facial expression
  • Machine learning
  • Pain measurement
  • Postoperative pain
  • Vital signs

