
Roger Daglius Dias, Thomas Doyle, J. Robertson, J. Thorgrimson, A. Gupta, B. Mormann, Charles Pozner, D. Smink, S. Lipsitz, David Musson, and S. Yule (2019)

DEVELOPMENT OF A MULTI-MODAL INTEGRATIVE SYSTEM TO ANALYZE COGNITIVE LOAD AND TEAM DYNAMICS DURING MEDICAL EVENT MANAGEMENT ON SIMULATED LONG-DURATION SPACE MISSIONS

NASA Human Research Program Investigators' Workshop.

INTRODUCTION: In-flight medical events pose a significant risk to crew health and mission success on future long-duration exploration missions (LDEMs). The cognitive load imposed by medical tasks will play a critical role in how well flight teams can manage medical events in space. Moreover, teamwork and coordinated crew interactions are essential behaviors for high-reliability, high-performance teams. The aim of this study is to develop countermeasures against these risks to support crew health and mission success. We have developed a multi-modal monitoring and analytics system that uses unobtrusive physiological sensors and computer vision to measure team cognitive load and team dynamics during simulated medical events on LDEMs.

METHODS: An immersive spacecraft simulator was built in a medical simulation center, allowing the investigation of team behaviors and performance during simulated in-flight medical event management. During planned simulations, each crew member wears two devices: a chest-strap heart rate sensor and a headband with brain-activity, accelerometer, and gyroscope sensors. These sensors provide the following physiological metrics: heart rate, heart rate variability parameters, 4-channel electroencephalography (EEG), head acceleration and angular velocity, blink rate, and jaw-clenching rate. A GoPro camera records high-definition video at 30 fps, and open-source, deep-learning-enabled computer vision software (OpenPose v1.3.0) is used to track the positions of multiple team members over time. We used the Interact software (Mangold v16.0) to synchronize video, physiological signals, and motion-tracking data.
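The heart rate variability parameters mentioned above are typically derived from the stream of interbeat (R-R) intervals reported by a chest-strap sensor. A minimal sketch of two standard time-domain metrics (SDNN and RMSSD) is shown below; the function name and the example interval values are illustrative, not taken from the study.

```python
import math

def hrv_metrics(rr_ms):
    """Compute mean heart rate, SDNN, and RMSSD from R-R intervals (ms).

    rr_ms: list of interbeat intervals in milliseconds (needs >= 2 values).
    """
    mean_rr = sum(rr_ms) / len(rr_ms)
    mean_hr = 60000.0 / mean_rr  # beats per minute
    # SDNN: standard deviation of all R-R intervals (sample variance).
    sdnn = math.sqrt(sum((x - mean_rr) ** 2 for x in rr_ms) / (len(rr_ms) - 1))
    # RMSSD: root mean square of successive interval differences.
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return {"mean_hr": mean_hr, "sdnn": sdnn, "rmssd": rmssd}

# Example with synthetic R-R intervals (ms):
metrics = hrv_metrics([812, 845, 790, 860, 830, 805])
```

Higher RMSSD generally reflects greater parasympathetic activity, which is one reason HRV is used as a proxy for cognitive load.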
The NASA-TLX app is used to assess self-reported cognitive workload, and independent raters assess crew non-technical skills using the Space Flight Resource Management Medical Tool (SFRM-MED), a behavioral marker system developed in prior work.

RESULTS: A multi-modal behavioral analytics dashboard was developed (Figure 1), enabling the visualization and time-series analysis of objective metrics of cognitive load and team dynamics. The raw data from all sensors were integrated into a SQL relational database, allowing the calculation of several metrics of cognitive load (e.g., interbeat (R-R) intervals, frontal-lobe electrical activity) and team dynamics (e.g., team proximity, total amount of movement, motion speed).
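The team-dynamics metrics named above (team proximity, motion speed) can be derived from per-frame 2D body positions of the kind OpenPose produces. The sketch below assumes each person is reduced to a keypoint centroid per frame and that video runs at the stated 30 fps; the function names and pixel-based units are our assumptions for illustration, not the study's implementation.

```python
import math

FPS = 30  # video recorded at 30 frames per second

def centroid(keypoints):
    """Mean (x, y) of one person's detected keypoints in a single frame."""
    xs = [x for x, _ in keypoints]
    ys = [y for _, y in keypoints]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def team_proximity(centroids):
    """Mean pairwise distance between crew members in one frame (pixels).

    Lower values indicate a more closely clustered team.
    """
    dists = [math.dist(a, b)
             for i, a in enumerate(centroids)
             for b in centroids[i + 1:]]
    return sum(dists) / len(dists)

def motion_speed(track):
    """Mean centroid displacement per second for one member (pixels/s)."""
    steps = [math.dist(a, b) for a, b in zip(track, track[1:])]
    return sum(steps) / len(steps) * FPS
```

In practice these pixel-space values would be computed per frame, stored alongside the physiological signals in the relational database, and analyzed as synchronized time series.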
