Resources


Database Credentialed Access

Symile-MIMIC: a multimodal clinical dataset of chest X-rays, electrocardiograms, and blood labs from MIMIC-IV

Adriel Saporta, Aahlad Manas Puli, Mark Goldstein, et al.

A multimodal clinical dataset consisting of CXRs, ECGs, and blood labs, designed to evaluate Symile, a simple contrastive loss that accommodates any number of modalities and allows any model to produce representations for each modality.

database cxr ecg chest x-ray mimic contrastive learning model multimodal electrocardiogram

Published: Jan. 28, 2025. Version: 1.0.0
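
A minimal sketch of the idea behind the Symile-style objective described above: a contrastive loss that scores aligned triples of CXR, ECG, and blood-lab embeddings with a multilinear inner product instead of a pairwise dot product. The function name, negative-sampling scheme, and temperature value are illustrative assumptions, not the released Symile implementation.

import torch
import torch.nn.functional as F

def triple_contrastive_loss(x_cxr, x_ecg, x_labs, temperature=0.07):
    # x_cxr, x_ecg, x_labs: (batch, dim) embeddings from three separate
    # encoders, assumed L2-normalized and of equal dimension.
    #
    # A candidate triple is scored with a multilinear inner product,
    # sum_d cxr_d * ecg_d * labs_d, which reduces to the usual dot
    # product if one modality is held out.
    xe = x_cxr * x_ecg                            # elementwise product, (batch, dim)
    xb = x_cxr * x_labs
    eb = x_ecg * x_labs
    # For anchor i, the positive is the aligned triple (i, i, i); negatives
    # swap in other rows of one remaining modality (a simplified scheme).
    logits_labs = xe @ x_labs.t() / temperature   # vary the labs row
    logits_ecg  = xb @ x_ecg.t() / temperature    # vary the ECG row
    logits_cxr  = eb @ x_cxr.t() / temperature    # vary the CXR row
    targets = torch.arange(x_cxr.size(0), device=x_cxr.device)
    return (F.cross_entropy(logits_labs, targets)
            + F.cross_entropy(logits_ecg, targets)
            + F.cross_entropy(logits_cxr, targets)) / 3.0

Averaging over which modality is swapped keeps the objective symmetric across the three inputs; with additional modalities the same multilinear score extends by multiplying in further embeddings.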


Database Open Access

ScientISST MOVE: Annotated Wearable Multimodal Biosignals recorded during Everyday Life Activities in Naturalistic Environments

João Areias Saraiva, Mariana Abreu, Ana Sofia Carmo, et al.

Multimodal (ECG, EMG, EDA, PPG, TEMP, ACC) biosignal dataset of everyday activities. Created with 3 wearable devices based on ScientISST Sense and Empatica E4.

greet lift uncontrolled environments run jump gesticulate walk wearable multimodal

Published: March 25, 2024. Version: 1.0.1


Database Open Access

A multi-camera and multimodal dataset for posture and gait analysis

Manuel Palermo, João Mendes Lopes, João André, et al.

Multimodal dataset with 166k samples for vision-based applications with a smart walker used in gait and posture rehabilitation. The walker is equipped with a pair of depth cameras whose data are synchronized with an inertial MoCap system worn by the participant.

computer vision inertial motion capture smart walker gait and posture analysis depth rehabilitation human pose estimation deep learning

Published: Nov. 1, 2021. Version: 1.0.0


Database Open Access

Multimodal Synchronized Motion Capture, Force Plate, and Radar Dataset of the One-Legged Stand Test for Fall-Risk Assessment

Daniel Copeland, Evan Linton, Xiang Zhang, et al.

A multimodal dataset of 32 participants performing the One-Legged Stand Test (OLST), with synchronized motion capture, force plate, and 24 GHz radar data. Each of the 1,241 trials is annotated with foot-lift and foot-touchdown events and stability phases.

motion capture human pose estimation human movement fall risk assessment non-contact sensing one-legged stand test force plate analysis digital biomarkers human balance testing geriatrics radar signal processing postural control multimodal sensing biomechanics aging and mobility

Published: Jan. 25, 2026. Version: 1.0
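
As an illustration of how per-trial event labels like those above might be used, the sketch below computes a single-leg stance duration for one trial as a simple balance metric. The column names ('event', 'time_s') and label strings are assumptions for illustration only, not the dataset's documented schema.

import pandas as pd

def single_leg_stance_duration(trial: pd.DataFrame) -> float:
    # Return stance time in seconds for one OLST trial.
    # Assumes `trial` has an 'event' column containing 'foot_lift' and
    # 'foot_touchdown' rows and a 'time_s' column of timestamps.
    lift = trial.loc[trial["event"] == "foot_lift", "time_s"].iloc[0]
    touchdown = trial.loc[trial["event"] == "foot_touchdown", "time_s"].iloc[0]
    return float(touchdown - lift)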


Database Contributor Review

A multimodal dental dataset facilitating machine learning research and clinic services

Wenjing Liu, Yunyou Huang, Suqin Tang

A new dental dataset covering 169 patients, three commonly used dental imaging modalities, and images of a variety of oral health conditions.

Published: Oct. 11, 2024. Version: 1.1.0


Database Restricted Access

Multimodal Physiological Indices During Surgery Under Anesthesia

Sandya Subramanian, Bryan Tseng, Riccardo Barbieri, et al.

Multimodal physiological indices collected during surgery while patients were under anesthesia.

anesthesia nociception

Published: Aug. 23, 2024. Version: 1.0


Database Restricted Access

MIMIC-Eye: Integrating MIMIC Datasets with REFLACX and Eye Gaze for Multimodal Deep Learning Applications

Chihcheng Hsieh, Chun Ouyang, Jacinto C Nascimento, et al.

MIMIC-Eye: Integrating MIMIC Datasets with REFLACX and Eye Gaze for Multimodal Deep Learning Applications

Published: March 23, 2023. Version: 1.0.0