Ph.D. Research Proposal: Mudi Zhang

Wednesday, April 29, 2026
3:00 p.m.
2211 Kim Engineering Building

ANNOUNCEMENT: Ph.D. Research Proposal Exam

 

Name: Mudi Zhang

 

Committee:

Professor Min Wu (Chair)

Professor Sahil Shah

Professor Ang Li

 

Date/time: 04/29/2026 3:00 pm to 5:00 pm

 

Location: 2211 Kim Engineering Building

 

Title: Multi-modal Physiological Signal Learning for Smart Health

 

Abstract: Physiological signals are widely adopted for continuous health monitoring in clinical and daily-life settings. Clinical-grade physiological signals, such as electrocardiography (ECG) and polysomnography (PSG), offer reliable and accurate health monitoring, but their complexity and resource demands limit their practicality for daily use. In contrast, wearable signals, such as photoplethysmography (PPG), enable continuous daily-life monitoring, but they often suffer from reduced reliability and limited clinical adoption. Motivated by the complementary strengths of clinical and wearable physiological signals, this proposal investigates two promising directions that leverage inherent relationships between clinical and wearable physiological signals to enable effective multi-modal physiological learning for smart health and to enhance the utility of wearable signals in real-world application scenarios.


The first direction aims to infer clinical-grade physiological modalities from wearable-grade modalities using deep generative models. In the first part of the proposal, we propose Never-Miss-A-Beat, a novel PPG-to-ECG inference framework that strategically combines transfer learning and causal representation learning. A causality-incorporated conditional variational autoencoder, named Causal-CVAE, serves as the backbone model to infer ECG signals from PPG signals. Experimental results show that the proposed framework yields better ECG inference performance than baseline models, suggesting its potential for reliable and scalable precision cardiac monitoring.
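The announcement does not include implementation details, but the conditioning mechanism at the heart of a conditional-VAE approach like Causal-CVAE can be illustrated with a minimal sketch. Everything below — the window sizes, the linear "networks", and the toy data — is an illustrative assumption, not the proposed architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for paired windows of wearable (PPG) and clinical (ECG) signals.
ppg = rng.standard_normal((8, 32))   # batch of 8 PPG windows, 32 samples each
ecg = rng.standard_normal((8, 32))   # paired ECG windows

latent_dim = 4
W_mu  = rng.standard_normal((64, latent_dim)) * 0.1      # encoder mean head
W_log = rng.standard_normal((64, latent_dim)) * 0.1      # encoder log-variance head
W_dec = rng.standard_normal((latent_dim + 32, 32)) * 0.1  # decoder weights

def cvae_loss(ppg, ecg):
    """Negative ELBO of a conditional VAE: the posterior is inferred from the
    (ECG, PPG) pair, and the decoder reconstructs ECG with the PPG window as
    the conditioning input — so at test time, sampling z and decoding with a
    new PPG window yields an inferred ECG."""
    h = np.concatenate([ecg, ppg], axis=1)                # encoder sees both modalities
    mu, log_var = h @ W_mu, h @ W_log
    z = mu + np.exp(0.5 * log_var) * rng.standard_normal(mu.shape)  # reparameterize
    recon = np.concatenate([z, ppg], axis=1) @ W_dec      # decode ECG conditioned on PPG
    rec_loss = np.mean((recon - ecg) ** 2)                # reconstruction term
    kl = -0.5 * np.mean(1 + log_var - mu**2 - np.exp(log_var))  # KL to N(0, I)
    return rec_loss + kl

loss = cvae_loss(ppg, ecg)
print(loss)
```

In a real model the linear maps would be deep sequence encoders/decoders, and the causal-representation component would further structure the latent space; this sketch only shows where the PPG conditioning enters.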

The second direction focuses on achieving multi-modal coordination through cross-modal alignment. In the second part of the proposal, we propose PhysioGMC, a Generalizable Multi-modal Coordination framework for Physiological signals that explicitly accounts for their strong inter-subject variability. PhysioGMC incorporates both clinical and wearable modalities into the training process to improve cross-subject performance when only a single wearable modality is available at deployment. It introduces a cross-modal contrastive learning module consisting of two contrastive losses to jointly learn label-relevant, subject-agnostic representations across modalities. Specifically, the self-supervised contrastive loss aligns latent features across modalities, while the supervised contrastive loss encourages learning label-discriminative features that are invariant to subject identity. Experiments on various health monitoring tasks demonstrate that PhysioGMC consistently outperforms existing methods, achieving superior cross-subject performance using only a single wearable modality, PPG, at test time.
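The two losses described above follow standard contrastive formulations, which can be sketched briefly. The embeddings, dimensions, and labels below are toy assumptions, not PhysioGMC's actual components:

```python
import numpy as np

rng = np.random.default_rng(0)

def cross_modal_nce(a, b, tau=0.1):
    """Self-supervised cross-modal loss (InfoNCE-style): the i-th embedding
    of modality A should match the i-th embedding of modality B against all
    other items in the batch."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    logits = a @ b.T / tau
    m = logits.max(axis=1, keepdims=True)                 # numerical stability
    log_prob = logits - (m + np.log(np.exp(logits - m).sum(axis=1, keepdims=True)))
    return -np.mean(np.diag(log_prob))                    # paired items are positives

def sup_con(z, labels, tau=0.1):
    """Supervised contrastive loss: pull together embeddings that share a
    label (regardless of which subject they came from), push apart the rest."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    logits = z @ z.T / tau
    np.fill_diagonal(logits, -np.inf)                     # exclude self-pairs
    m = logits.max(axis=1, keepdims=True)
    log_prob = logits - (m + np.log(np.exp(logits - m).sum(axis=1, keepdims=True)))
    pos = (labels[:, None] == labels[None, :]) & ~np.eye(len(z), dtype=bool)
    return -np.mean(log_prob[pos])                        # average over positive pairs

# Toy latent embeddings for 6 paired PPG/ECG windows with 2 task labels.
ppg_z = rng.standard_normal((6, 8))
ecg_z = rng.standard_normal((6, 8))
labels = np.array([0, 0, 0, 1, 1, 1])

total = cross_modal_nce(ppg_z, ecg_z) + sup_con(ppg_z, labels)
print(total)
```

Minimizing the first term aligns the modalities' latent spaces, so a PPG-only encoder at test time inherits structure learned jointly with the clinical modality; the second term makes that structure label-discriminative rather than subject-specific.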

Building upon these research works, we plan to expand our research efforts in the domain of multi-modal physiological learning to complete the dissertation. Ongoing research directions include: 1) a personalized diffusion model for ECG generation conditioned on PPG; 2) a framework that explicitly factorizes the latent representations of ECG signals into clinically and semantically meaningful components under the guidance of paired clinical reports; 3) a comprehensive and clinically grounded evaluation framework for assessing the quality of ECG signals generated conditioned on PPG; 4) a multi-modal coordination framework for sleep staging that improves the sleep staging performance of wearable signals through coordination with PSG signals.

Audience: Graduate Faculty


 
