Generalized Generative Deep Learning Models for Biosignal Synthesis and Modality Transfer

Fernando, Tharindu, Denman, Simon, Sridharan, Sridha, & Fookes, Clinton (2023) Generalized Generative Deep Learning Models for Biosignal Synthesis and Modality Transfer. IEEE Journal of Biomedical and Health Informatics, 27(2), pp. 968-979.


Description

Generative Adversarial Networks (GANs) are a revolutionary innovation in machine learning that enables the generation of artificial data. Artificial data synthesis is especially valuable in the medical field, where it is difficult to collect and annotate real data due to privacy issues, limited access to experts, and cost. While adversarial training has led to significant breakthroughs in computer vision, biomedical research has not yet fully exploited the capabilities of generative models for data generation, or for more complex tasks such as biosignal modality transfer. We present a broad analysis of adversarial learning on biosignal data. Our study is the first in the machine learning community to focus on synthesizing 1D biosignal data using adversarial models. We consider three types of deep generative adversarial networks: a classical GAN, an adversarial AE, and a modality transfer GAN, individually designed for biosignal synthesis and modality transfer purposes. We evaluate these methods on multiple datasets covering different biosignal modalities, including phonocardiogram (PCG), electrocardiogram (ECG), vectorcardiogram, and 12-lead electrocardiogram. We follow subject-independent evaluation protocols, evaluating the proposed models' performance on completely unseen data to demonstrate generalizability. We achieve superior results in generating biosignals, specifically in conditional generation, by synthesizing realistic samples while preserving domain-relevant characteristics. We also demonstrate insightful results in biosignal modality transfer that can generate expanded representations from fewer input leads, ultimately making the clinical monitoring setting more convenient for the patient. Furthermore, our generated longer-duration ECGs maintain clear ECG rhythmic regions, as verified using ad-hoc segmentation models.
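The models in the paper are full deep networks; as a rough illustration of the core adversarial-training idea applied to 1-D signals, the sketch below trains a single-layer generator against a logistic discriminator with hand-derived gradients. All names, sizes, and the noisy-sine "beats" standing in for real biosignal windows are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
SIG_LEN, LATENT = 64, 8  # toy signal length and latent dimension (illustrative)

def real_batch(n):
    # Stand-in for real 1-D biosignal windows: phase-shifted noisy sine "beats".
    t = np.linspace(0, 2 * np.pi, SIG_LEN)
    phase = rng.uniform(0, 2 * np.pi, (n, 1))
    return np.sin(t + phase) + 0.05 * rng.standard_normal((n, SIG_LEN))

# Single-layer generator (tanh output) and logistic discriminator.
Wg = 0.1 * rng.standard_normal((SIG_LEN, LATENT))
bg = np.zeros(SIG_LEN)
wd = 0.1 * rng.standard_normal(SIG_LEN)
bd = 0.0

sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
generate = lambda z: np.tanh(Wg @ z + bg)
discriminate = lambda x: sigmoid(wd @ x + bd)

lr = 0.05
for step in range(500):
    # Discriminator update: push p(real) -> 1 and p(fake) -> 0.
    x_real = real_batch(1)[0]
    x_fake = generate(rng.standard_normal(LATENT))
    for x, y in ((x_real, 1.0), (x_fake, 0.0)):
        p = discriminate(x)
        g = p - y                     # d(binary cross-entropy)/d(logit)
        wd -= lr * g * x
        bd -= lr * g
    # Generator update (non-saturating loss): push p(fake) -> 1.
    z = rng.standard_normal(LATENT)
    x_fake = generate(z)
    p = discriminate(x_fake)
    dx = (p - 1.0) * wd * (1.0 - x_fake ** 2)  # backprop through D, then tanh
    Wg -= lr * np.outer(dx, z)
    bg -= lr * dx

sample = generate(rng.standard_normal(LATENT))
print(sample.shape)
```

The alternating updates are the essence of adversarial training; a conditional variant, as used for the paper's conditional generation and modality-transfer results, would additionally feed a class label or a source-modality signal into both networks.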

Impact and interest:

2 citations in Scopus


ID Code: 236930
Item Type: Contribution to Journal (Journal Article)
Refereed: Yes
ORCID iD:
Fernando, Tharindu: orcid.org/0000-0002-6935-1816
Denman, Simon: orcid.org/0000-0002-0983-5480
Sridharan, Sridha: orcid.org/0000-0003-4316-9001
Fookes, Clinton: orcid.org/0000-0002-8515-6324
Measurements or Duration: 12 pages
DOI: 10.1109/JBHI.2022.3223777
ISSN: 2168-2194
Pure ID: 118699475
Divisions: Current > Research Centres > Centre for Data Science
Current > Research Centres > Centre for Biomedical Technologies
Current > QUT Faculties and Divisions > Faculty of Science
Current > QUT Faculties and Divisions > Faculty of Engineering
Current > Schools > School of Electrical Engineering & Robotics
Copyright Owner: 2022 IEEE
Copyright Statement: This work is covered by copyright. Unless the document is being made available under a Creative Commons Licence, you must assume that re-use is limited to personal use and that permission from the copyright owner must be obtained for all other uses. If the document is available under a Creative Commons License (or other specified license) then refer to the Licence for details of permitted re-use. It is a condition of access that users recognise and abide by the legal requirements associated with these rights. If you believe that this work infringes copyright please provide details by email to qut.copyright@qut.edu.au
Deposited On: 14 Dec 2022 04:59
Last Modified: 20 Jul 2024 15:36