Multimodal integration of neuroimaging and genetic data for the diagnosis of mood disorders based on computer vision models.

URL:

https://doi.org/10.2139/ssrn.4635313

DOI:

10.2139/ssrn.4635313

Authors and Affiliations:

Seungeun Lee, Department of Mathematics, Korea University, Anamro 145, Seongbuk-gu, Seoul, 02841, Republic of Korea.

Yongwon Cho, Department of Computer Science and Engineering, Soonchunhyang University, Republic of Korea.

Yuyoung Ji, Division of Life Science, Korea University, Anamro 145, Seongbuk-gu, Seoul, 02841, Republic of Korea.

Minhyek Jeon, Division of Biotechnology, Korea University, Anamro 145, Seongbuk-gu, Seoul, 02841, Republic of Korea; Computational Biology Department, Carnegie Mellon University, Pittsburgh, PA, 15213, United States.

Aram Kim, Department of Biomedical Sciences, Korea University College of Medicine, Seoul, 02841, Republic of Korea.

Byung-Joo Ham, Department of Psychiatry, Korea University Anam Hospital, 73, Goryeodae-ro, Seongbuk-gu, Seoul, 02841, Republic of Korea. Electronic address: byungjoo.ham@gmail.com.

Yoonjung Yoonie Joo, Department of Digital Health, Samsung Advanced Institute for Health Sciences & Technology (SAIHST), Sungkyunkwan University, Samsung Medical Center, 115 Irwon-Ro, Gangnam-Gu, Seoul, 06355, Republic of Korea. Electronic address: helloyjjoo@gmail.com.

Abstract:

Mood disorders, particularly major depressive disorder (MDD) and bipolar disorder (BD), are often underdiagnosed, leading to substantial morbidity. Harnessing the potential of emerging methodologies, we propose a novel multimodal fusion approach that integrates patient-oriented brain structural magnetic resonance imaging (sMRI) scans with DNA whole-exome sequencing (WES) data. Multimodal data fusion aims to improve the detection of mood disorders by employing established deep-learning architectures for computer vision together with machine-learning strategies. We analyzed brain imaging genetic data from 321 East Asian individuals, including 147 patients with MDD, 78 patients with BD, and 96 healthy controls. We developed and evaluated six fusion models by pairing common image-classification architectures (Vision Transformer (ViT), Inception-V3, and ResNet50) with machine-learning techniques suited to high-dimensional data analysis (XGBoost and LightGBM). Model validation was performed using 10-fold cross-validation. Our ViT ⊕ XGBoost fusion model, combining MRI scans, genomic single nucleotide polymorphism (SNP) data, and an unweighted polygenic risk score (PRS), outperformed the baseline models, achieving an incremental area under the curve (AUC) of 0.2162 (+32.03%) and 0.0675 (+8.19%) and incremental accuracy of 0.1455 (+25.14%) and 0.0849 (+13.28%) over the SNP-only and image-only baselines, respectively. Our findings highlight the opportunity to refine mood disorder diagnostics by demonstrating the transformative potential of integrating diverse, yet complementary, data modalities and methodologies.
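The abstract describes a feature-level fusion scheme: imaging features (here, from a ViT) are combined with SNP and PRS features, a gradient-boosting classifier is trained on the fused representation, and 10-fold cross-validated AUC is compared against a SNP-only baseline to quantify the incremental gain. The sketch below illustrates that evaluation pattern only; it is not the authors' pipeline. All arrays are synthetic stand-ins (the 128-dimensional "embeddings" are not real ViT outputs, the SNP matrix and PRS column are simulated, and scikit-learn's GradientBoostingClassifier stands in for XGBoost), and a binary patient-vs-control label is used for simplicity rather than the paper's three-class MDD/BD/control setting.

```python
# Hedged sketch: feature-level fusion of image embeddings with SNP/PRS data,
# scored by 10-fold cross-validated AUC, mirroring the paper's evaluation idea.
# Every array below is synthetic; GradientBoostingClassifier is a stand-in
# for XGBoost/LightGBM, and the 128-dim features are a stand-in for ViT output.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n = 321                       # cohort size reported in the abstract
y = rng.integers(0, 2, n)     # binary patient/control label for simplicity

# Synthetic modalities: class signal is split across both, so fusing helps.
img_emb = rng.normal(size=(n, 128)) + 0.4 * y[:, None]  # "ViT" embeddings
snp = rng.normal(size=(n, 64)) + 0.2 * y[:, None]       # SNP dosages
prs = snp.mean(axis=1, keepdims=True)                   # unweighted-PRS proxy

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

def mean_auc(features: np.ndarray) -> float:
    """10-fold cross-validated ROC AUC of a boosted-tree classifier."""
    clf = GradientBoostingClassifier(random_state=0)
    return cross_val_score(clf, features, y, cv=cv, scoring="roc_auc").mean()

auc_snp = mean_auc(np.hstack([snp, prs]))            # SNP-only baseline
auc_fused = mean_auc(np.hstack([img_emb, snp, prs])) # fused model
print(f"SNP-only AUC: {auc_snp:.3f}  fused AUC: {auc_fused:.3f}  "
      f"incremental AUC: {auc_fused - auc_snp:+.3f}")
```

The "incremental AUC" printed at the end corresponds to the kind of modality-ablation comparison the abstract reports (fused model versus SNP-only and image-only baselines); with real data one would substitute genuine sMRI-derived embeddings and WES-derived genotype features.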
