BioBridge: Bridging Biomedical Foundation Models via Knowledge Graphs

27 citations · #777 of 2,297 papers in ICLR 2024 · 6 top authors · 4 data points

Abstract

Foundation models (FMs) learn from large volumes of unlabeled data to demonstrate superior performance across a wide range of tasks. However, FMs developed for biomedical domains have largely remained unimodal, i.e., independently trained and used for tasks on protein sequences alone, small molecule structures alone, or clinical data alone. To overcome this limitation, we present BioBridge, a parameter-efficient learning framework that bridges independently trained unimodal FMs to establish multimodal behavior. BioBridge achieves this by utilizing knowledge graphs (KGs) to learn transformations between one unimodal FM and another without fine-tuning any of the underlying unimodal FMs. Our results demonstrate that BioBridge can beat the best baseline KG embedding methods (on average by ~76.3%) in cross-modal retrieval tasks. We also find that BioBridge demonstrates out-of-domain generalization by extrapolating to unseen modalities or relations. Additionally, we show that BioBridge can serve as a general-purpose retriever that aids biomedical multimodal question answering and enhances the guided generation of novel drugs. Code is at https://github.com/RyanWangZf/BioBridge.
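To make the bridging idea concrete, here is a minimal sketch, not the authors' exact architecture: a small trainable module maps embeddings from one frozen unimodal FM into the embedding space of another, conditioned on the KG relation, and is trained contrastively on KG triples while both base FMs stay frozen. The module design, embedding dimensions, relation vocabulary, and loss choice below are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BridgeModule(nn.Module):
    """Hypothetical bridge: projects embeddings from a frozen source FM into a
    frozen target FM's space, conditioned on the KG relation type."""

    def __init__(self, src_dim: int, tgt_dim: int, num_relations: int, hidden: int = 512):
        super().__init__()
        self.rel_emb = nn.Embedding(num_relations, hidden)
        self.proj = nn.Sequential(
            nn.Linear(src_dim + hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, tgt_dim),
        )

    def forward(self, src_emb: torch.Tensor, rel_ids: torch.Tensor) -> torch.Tensor:
        # Concatenate the source embedding with a learned relation embedding,
        # then project into the target modality's embedding space.
        h = torch.cat([src_emb, self.rel_emb(rel_ids)], dim=-1)
        return self.proj(h)


def info_nce_loss(pred: torch.Tensor, tgt: torch.Tensor, temperature: float = 0.07) -> torch.Tensor:
    """In-batch contrastive loss over KG triples: each transformed head
    embedding should match its own tail embedding, not the other tails."""
    pred = F.normalize(pred, dim=-1)
    tgt = F.normalize(tgt, dim=-1)
    logits = pred @ tgt.t() / temperature
    labels = torch.arange(pred.size(0), device=pred.device)
    return F.cross_entropy(logits, labels)


# Usage sketch: only the bridge is trained; the unimodal FMs stay frozen.
# In practice the embeddings would come from frozen biomedical encoders.
bridge = BridgeModule(src_dim=1280, tgt_dim=512, num_relations=8)
protein_emb = torch.randn(16, 1280)          # placeholder for frozen protein-FM embeddings
drug_emb = torch.randn(16, 512)              # placeholder for frozen molecule-FM embeddings
rel_ids = torch.zeros(16, dtype=torch.long)  # e.g., a "protein-targets-drug" relation id
loss = info_nce_loss(bridge(protein_emb, rel_ids), drug_emb)
loss.backward()
```

At inference, the trained bridge lets embeddings from one modality be queried directly against an index of another modality's embeddings, which is how cross-modal retrieval would work under these assumptions.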

Citation History

Jan 28, 2026: 0 citations
Feb 13, 2026: 27 citations