Abstract
Motivation: Graph representation learning is a fundamental problem in data science with applications to the integrative analysis of biological networks. Previous work in this domain was mostly limited to shallow representation techniques. A recent deep representation technique, BIONIC, has achieved state-of-the-art results in a variety of tasks but relies on arbitrarily defined components.

Results: Here, we present BERTwalk, an unsupervised learning scheme that combines the BERT masked language model with a network propagation regularization for graph representation learning. The transformation from networks to texts allows our method to naturally integrate different networks and provide features that capture not only node- and edge-level but also pathway-level properties. We show that our BERTwalk model outperforms BIONIC, as well as four other recent methods, on two comprehensive benchmarks in yeast and human. We further show that our model can be used to infer functional pathways and their effects.

Contact: [email protected]
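The abstract describes a "networks to texts" transformation that lets a BERT-style masked language model operate on graphs. A minimal sketch of that general idea is shown below: random walks over a network are treated as sentences of node tokens, and tokens are randomly masked as in the masked-language-model objective. The walk length, number of walks, masking rate, and the karate-club example graph are illustrative assumptions, not the actual BERTwalk pipeline or its parameters.

```python
# Illustrative sketch (assumed parameters): graphs -> random-walk "sentences" -> masking,
# mimicking the network-to-text step described in the abstract.
import random
import networkx as nx

def random_walks(graph, walks_per_node=5, walk_length=10, seed=0):
    """Generate uniform random walks; each walk is a 'sentence' of node tokens."""
    rng = random.Random(seed)
    walks = []
    for start in graph.nodes:
        for _ in range(walks_per_node):
            walk = [start]
            for _ in range(walk_length - 1):
                neighbors = list(graph.neighbors(walk[-1]))
                if not neighbors:
                    break
                walk.append(rng.choice(neighbors))
            walks.append([str(node) for node in walk])
    return walks

def mask_walk(walk, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Randomly mask tokens, as in the BERT masked-language-model objective."""
    rng = random.Random(seed)
    masked, labels = [], []
    for token in walk:
        if rng.random() < mask_rate:
            masked.append(mask_token)
            labels.append(token)   # the model is trained to recover this token
        else:
            masked.append(token)
            labels.append(None)    # ignored by the loss
    return masked, labels

if __name__ == "__main__":
    g = nx.karate_club_graph()     # stand-in for a biological network
    walks = random_walks(g)
    masked, labels = mask_walk(walks[0])
    print("walk:  ", walks[0])
    print("masked:", masked)
```

The masked walks could then be fed to any masked language model; the network propagation regularization mentioned in the abstract is not reproduced here.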
| Original language | English |
| --- | --- |
| Article number | vbad086 |
| Journal | Bioinformatics Advances |
| Volume | 3 |
| Issue number | 1 |
| DOIs | |
| State | Published - 2023 |
All Science Journal Classification (ASJC) codes
- Structural Biology
- Molecular Biology
- Genetics
- Computer Science Applications