Transformer Neural Networks for Protein Family and Interaction Prediction Tasks

Ananthan Nambiar, Simon Liu, Maeve Heflin, John Malcolm Forsyth, Sergei Maslov, Mark Hopkins, Anna Ritz

Research output: Contribution to journal › Article › peer-review

Abstract

The scientific community is rapidly generating protein sequence information, but only a fraction of these proteins can be experimentally characterized. While promising deep learning approaches for protein prediction tasks have emerged, they either have computational limitations or are designed to solve only a specific task. We present a Transformer neural network that pre-trains task-agnostic sequence representations. This model is fine-tuned to solve two different protein prediction tasks: protein family classification and protein interaction prediction. Our method is comparable to existing state-of-the-art approaches for protein family classification while being much more general than other architectures. Further, our method outperforms other approaches for protein interaction prediction in two of the three scenarios we generated. These results offer a promising framework for fine-tuning the pre-trained sequence representations on other protein prediction tasks.
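To illustrate the general idea behind the approach described in the abstract (a Transformer that turns an amino-acid sequence into a representation, which a small head then classifies), here is a minimal, self-contained sketch of scaled dot-product self-attention over an amino-acid sequence followed by mean pooling and a softmax classification head. All weights, dimensions, and the toy sequence are illustrative assumptions, not the authors' model or parameters.

```python
import math
import random

# Standard 20-amino-acid vocabulary (illustrative tokenization).
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
VOCAB = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

D = 8            # toy embedding dimension (real models use hundreds)
N_CLASSES = 2    # e.g. two hypothetical protein families

random.seed(0)

def rand_matrix(rows, cols):
    """Random weight matrix standing in for learned parameters."""
    return [[random.uniform(-0.5, 0.5) for _ in range(cols)] for _ in range(rows)]

EMBED = rand_matrix(len(AMINO_ACIDS), D)   # per-residue embeddings
W_Q = rand_matrix(D, D)                    # query projection
W_K = rand_matrix(D, D)                    # key projection
W_V = rand_matrix(D, D)                    # value projection
W_OUT = rand_matrix(D, N_CLASSES)          # classification head

def matvec(M, v):
    """Row-vector v times matrix M (len(v) == rows of M)."""
    return [sum(M[j][i] * v[j] for j in range(len(v))) for i in range(len(M[0]))]

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention(X):
    """One head of scaled dot-product self-attention over token embeddings X."""
    Q = [matvec(W_Q, x) for x in X]
    K = [matvec(W_K, x) for x in X]
    V = [matvec(W_V, x) for x in X]
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(D) for k in K]
        weights = softmax(scores)  # attention distribution over all residues
        out.append([sum(w * v[i] for w, v in zip(weights, V)) for i in range(D)])
    return out

def classify(seq):
    """Embed residues, apply attention, mean-pool, and classify."""
    X = [EMBED[VOCAB[aa]] for aa in seq]
    H = self_attention(X)
    pooled = [sum(h[i] for h in H) / len(H) for i in range(D)]
    return softmax(matvec(W_OUT, pooled))

probs = classify("MKTAYIAK")  # toy peptide; output is a class distribution
```

In the fine-tuning setup the abstract describes, the attention layers would be pre-trained on unlabeled sequences, and only a task-specific head (like `W_OUT` here) plus the shared weights would be adapted per task; this sketch collapses all of that into one untrained forward pass.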

Original language: English (US)
Pages (from-to): 95-111
Number of pages: 17
Journal: Journal of Computational Biology
Volume: 30
Issue number: 1
State: Published - Jan 1 2023

Keywords

  • neural networks
  • protein family classification
  • protein-protein interaction prediction

ASJC Scopus subject areas

  • Computational Mathematics
  • Genetics
  • Molecular Biology
  • Computational Theory and Mathematics
  • Modeling and Simulation
