[2603.20825] Cross-Granularity Representations for Biological Sequences: Insights from ESM and BiGCARP
Computer Science > Machine Learning
arXiv:2603.20825 (cs)
[Submitted on 21 Mar 2026]

Title: Cross-Granularity Representations for Biological Sequences: Insights from ESM and BiGCARP
Authors: Hanlin Xiao, Rainer Breitling, Eriko Takano, Mauricio A. Álvarez

Abstract: Recent advances in general-purpose foundation models have stimulated the development of large biological sequence models. While natural language shows symbolic granularity (characters, words, sentences), biological sequences exhibit hierarchical granularity whose levels (nucleotides, amino acids, protein domains, genes) further encode biologically functional information. In this paper, we investigate the integration of cross-granularity knowledge from models through a case study of BiGCARP, a Pfam domain-level model for biosynthetic gene clusters, and ESM, an amino acid-level protein language model. Using representation analysis tools and a set of probe tasks, we first explain why a straightforward cross-model embedding initialization fails to improve downstream performance in BiGCARP, and show that deeper-layer embeddings capture a more contextual and faithful representation of the model's learned knowledge. Furthermore, we demonstrate that representations at different granularities encode complementary biological knowledge, and th...
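The "cross-model embedding initialization" the abstract refers to can be illustrated with a minimal sketch: pool a fine-grained model's per-residue embeddings for a domain's sequence into a single vector, project it to the coarse-grained model's embedding width, and use the result to initialize the domain-level embedding table. Everything below is an assumption for illustration only: the dimensions, the random stand-in for an ESM forward pass, the linear projection, and the example Pfam IDs/sequences are all hypothetical, not the paper's actual procedure.

```python
import numpy as np

# Hypothetical dimensions; real ESM and BiGCARP widths differ.
ESM_DIM = 1280
DOMAIN_DIM = 128

rng = np.random.default_rng(0)

def esm_residue_embeddings(seq: str) -> np.ndarray:
    """Stand-in for an ESM forward pass: one random vector per amino acid."""
    return rng.standard_normal((len(seq), ESM_DIM))

def domain_embedding(seq: str, proj: np.ndarray) -> np.ndarray:
    """Mean-pool residue embeddings, then project to the domain-model width."""
    pooled = esm_residue_embeddings(seq).mean(axis=0)  # shape (ESM_DIM,)
    return pooled @ proj                               # shape (DOMAIN_DIM,)

# Random linear map standing in for a learned or fitted adapter.
proj = rng.standard_normal((ESM_DIM, DOMAIN_DIM)) / np.sqrt(ESM_DIM)

# Initialize a tiny Pfam-domain embedding table from pooled amino-acid vectors.
# Domain IDs and sequences here are placeholders, not real Pfam entries.
pfam_domains = {"PF_A": "MKTAYIAKQR", "PF_B": "GSHMKLVINGK"}
embedding_table = np.stack(
    [domain_embedding(s, proj) for s in pfam_domains.values()]
)
print(embedding_table.shape)  # → (2, 128)
```

One plausible reading of the abstract's finding is that such shallow, token-level pooling discards the context that deeper ESM layers encode, which is consistent with the authors' observation that deeper-layer embeddings are more faithful to the model's learned knowledge.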