[2603.00049] BiJEPA: Bi-directional Joint Embedding Predictive Architecture for Symmetric Representation Learning
Computer Science > Machine Learning
arXiv:2603.00049 (cs)
[Submitted on 10 Feb 2026]

Title: BiJEPA: Bi-directional Joint Embedding Predictive Architecture for Symmetric Representation Learning
Authors: Yongchao Huang

Abstract: Self-Supervised Learning (SSL) has shifted from pixel-level reconstruction to latent space prediction, spearheaded by the Joint Embedding Predictive Architecture (JEPA). While effective, standard JEPA models typically rely on a uni-directional prediction mechanism (e.g. Context $\to$ Target), potentially neglecting the informative signal inherent in the inverse relationship and degrading performance. In this work, we propose \textbf{BiJEPA}, a \textit{Bi-Directional Joint Embedding Predictive Architecture} that enforces cycle-consistent predictability between data segments. We address the inherent instability of symmetric prediction (representation explosion) by introducing a critical norm regularization mechanism on the representation vectors. We evaluate BiJEPA on three distinct modalities: synthetic periodic signals, chaotic Lorenz attractor trajectories, and high-dimensional image data (MNIST). Our results demonstrate that BiJEPA achieves stable convergence without collapse, captures the semantic structure of chaotic systems, and learns robust temporal and spatial representations...
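The abstract describes two core ingredients: symmetric (Context $\to$ Target and Target $\to$ Context) latent prediction, and a norm penalty on the representations to prevent the explosion that symmetric prediction can induce. The paper's code is not reproduced here; the following is a minimal illustrative sketch of such an objective, where the encoder/predictor structure, stop-gradient placement, dimensions, and the exact form of the regularizer are all assumptions rather than the authors' implementation.

```python
# Illustrative sketch only: a bi-directional JEPA-style objective with a norm
# penalty on the representations. Module names, dimensions, and the exact form
# of the regularizer are assumptions, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BiJEPASketch(nn.Module):
    def __init__(self, in_dim=64, emb_dim=32, reg_weight=1e-3):
        super().__init__()
        # Shared encoder for the two data segments.
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, emb_dim)
        )
        # Two predictors: context -> target and target -> context.
        self.pred_fwd = nn.Linear(emb_dim, emb_dim)
        self.pred_bwd = nn.Linear(emb_dim, emb_dim)
        self.reg_weight = reg_weight

    def forward(self, context, target):
        z_c = self.encoder(context)   # context embedding
        z_t = self.encoder(target)    # target embedding
        # Cycle-consistent prediction in both directions (stop-gradient on the
        # prediction targets, as is common in JEPA-style training).
        loss_fwd = F.mse_loss(self.pred_fwd(z_c), z_t.detach())
        loss_bwd = F.mse_loss(self.pred_bwd(z_t), z_c.detach())
        # Norm regularization to keep embeddings from exploding under
        # symmetric prediction (the precise penalty in the paper may differ).
        reg = z_c.norm(dim=-1).pow(2).mean() + z_t.norm(dim=-1).pow(2).mean()
        return loss_fwd + loss_bwd + self.reg_weight * reg

# Toy usage on random "context" and "target" segments.
model = BiJEPASketch()
ctx, tgt = torch.randn(8, 64), torch.randn(8, 64)
loss = model(ctx, tgt)
loss.backward()
```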