🪆 Introduction to Matryoshka Embedding Models
Published February 23, 2024

By Tom Aarsen, Joshua (Xenova), and Omar Sanseviero

In this blog post, we will introduce you to the concept of Matryoshka Embeddings and explain why they are useful. We will discuss how these models are theoretically trained and how you can train them using Sentence Transformers. Additionally, we will provide practical guidance on how to use Matryoshka Embedding models and share a comparison between a Matryoshka embedding model and a regular embedding model. Finally, we invite you to check out our interactive demo that showcases the power of these models.

Table of Contents

- Understanding Embeddings
- 🪆 Matryoshka Embeddings
- 🪆 Matryoshka Dolls
- Why would you use 🪆 Matryoshka Embedding models?
- How are 🪆 Matryoshka Embedding models trained?
  - Theoretically
  - In Sentence Transformers
- How do I use 🪆 Matryoshka Embedding models?
  - Theoretically
  - In Sentence Transformers
- Results
- Demo
- References

Understanding Embeddings

Embeddings are one of the most versatile tools in natural language processing, enabling practitioners to solve a large variety of tasks. In essence, an embedding is a numerical representation of a more complex object, like text, images, audio, etc. The embedding model will always produce embeddings of the same fixed size. You can then compute the similarity of complex objects by computing the similarity of the respective embeddings.
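The idea above can be sketched with plain NumPy: two fixed-size vectors stand in for embeddings, and their cosine similarity serves as the similarity of the underlying objects. This is a minimal illustration, not from the article itself; the vector values are made up, and in practice the embeddings would come from an embedding model such as one loaded with Sentence Transformers.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy fixed-size "embeddings" standing in for model outputs.
# A real embedding model would map each input text to a vector like these.
emb_cat = np.array([0.9, 0.1, 0.3])
emb_kitten = np.array([0.8, 0.2, 0.35])
emb_car = np.array([-0.1, 0.9, -0.2])

# Related concepts score higher than unrelated ones.
print(cosine_similarity(emb_cat, emb_kitten))
print(cosine_similarity(emb_cat, emb_car))
```

Because every input is mapped to a vector of the same fixed size, this one similarity function works regardless of what the original objects were.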