Safetensors is Joining the PyTorch Foundation
Published April 8, 2026 · Luc Georges (mcpotato), Lysandre (lysandre)

Today, we're announcing that Safetensors has joined the PyTorch Foundation as a foundation-hosted project under the Linux Foundation, alongside DeepSpeed, Helion, Ray, vLLM, and PyTorch itself.

How we got here

Safetensors started as a Hugging Face project born out of a concrete need: a way to store and share model weights that couldn't execute arbitrary code. The pickle-based formats that dominated the ecosystem at the time carried a very real risk of running malicious code simply by loading a checkpoint. That risk was tolerable while ML was still a budding field, but it became unacceptable as open model sharing became central to how the ML community works.

The format we built is intentionally simple: a JSON header, with a hard limit of 100 MB, describing tensor metadata, followed by raw tensor data. It supports zero-copy loading that maps tensors directly from disk, and lazy loading so you can read individual weights without deserializing an entire checkpoint.

What we didn't fully anticipate was how broadly the format would be adopted. Today, Safetensors is the default format for model distribution on the Hugging Face Hub and beyond, used by tens of thousands of models across all ML modalities. It has become the preferred way for the open source ML community to share models.

Why the PyTorch Foundation

We want Safetensors to truly be...
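To make the layout described above concrete, here is a minimal sketch of a Safetensors file built and read with only the Python standard library. The file begins with an 8-byte little-endian header length, followed by the JSON header and then the raw tensor bytes; the tensor name "weight" and the data values are illustrative, not taken from any real checkpoint.

```python
import json
import struct

# Build a minimal .safetensors-style blob in memory:
# 8-byte little-endian header size, JSON header, raw tensor data.
raw = bytes(range(16))  # 16 bytes = four float32 values (illustrative)
header = {
    "weight": {
        "dtype": "F32",
        "shape": [4],
        "data_offsets": [0, 16],  # byte range within the data section
    }
}
header_bytes = json.dumps(header).encode("utf-8")
blob = struct.pack("<Q", len(header_bytes)) + header_bytes + raw

# Lazy read: parse only the header, then slice out one tensor's bytes
# without deserializing anything else in the file.
(n,) = struct.unpack("<Q", blob[:8])
meta = json.loads(blob[8 : 8 + n])
start, end = meta["weight"]["data_offsets"]
tensor_bytes = blob[8 + n + start : 8 + n + end]
print(meta["weight"]["shape"], len(tensor_bytes))  # [4] 16
```

Because each tensor's byte range is recorded in the header, a reader can memory-map the file and hand frameworks a view into the data section directly, which is what enables the zero-copy and lazy-loading behavior mentioned above.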