Thinking small: How small language models could lessen the AI energy burden
About this article
According to researchers, for many industries, small language models may offer a host of advantages over energy- and resource-intensive large language models.
While large language models dominate the artificial intelligence discourse, their smaller cousins could offer a meaningful, less energy-intensive alternative.

By Noah Frank
7 Apr 2026

(From left) Student Thinh Pham, Professors Tu Vu and Xuan Wang, and students Gaurav Srivastava and William Quyet Do. Photo courtesy of Xuan Wang.

Note to readers: This series of articles focuses on researchers whose work improves efficiency, addresses concerns, or offers alternative solutions to some of the pressing issues created by data centers.

With the ubiquity of ChatGPT, Claude, Gemini, and xAI, it can be easy to think that the entire artificial intelligence (AI) industry is made up of these large language models (LLMs). But AI predates many of these systems and is far more expansive than LLMs alone, encompassing machine learning, deep learning, computer vision, robotics, and more. Even when it comes to language models themselves, the colossal, energy- and resource-intensive ones aren't the only options available.

According to Virginia Tech researchers, for many industries, small language models (SLMs) may offer a host of advantages. They are open source, with no per-token cost. They can be hosted privately on a local network, removing a layer of cybersecurity risk. And because they don't rely on the cloud, they are more reliable. This local...
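For readers curious what local hosting actually looks like, the sketch below shows the general idea: downloading a small open-weight model once and running it entirely on local hardware. This is a minimal illustration, assuming the open-source Hugging Face transformers library; the particular model named here is one example of a small open model, not one the Virginia Tech team necessarily uses.

```python
# Minimal sketch of running a small language model locally.
# Assumes the Hugging Face `transformers` library is installed;
# the model name is illustrative, not an endorsement.
from transformers import pipeline

# Weights are downloaded once, then cached and run on local hardware:
# no per-token API cost, no cloud dependency.
generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",
)

result = generator(
    "Summarize the advantages of small language models in one sentence.",
    max_new_tokens=64,
)
print(result[0]["generated_text"])
```

Because everything happens on the local machine, prompts and outputs never leave the network, which is the privacy and reliability benefit the researchers describe.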