Hugging Face’s SmolLM models bring powerful AI to your phone, no cloud required
These Models Bring Advanced AI Capabilities to Personal Devices
Hugging Face today unveiled SmolLM, a new family of compact language models that outperform comparable offerings from Microsoft, Meta, and Alibaba’s Qwen. The SmolLM lineup comes in three sizes — 135 million, 360 million, and 1.7 billion parameters — to fit different on-device compute budgets. Despite their small footprint, the models have posted superior results on benchmarks testing common-sense reasoning and world knowledge.
Loubna Ben Allal, lead ML engineer on SmolLM at Hugging Face, emphasized the value of targeted, compact models in an interview with VentureBeat. “We don’t need big foundational models for every task, just like we don’t need a wrecking ball to drill a hole in a wall,” she said. “Small models designed for specific tasks can accomplish a lot.”