H2O-Danube2-1.8B Achieves Top Ranking On Hugging Face Open LLM Leaderboard For 2 Billion (2B) Parameters Range

MOUNTAIN VIEW, Calif., April 12 (Bernama-BUSINESS WIRE) — H2O.ai, the open source leader in Generative AI and machine learning, is proud to announce that its latest open-weights (Apache 2.0) small language model, H2O-Danube2-1.8B, has secured the top position on the Hugging Face Open LLM Leaderboard for models under 2 billion parameters, even surpassing Google's larger Gemma-2B model, which weighs in at roughly 2.5 billion parameters. This achievement underscores H2O.ai’s commitment to advancing AI accessibility and performance through innovative open-source solutions.

H2O-Danube2-1.8B builds on the success of its predecessor, H2O-Danube-1.8B, with notable upgrades and optimizations that have propelled it to the forefront of the 2B small language model (SLM) category. Trained on a vast dataset of 2 trillion high-quality tokens, the model adopts the Mistral architecture with refinements, such as dropping sliding-window attention, to deliver strong performance on natural language processing tasks.


