NeurIPS 2025 E2LM Competition: Early Training Evaluation of Language Models Paper • 2506.07731 • Published Jun 9 • 2
Falcon-H1: A Family of Hybrid-Head Language Models Redefining Efficiency and Performance Paper • 2507.22448 • Published Jul 30 • 66
Article Announcing NeurIPS 2025 E2LM Competition: Early Training Evaluation of Language Models Jul 4 • 10
BitNet 1.58 Collection This collection houses BitNet-1.58, Falcon3-1.58, and Falcon-E quants. • 28 items • Updated Jun 10 • 2
BitVLA: 1-bit Vision-Language-Action Models for Robotics Manipulation Paper • 2506.07530 • Published Jun 9 • 20
Falcon-Arabic Collection 7B models built on top of Falcon3-7B • 3 items • Updated about 1 month ago • 11
Falcon-H1 Collection Falcon-H1 Family of Hybrid-Head Language Models (Transformer-SSM), including 0.5B, 1.5B, 1.5B-Deep, 3B, 7B, and 34B (pretrained & instruction-tuned). • 38 items • Updated about 1 month ago • 56
Article Falcon-H1: A Family of Hybrid-Head Language Models Redefining Efficiency and Performance May 21 • 37
Article Falcon-Edge: A series of powerful, universal, fine-tunable 1.58bit language models. May 15 • 36
Falcon Edge series Collection A series of powerful, universal, and fine-tunable small language models • 7 items • Updated about 1 month ago • 24
BitNet Collection 🔥BitNet family of large language models (1-bit LLMs). • 7 items • Updated May 1 • 53
Falcon3 Collection The Falcon3 family of open foundation models: pretrained and instruct LLMs ranging from 1B to 10B parameters. • 40 items • Updated about 1 month ago • 87
Falcon Mamba: The First Competitive Attention-free 7B Language Model Paper • 2410.05355 • Published Oct 7, 2024 • 35