OLMo 2 1B is the smallest model in the OLMo 2 family. It uses a decoder-only transformer architecture and was trained on 4 trillion tokens. The release includes multiple model checkpoints and fine-tuning options, and is designed for language modeling applications. The model and its associated resources are available on GitHub under the Apache 2.0 license.
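Since the checkpoints are distributed in a Hugging Face `transformers`-compatible format, a minimal usage sketch looks like the following. The model id `allenai/OLMo-2-0425-1B` is an assumption; verify the exact identifier on the model card before use.

```python
# Minimal sketch: loading OLMo 2 1B with Hugging Face transformers.
# The model id below is an assumption; check the repository or model
# card for the exact published identifier.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-2-0425-1B"  # assumed id; verify on the Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Greedy generation from a short prompt as a smoke test.
inputs = tokenizer("Language modeling is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```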
Topics: olmo, language-model, transformers, ai, deep-learning