Arcee AI has launched Trinity Large, a 400-billion-parameter sparse Mixture of Experts (MoE) model, now available for free on OpenRouter for a limited time.
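Since OpenRouter exposes models through an OpenAI-compatible chat completions endpoint, trying the model is a one-request affair. A minimal sketch of building that request follows; note the model slug `arcee-ai/trinity-large` is an assumption for illustration (check OpenRouter's model list for the actual identifier), and the request is only constructed here, not sent.

```python
import json

# OpenRouter's OpenAI-compatible chat completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(prompt: str, model: str = "arcee-ai/trinity-large") -> dict:
    """Build the JSON body for an OpenRouter chat completion call.

    NOTE: the default model slug is a hypothetical placeholder, not
    confirmed by the announcement.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

body = build_request("Explain sparse MoE routing in two sentences.")
print(json.dumps(body, indent=2))
```

Sending it requires only an `Authorization: Bearer <OPENROUTER_API_KEY>` header on a POST to the URL above, e.g. via `requests.post(OPENROUTER_URL, headers=..., json=body)`.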
Mistral AI has launched Devstral 2, a next-generation coding model available in two sizes: Devstral 2 (123B parameters) and Devstral Small 2 (24B parameters), both open-source and designed for cost-efficiency.