Topic: #local LLM

2 articles exploring local LLMs. Expert insights and analysis from our editorial team.


Articles

Open Source

Devstral 2 from Mistral: A Fully Open-Source Coding Agent Model You Can Run on a Laptop

Devstral Small 2 is genuinely Apache 2.0 and fits in 14 GB. The 123B flagship looks open-source but carries a revenue cap most enterprises will violate.

6 min read
Infrastructure & Runtime

MLX vs llama.cpp on Apple Silicon: Which Runtime to Use for Local LLM Inference

MLX delivers 20–87% faster generation on Apple Silicon for models under 14B parameters. llama.cpp wins for cross-platform use and long contexts.

9 min read