Topic
#local-inference
2 articles exploring local inference. Expert insights and analysis from our editorial team.
Articles
Open Source
free-claude-code Routes Claude Code Through NVIDIA NIM and Local Models After Anthropic's CLI Ban
free-claude-code reroutes Claude Code API calls to NVIDIA NIM, OpenRouter, or local backends. The proxy cuts API costs but cannot normalize capability across providers.
Models & Research
Running DeepSeek R1 Locally: Hardware Requirements, Quantization, and Real Throughput
What hardware actually runs DeepSeek R1 at useful speeds? Specific tokens/s benchmarks across GPU configurations, quantization options, and the honest tradeoffs.