Here are the GGUF links to Mistral AI’s “collected works” from the past week – all ready for local use:
Cutting-edge coding models:
- 24B parameters: https://huggingface.co/bartowski/mistralai_Devstral-Small-2-24B-Instruct-2512-GGUF
- 123B parameters: https://huggingface.co/bartowski/mistralai_Devstral-2-123B-Instruct-2512-GGUF
Top-tier reasoning models – perfectly sized for consumer hardware:
- 3B parameters: https://huggingface.co/bartowski/mistralai_Ministral-3-3B-Reasoning-2512-GGUF
- 8B parameters: https://huggingface.co/bartowski/mistralai_Ministral-3-8B-Reasoning-2512-GGUF
- 14B parameters: https://huggingface.co/bartowski/mistralai_Ministral-3-14B-Reasoning-2512-GGUF
Powerful instruct models for local setups:
- 3B parameters: https://huggingface.co/bartowski/mistralai_Ministral-3-3B-Instruct-2512-GGUF
- 8B parameters: https://huggingface.co/bartowski/mistralai_Ministral-3-8B-Instruct-2512-GGUF
- 14B parameters: https://huggingface.co/bartowski/mistralai_Ministral-3-14B-Instruct-2512-GGUF
Mistral’s most advanced instruct model:
Licensing: All models are Apache 2.0, except Devstral 2, which ships under a modified MIT license.
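If you want to try one of these right away, here's a minimal sketch using the Hugging Face CLI and llama.cpp (both assumed to be installed); the Q4_K_M quant filename is an assumption – check the repo's file list for the exact name:

```shell
# Download one quant of the 8B instruct model (filename pattern is an
# assumption -- verify against the repo's "Files" tab).
huggingface-cli download bartowski/mistralai_Ministral-3-8B-Instruct-2512-GGUF \
  --include "*Q4_K_M*" --local-dir ./models

# Run it locally with llama.cpp's CLI.
llama-cli -m ./models/mistralai_Ministral-3-8B-Instruct-2512-Q4_K_M.gguf \
  -p "Write a haiku about local LLMs." -n 128
```

The same two commands work for any of the repos above; just swap the repo ID and pick a quant that fits your VRAM.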
What an insane achievement for a company that’s still small compared to OpenAI! Huge thanks to Mistral AI! <3
💬 Discussion r/LocalLLaMA (698 points, 95 comments)