VRAM Requirements Reference - What can you run with your VRAM? (Contributions welcome)

29 Apr 2025 · 1 min read

  • LocalLLaMA
  • reddit-chad
  • Reddit

I created this resource to help me quickly see which models I can run under certain VRAM constraints.

Check it out here: https://imraf.github.io/ai-model-reference/

I’d like this to be as comprehensive as possible. It’s on GitHub and contributions are welcome!
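For a quick sanity check before consulting the full table, here is a rough back-of-envelope sketch (a hypothetical helper, not part of the linked reference) that estimates VRAM from parameter count, quantization bits, and context length. The layer count, KV dimension, and overhead factor are assumptions you would normally pull from the model card.

```python
# Rough back-of-envelope VRAM estimate for running an LLM locally.
# Hypothetical helper: counts model weights plus a simple KV-cache term
# and a fixed overhead factor for activations/framework buffers.

def estimate_vram_gb(
    params_billion: float,       # model size in billions of parameters
    bits_per_weight: float,      # e.g. 16 (fp16), 8 (Q8_0), 4 (Q4_K)
    context_tokens: int = 4096,  # planned context length
    n_layers: int = 32,          # transformer layers (model-dependent)
    kv_dim: int = 4096,          # KV-cache width per layer (model-dependent)
    overhead: float = 1.1,       # ~10% extra for activations/buffers (assumption)
) -> float:
    weights_gb = params_billion * 1e9 * bits_per_weight / 8 / 1024**3
    # KV cache: 2 (K and V) * layers * context * dim * 2 bytes (fp16 cache)
    kv_gb = 2 * n_layers * context_tokens * kv_dim * 2 / 1024**3
    return (weights_gb + kv_gb) * overhead

# Example: a 7B model quantized to 4 bits with a 4k context
print(f"{estimate_vram_gb(7, 4):.1f} GB")  # roughly 5-6 GB
```

Real-world numbers vary with the quantization scheme, grouped-query attention, and runtime, so treat this as a lower bound and use the reference table for per-model figures.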


💬 Discussion on r/LocalLLaMA (151 points, 41 comments) 🔗 Source

