Bazaroid

Llama-3-8B-Instruct with a 262k context length landed on HuggingFace

Apr 25, 2024 · 1 min read

  • LocalLLaMA
  • reddit-chad
  • Reddit

We just released the first Llama-3-8B-Instruct with a context length of over 262k onto HuggingFace! This model is an early creation from the collaboration between https://crusoe.ai/ and https://gradient.ai.

Link to the model: https://huggingface.co/gradientai/Llama-3-8B-Instruct-262k

Looking forward to community feedback, and new opportunities for advanced reasoning that go beyond needle-in-the-haystack!
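For readers who want to probe the extended context themselves, a minimal sketch of the needle-in-the-haystack setup mentioned above: build a long filler document, bury a "needle" fact at a chosen depth, and ask the model to retrieve it. This is a generic test harness, not the evaluation Gradient or Crusoe used; the filler text, needle, and target length are illustrative choices.

```python
def build_haystack_prompt(needle: str, filler: str,
                          target_words: int, depth: float) -> str:
    """Build a needle-in-a-haystack prompt.

    Repeats `filler` until roughly `target_words` words, inserts `needle`
    at the relative position `depth` (0.0 = start, 1.0 = end), and appends
    a retrieval question.
    """
    filler_words = filler.split()
    repeats = target_words // max(len(filler_words), 1) + 1
    haystack = (filler_words * repeats)[:target_words]
    haystack.insert(int(len(haystack) * depth), needle)
    return (" ".join(haystack)
            + "\n\nQuestion: what is the magic number mentioned above?")

# Example: a 2,000-word haystack with the needle buried in the middle.
needle = "The magic number is 42."
prompt = build_haystack_prompt(
    needle, "The grass is green and the sky is blue.", 2000, 0.5)
```

The resulting `prompt` would then be sent to the model (e.g. via the HuggingFace `transformers` chat pipeline); scaling `target_words` toward the 262k-token limit and sweeping `depth` gives a rough picture of retrieval quality across the context window.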


💬 Discussion on r/LocalLLaMA (55 points, 26 comments)
