Aya

Aya 8B - Details

Last update on 2025-05-20

Aya 8B is a large language model developed by Cohere For AI with 8 billion parameters. It supports 23 languages and is released under the Creative Commons Attribution-NonCommercial 4.0 International (CC-BY-NC-4.0) license, making it accessible for non-commercial use while encouraging multilingual applications.

Description of Aya 8B

Aya 23 8B is an open-weights research release of an instruction-fine-tuned model with highly advanced multilingual capabilities. It pairs a highly performant pre-trained model from the Command family with the recently released Aya Collection, resulting in a powerful multilingual large language model serving 23 languages. The release emphasizes research and non-commercial applications, using its open-weights approach to enable transparency and innovation in multilingual AI development.

Parameters & Context Length of Aya 8B

Aya 8B has 8B parameters, placing it in the mid-scale category of open-source LLMs and offering a balance between performance and resource efficiency for moderately complex tasks. Its 8K context length falls into the moderate range, enabling extended interactions but limiting its handling of very long texts. This combination makes it suitable for applications requiring multilingual support without excessive computational demands, though it may struggle with extremely lengthy or highly complex queries.

  • Parameter Size: 8B
  • Context Length: 8K
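The parameter count translates directly into a rough memory footprint. As a back-of-the-envelope sketch (assuming 2 bytes per parameter at 16-bit precision; actual usage adds activation and KV-cache memory on top of the weights):

```python
# Rough weight-memory estimate for an 8B-parameter model at 16-bit precision.
# Illustrative arithmetic only; real usage adds activation and KV-cache memory.

def fp16_weight_gib(n_params: float) -> float:
    """Approximate weight memory in GiB at 2 bytes per parameter."""
    return n_params * 2 / 2**30

print(f"fp16 weights: ~{fp16_weight_gib(8e9):.1f} GiB")  # ~14.9 GiB for 8B params
```

An unquantized 16-bit build therefore exceeds most consumer GPUs on its own, which is why the quantized builds discussed later reduce the hardware requirements substantially.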

Possible Intended Uses of Aya 8B

Aya 8B is a multilingual large language model with 8B parameters and support for 23 languages, making it a possible tool for tasks such as multilingual text generation, cross-lingual communication, and instruction following. Its ability to handle diverse linguistic contexts suggests possible applications in content creation, language learning, or collaborative projects across regions. These uses remain possible rather than validated and require thorough exploration to ensure alignment with specific needs. The model's design emphasizes flexibility, but its effectiveness in real-world scenarios depends on further testing and adaptation.

  • Intended Uses: multilingual text generation, cross-lingual communication, instruction following
  • Supported Languages: persian, hindi, indonesian, greek, vietnamese, turkish, dutch, romanian, japanese, polish, chinese (simplified & traditional), hebrew, arabic, ukrainian, czech, italian, korean, russian, english, german, portuguese, french, spanish
  • Is Multi-Lingual: yes

Possible Applications of Aya 8B

Building on those intended uses, Aya 8B could serve applications such as creating content in multiple languages, facilitating dialogue between speakers of different languages, assisting with multilingual instruction tasks, or supporting collaborative projects across linguistic boundaries. These applications remain possible rather than proven and require thorough evaluation to ensure they meet specific needs; each must be carefully assessed before deployment.

  • multilingual text generation
  • cross-lingual communication
  • instruction following
  • multilingual collaboration

Quantized Versions & Hardware Requirements of Aya 8B

Aya 8B in its q4 quantized version offers a balanced trade-off between precision and performance, requiring a GPU with at least 16GB of VRAM and a system with 32GB of RAM for smooth operation. This configuration lets the model run efficiently on mid-range hardware, though adequate cooling and a reliable power supply are recommended. The q4 version is particularly suited to users seeking a practical setup without excessive resource demands.

  • Available Quantizations: q2, q3, q4, q5, q6, q8
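As a rough sketch of what these levels imply for weight storage, one can assume the nominal bit width of each level (an assumption for illustration: real GGUF/GGML quantization formats store per-block scales and mix sub-formats, so actual downloads run somewhat larger than these lower-bound estimates):

```python
# Approximate weight-only sizes for an 8B model at nominal bit widths.
# Lower-bound estimates: real quantization formats add per-block scale
# metadata, so actual files are somewhat larger.

PARAMS = 8e9  # 8B parameters

def quant_gib(bits_per_weight: float, n_params: float = PARAMS) -> float:
    """Approximate weight storage in GiB at the given bits per weight."""
    return n_params * bits_per_weight / 8 / 2**30

for level, bits in [("q2", 2), ("q3", 3), ("q4", 4),
                    ("q5", 5), ("q6", 6), ("q8", 8)]:
    print(f"{level}: ~{quant_gib(bits):.1f} GiB")  # q4 prints ~3.7 GiB
```

This makes the trade-off concrete: q4 roughly quarters the 16-bit footprint, while q8 halves it with less precision loss.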

Conclusion

Aya 8B is a multilingual large language model with 8B parameters, supporting 23 languages and released under the Creative Commons Attribution-NonCommercial 4.0 International (CC-BY-NC-4.0) license, making it suitable for non-commercial research and applications. Developed by Cohere For AI, it balances performance and resource efficiency while emphasizing accessibility and transparency in multilingual AI development.

References

Huggingface Model Page
Ollama Model Page

Benchmarks

  • Instruction Following Evaluation (IFEval): 46.99
  • Big Bench Hard (BBH): 20.20
  • Mathematical Reasoning Test (MATH Lvl 5): 1.66
  • General Purpose Question Answering (GPQA): 4.59
  • Multimodal Understanding and Reasoning (MUSR): 8.42
  • Massive Multitask Language Understanding (MMLU-PRO): 14.20
Link: Huggingface - Open LLM Leaderboard
Statistics
  • Huggingface Likes: 415
  • Huggingface Downloads: 34K