Latest Announcements
GPT-OSS: Configurable Reasoning and Agentic Efficiency
Published on 2025-08-05
Trending Models
GPT-OSS 120B
GPT-OSS 120B by OpenAI: a 120 billion parameter model for complex reasoning tasks; supports English only.
Llama3.3 70B Instruct
Llama3.3 70B by Meta: a 70 billion parameter, multilingual model for assistant-style chat, with a 128k context window.
All-MiniLM 22M
All-MiniLM 22M by Sentence Transformers: a compact model for efficient information retrieval.
All-MiniLM 33M
All-MiniLM 33M by Sentence Transformers: a compact, monolingual model for efficient information retrieval.
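Both MiniLM entries target embedding-based information retrieval, where the core operation is ranking documents by cosine similarity between embedding vectors. A minimal sketch in Python with NumPy, using small made-up vectors in place of real model output (a real setup would obtain embeddings from an All-MiniLM model, e.g. via the sentence-transformers library):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 3-dimensional embeddings for illustration only;
# MiniLM models produce vectors with hundreds of dimensions.
query = np.array([0.1, 0.8, 0.3])
docs = {
    "doc_a": np.array([0.1, 0.7, 0.4]),
    "doc_b": np.array([0.9, 0.1, 0.0]),
}

# Rank documents by similarity to the query, best match first.
ranked = sorted(docs, key=lambda d: cosine_similarity(query, docs[d]), reverse=True)
print(ranked)  # doc_a points in nearly the same direction as the query
```

Compact embedding models like these trade some accuracy for speed and memory, which is why they suit large-scale retrieval where every document must be embedded and compared.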
Llama3.2 3B Instruct
Llama3.2 3B Instruct by Meta Llama Enterprise: a multilingual, 3 billion parameter model with context lengths from 8k to 128k, designed for assistant-style chat applications.
Qwen3 0.6B
Qwen3 0.6B: Alibaba's 0.6 billion parameter model for reasoning tasks, with a 32k context window.
Qwen2.5VL 7B
Qwen2.5VL 7B: Alibaba's 7 billion parameter LLM for visual content analysis.
BGE-M3 567M
BGE-M3 567M by BAAI: a bilingual, 567 million parameter model for efficient information retrieval, with an 8k context window.
Qwen3 8B
Qwen3 8B: Alibaba's 8 billion parameter LLM for reasoning and problem-solving; a monolingual model with context lengths up to 8k.
DeepSeek R1 1.5B
DeepSeek R1 1.5B: a bilingual large language model with 1.5 billion parameters, supporting context lengths up to 128k for enhanced code generation and debugging.
Llama3 8B Instruct
Llama3 8B Instruct by Meta Llama: An 8 billion parameter, English-focused model with an 8k context window, ideal for commercial applications.
Llama3 Gradient 8B Instruct
Llama3 Gradient 8B by Meta Llama: an 8 billion parameter, English-focused LLM with an 8k context window, ideal for commercial and research use.