Agent Selection Guide
This guide helps you choose the right AI agents for your Aragora stress-tests based on task type, cost, and capability requirements.
Perspective coverage note: Mistral adds a European lens, while Chinese models such as DeepSeek, Qwen, and Kimi contribute a Chinese perspective (use the providers and API keys listed below).
Available Agents
Primary Providers (Direct API)
| Agent ID | Provider | Model | Best For | Cost |
|---|---|---|---|---|
| anthropic-api | Anthropic | claude-opus-4-5-20251101 | Code review, reasoning | $$ |
| openai-api | OpenAI | gpt-5.2 | General tasks, creativity | $$ |
| gemini | Google | gemini-3-pro-preview | Long context, analysis | $ |
| mistral-api | Mistral | mistral-large-2512 | European compliance, multilingual | $$ |
| grok | xAI | grok-4-latest | Real-time knowledge | $$ |
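As a sketch of how the table above might drive agent selection, here is a small task-to-agent lookup. The mapping keys and the `pick_agent` helper are illustrative assumptions, not part of Aragora's API:

```python
# Illustrative routing table built from the "Best For" column above;
# the task keys and pick_agent helper are hypothetical, not Aragora API.
PRIMARY_AGENTS = {
    "code_review": "anthropic-api",
    "creativity": "openai-api",
    "long_context": "gemini",
    "eu_compliance": "mistral-api",
    "realtime_knowledge": "grok",
}

def pick_agent(task: str, fallback: str = "openrouter") -> str:
    # Fall back to the OpenRouter agent when no primary provider matches.
    return PRIMARY_AGENTS.get(task, fallback)

print(pick_agent("long_context"))        # gemini
print(pick_agent("sentiment_analysis"))  # openrouter
```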
OpenRouter Providers (Fallback/Alternative)
| Agent ID | Model | Best For | Cost |
|---|---|---|---|
| openrouter | model parameter (default: deepseek/deepseek-chat-v3-0324) | Fallback when primary fails | Varies |
| deepseek | deepseek/deepseek-reasoner | Code, math, reasoning | $ |
| deepseek-r1 | deepseek/deepseek-r1 | Chain-of-thought reasoning | $ |
| mistral | mistralai/mistral-large-2411 | Fast, high-quality reasoning | $$ |
| qwen | qwen/qwen3-max | Multilingual, code | $ |
| qwen-max | qwen/qwen3-max | Flagship reasoning | $$ |
| llama | meta-llama/llama-3.3-70b-instruct | General, open weights | $ |
| yi | 01-ai/yi-large | Chinese/English | $ |
Cost Legend: $ = Low ($0.001-0.01/1K tokens), $$ = Medium ($0.01-0.05/1K), $$$ = High ($0.05+/1K)
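The legend's per-1K-token ranges can be turned into a rough budget check before a run. The midpoint rates below are illustrative assumptions drawn from the legend, not billing figures:

```python
# Rough spend estimate per cost tier; rates are midpoints of the
# legend's ranges above and are illustrative only.
TIER_RATES_PER_1K = {"$": 0.005, "$$": 0.03, "$$$": 0.05}  # USD per 1K tokens

def estimate_cost(tier: str, tokens: int) -> float:
    """Approximate USD cost for a run that consumes `tokens` tokens."""
    return TIER_RATES_PER_1K[tier] * tokens / 1000

print(f"${estimate_cost('$$', 50_000):.2f}")  # a mid-tier 50K-token debate
```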
Local Providers (No API Key)
| Agent ID | Model | Best For | Cost |
|---|---|---|---|
| ollama | Local Ollama model | Air-gapped/private deployments | $ |
| lm-studio | Local LM Studio model | Desktop/local inference | $ |
Environment variables:

```shell
export OLLAMA_HOST=http://localhost:11434
export OLLAMA_MODEL=llama2
export LM_STUDIO_HOST=http://localhost:1234
```
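Before pointing agents at those hosts, it can help to confirm they are reachable. This sketch assumes Ollama's standard `/api/tags` endpoint and LM Studio's OpenAI-compatible `/v1/models` endpoint:

```python
# Probe the local servers configured via the environment variables above.
# Endpoint paths (/api/tags, /v1/models) are the servers' standard routes.
import os
import urllib.request

def is_up(url: str) -> bool:
    """Return True if the URL answers with HTTP 200 within 2 seconds."""
    try:
        with urllib.request.urlopen(url, timeout=2) as resp:
            return resp.status == 200
    except OSError:
        return False

ollama = os.environ.get("OLLAMA_HOST", "http://localhost:11434")
lm_studio = os.environ.get("LM_STUDIO_HOST", "http://localhost:1234")
print("ollama reachable:", is_up(f"{ollama}/api/tags"))
print("lm-studio reachable:", is_up(f"{lm_studio}/v1/models"))
```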
CLI usage:

```shell
aragora ask "Review this policy" --agents ollama
aragora ask "Summarize this spec" --agents lm-studio
```
Python autodetection:

```python
import asyncio

from aragora.agents import LocalLLMDetector

async def main() -> None:
    # detect_all() is a coroutine, so it must run inside an event loop.
    status = await LocalLLMDetector().detect_all()
    if status.any_available:
        print(status.recommended_server, status.recommended_model)

asyncio.run(main())
```
API endpoints:

```shell
curl -s http://localhost:8080/api/agents/local
curl -s http://localhost:8080/api/agents/local/status
```