
Models

Use the latest AI models from OpenAI, Google, Anthropic, and Mistral with a single interface.

Quick Start (60 Seconds)

Easiest Way: Use Managed Keys

No API key setup required! Authenticate once and get 100K free tokens to start.

Terminal

```bash
co auth
```

main.py

```python
from connectonion import Agent

# Add the co/ prefix - that's it!
agent = Agent("assistant", model="co/gpt-5")
response = agent.input("Explain quantum computing")
```

output

```
Quantum computing harnesses quantum mechanical phenomena...
```

Bonus: Star our repo for an additional 100K tokens!

Alternative: Bring Your Own Keys

For production or high-volume usage, use your own API keys for direct billing.

Terminal

```bash
export OPENAI_API_KEY="sk-..."
```

main.py

```python
from connectonion import Agent

# Use model names directly
agent = Agent("assistant", model="gpt-5")
response = agent.input("Explain quantum computing")
```

output

```
Quantum computing harnesses quantum mechanical phenomena...
```

Model Comparison

| Model | Provider | Context | Strengths |
|---|---|---|---|
| gpt-5 | OpenAI | 200K | Best for coding & agentic tasks |
| gemini-3-pro-preview | Google | 1M | State-of-the-art reasoning |
| gemini-2.5-pro | Google | 2M | Multimodal, huge context |
| claude-opus-4-5 | Anthropic | 200K | Most capable Claude |
| mistral-large-latest | Mistral | 128K | High-performance European model |
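As a rough illustration, the context sizes in the table above can be encoded in a lookup table to sanity-check prompt length before picking a model. The `CONTEXT_WINDOWS` dict and `fits_context` helper below are illustrative sketches, not part of the ConnectOnion API:

```python
# Context windows from the comparison table above, in tokens.
# Illustrative helper only - not part of ConnectOnion.
CONTEXT_WINDOWS = {
    "gpt-5": 200_000,
    "gemini-3-pro-preview": 1_000_000,
    "gemini-2.5-pro": 2_000_000,
    "claude-opus-4-5": 200_000,
    "mistral-large-latest": 128_000,
}

def fits_context(model: str, prompt_tokens: int) -> bool:
    """Return True when the prompt fits the model's context window."""
    return prompt_tokens <= CONTEXT_WINDOWS.get(model, 0)

print(fits_context("mistral-large-latest", 150_000))  # False: over 128K
print(fits_context("gemini-2.5-pro", 150_000))        # True
```

Token counts here are approximate; actual usable context also depends on the response budget you reserve.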

Default Models

When you don't specify a model, ConnectOnion uses these optimized defaults:

Agent Default

For Agent() class

main.py
```python
# Default: co/gemini-2.5-pro
agent = Agent("assistant")  # Uses co/gemini-2.5-pro

# Same as:
agent = Agent("assistant", model="co/gemini-2.5-pro")
```

Best for agentic tasks with tool calling and complex reasoning

llm_do Default

For llm_do() function

main.py
```python
from connectonion import llm_do

# Default: co/gemini-2.5-flash
result = llm_do("Summarize this text...")

# Same as:
result = llm_do("...", model="co/gemini-2.5-flash")
```

Fast and cost-effective for simple LLM calls

Note: The default models require authentication with co auth. To use your own Gemini API key without the co/ prefix, set the GEMINI_API_KEY environment variable.
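The co/ prefix rule above can be wrapped in a small helper that routes through managed keys only when no provider key is configured. `resolve_model` and `PROVIDER_KEYS` are hypothetical names for illustration, not part of ConnectOnion:

```python
import os

# Illustrative helper - not part of the ConnectOnion API.
PROVIDER_KEYS = ("OPENAI_API_KEY", "GEMINI_API_KEY",
                 "ANTHROPIC_API_KEY", "MISTRAL_API_KEY")

def resolve_model(name: str) -> str:
    """Add the co/ prefix unless a provider key is set in the environment."""
    if any(os.environ.get(k) for k in PROVIDER_KEYS):
        return name          # direct billing with your own key
    return f"co/{name}"      # route through managed keys (co auth)

# agent = Agent("assistant", model=resolve_model("gemini-2.5-pro"))
```

This keeps application code identical across development (managed keys) and production (your own keys).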

Available Models

GPT-5 Series

  • gpt-5: Best for coding and agentic tasks across domains
  • gpt-5-mini: Faster, cost-efficient version for well-defined tasks
  • gpt-5-nano: Fastest, most cost-efficient version

Reasoning Models

  • o4-mini: OpenAI's newest reasoning model

Model Selection Guide

Best for Coding

main.py
```python
agent = Agent("coder", model="gpt-5")  # Alternative: claude-sonnet-4-5
```

Fast Responses

main.py
```python
agent = Agent("quick", model="gpt-5-nano")  # Alternative: gemini-2.5-flash
```

Cost-Optimized

main.py
```python
agent = Agent("budget", model="gpt-5-nano")  # Alternative: gemini-2.5-flash-lite
```

Multimodal

main.py
```python
agent = Agent("vision", model="gemini-2.5-pro")  # Supports: audio, video, images, PDF
```

Two Ways to Use Models

Option 1: Managed Keys

Recommended for getting started

Terminalbash
$co auth
main.py
```python
# Use any model with the co/ prefix
agent = Agent("assistant", model="co/gpt-5")
agent = Agent("assistant", model="co/gemini-2.5-pro")
agent = Agent("assistant", model="co/claude-opus-4-5")
```

✓ 100K free tokens to start

✓ Access to all providers

✓ No API key management

View pricing →

Option 2: Your Own Keys

For production or high-volume usage

Terminal

```bash
export OPENAI_API_KEY="sk-..."
export GEMINI_API_KEY="AIza..."
export ANTHROPIC_API_KEY="sk-ant-..."
export MISTRAL_API_KEY="..."
```
main.py
```python
# Use models without the co/ prefix
agent = Agent("assistant", model="gpt-5")
agent = Agent("assistant", model="gemini-2.5-pro")
agent = Agent("assistant", model="claude-opus-4-5")
```

✓ Production deployments

✓ Direct billing

✓ Existing infrastructure

Smart Model Selection

Automatically select the best model based on your needs:

main.py
```python
def select_model(task_type: str, speed_priority: bool = False) -> str:
    """Select the optimal model based on requirements."""
    if speed_priority:
        return {
            "code": "gpt-5-mini",
            "chat": "gpt-5-nano",
            "analysis": "gemini-2.5-flash",
        }.get(task_type, "gpt-5-nano")
    return {
        "code": "gpt-5",
        "reasoning": "o4-mini",
        "analysis": "claude-opus-4-5",
    }.get(task_type, "gpt-5")

# Use the appropriate model
model = select_model("code", speed_priority=False)
agent = Agent("coder", model=model)
```
output
Selected model: gpt-5

Fallback Chain

Try multiple models if one fails:

main.py
```python
def create_agent_with_fallback(name: str) -> Agent:
    """Try multiple models in order until one succeeds."""
    model_chain = [
        "gpt-5",            # Best overall
        "claude-opus-4-5",  # Strong alternative
        "gemini-2.5-pro",   # Multimodal option
        "gpt-5-mini",       # Faster fallback
    ]
    for model in model_chain:
        try:
            return Agent(name, model=model)
        except Exception as e:
            print(f"Failed with {model}: {e}")
    raise RuntimeError("No models available")

# Uses the best available model
agent = create_agent_with_fallback("assistant")
```
output
Using model: gpt-5

Key Benefits

  • Get started in 60 seconds with managed keys (co auth)
  • 100K free tokens + bonus credits for starring our repo
  • Same pricing as official APIs - see full pricing table or add credits
  • Same code works everywhere - just change the model name or add/remove co/ prefix
  • Tool support identical across all models and providers
  • Easy transition from managed keys to your own keys when ready for production

Star us on GitHub

If ConnectOnion saves you time, a ⭐ goes a long way — and earns you a coffee chat with our founder.