
Connect to Agents

Use any agent, anywhere, as if it were local: connect() creates a proxy to a remote agent with the same interface as a local Agent.

Why connect? Access specialized agents from anywhere, build distributed workflows, scale horizontally across multiple machines.

60-Second Quick Start

Connect to a remote agent with one function call:

use_remote.py
```python
from connectonion import connect

# Connect to a remote agent
remote_agent = connect("0x3d4017c3e843895a92b70aa74d1b7ebc9c982ccf2ec4968cc0cd55f12af4660c")

# Use it like a local agent
result = remote_agent.input("Search for Python documentation")
print(result)
```
Output:
I found extensive Python documentation at docs.python.org covering tutorials,
library reference, and language specifications.

What Just Happened?

Created proxy agent → Acts like a local Agent instance
Connected to relay → WebSocket at wss://oo.openonion.ai/ws/announce
Sent INPUT message → Routed to the remote agent
Received OUTPUT → Got the result back
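The flow above can be sketched in plain Python. FakeRelay and ProxyAgent are illustrative names, not ConnectOnion internals, and the message shapes are simplified; the real SDK speaks WebSocket to the relay at wss://oo.openonion.ai/ws/announce.

```python
import json

class FakeRelay:
    """Stand-in for the relay: routes an INPUT message to a handler, returns OUTPUT."""
    def __init__(self, handler):
        self.handler = handler

    def send(self, message: str) -> str:
        msg = json.loads(message)
        result = self.handler(msg["prompt"])
        return json.dumps({"type": "OUTPUT", "result": result})

class ProxyAgent:
    """Looks like a local agent, but forwards every input() through the relay."""
    def __init__(self, address: str, relay: FakeRelay):
        self.address = address
        self.relay = relay

    def input(self, prompt: str) -> str:
        reply = self.relay.send(json.dumps(
            {"type": "INPUT", "to": self.address, "prompt": prompt}))
        return json.loads(reply)["result"]

# A trivial "remote agent" that just echoes in uppercase
relay = FakeRelay(handler=lambda p: p.upper())
proxy = ProxyAgent("0x3d40...", relay)
print(proxy.input("hello"))  # HELLO
```

The point is only the shape of the round trip: the proxy serializes your prompt, the relay routes it to whoever owns the address, and the result comes back through the same channel.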

Complete Example: Two Terminals

Terminal 1: Host an Agent

host_agent.py
```python
# host_agent.py
from connectonion import Agent, host

def calculate(expression: str) -> str:
    """Perform calculations."""
    return str(eval(expression))  # demo only: eval is unsafe on untrusted input

def get_weather(city: str) -> str:
    """Get weather information."""
    return f"Weather in {city}: Sunny, 72°F"

agent = Agent(
    "assistant",
    tools=[calculate, get_weather],
    system_prompt="You are a helpful assistant."
)

print("Starting agent...")
host(agent)
```
Output:
Starting agent...
Agent 'assistant' hosted at: http://localhost:8000
Address: 0x7a8f9d4c2b1e3f5a6c8d9e0f1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9a0b
P2P Relay: wss://oo.openonion.ai/ws/announce
Waiting for tasks...
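The eval() in the calculate tool keeps the demo short, but it will execute arbitrary Python. If the hosted agent is reachable by untrusted callers, a restricted arithmetic evaluator is safer. A sketch using the standard library ast module (safe_eval is our name, not a ConnectOnion helper):

```python
import ast
import operator

# Whitelist of allowed arithmetic operations
_OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
        ast.Mult: operator.mul, ast.Div: operator.truediv,
        ast.Pow: operator.pow, ast.USub: operator.neg}

def safe_eval(expression: str):
    """Evaluate an arithmetic expression, rejecting anything but numbers and +,-,*,/,**."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        raise ValueError(f"unsupported expression: {expression!r}")
    return walk(ast.parse(expression, mode="eval").body)

print(safe_eval("42 * 17"))  # 714
```

Function calls, attribute access, and names are not in the whitelist, so inputs like `__import__('os')` raise ValueError instead of executing.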

Terminal 2: Connect and Use

use_agent.py
```python
# use_agent.py
from connectonion import connect

# Connect using the agent's address
assistant = connect("0x7a8f9d4c2b1e3f5a6c8d9e0f1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9a0b")

# Use it
result1 = assistant.input("What is 42 * 17?")
print(result1)

result2 = assistant.input("What's the weather in Seattle?")
print(result2)
```
Output:
The result of 42 * 17 is 714.
 
Weather in Seattle: Sunny, 72°F

Common Patterns

1. Connect to Multiple Agents

Build workflows with specialized remote agents:

main.py
```python
from connectonion import connect

# Connect to specialized agents
searcher = connect("0xaaa...")
writer = connect("0xbbb...")
reviewer = connect("0xccc...")

# Use them together
research = searcher.input("Research AI trends")
draft = writer.input(f"Write article about: {research}")
final = reviewer.input(f"Review and improve: {draft}")

print(final)
```

2. Retry on Connection Failure

Handle network failures gracefully:

main.py
```python
import time
from connectonion import connect

def connect_with_retry(address, max_retries=3):
    for attempt in range(max_retries):
        try:
            return connect(address)
        except Exception:
            if attempt < max_retries - 1:
                print(f"Retrying... ({attempt + 1}/{max_retries})")
                time.sleep(2)
            else:
                raise

agent = connect_with_retry("0x7a8f...")
```
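A fixed 2-second sleep can hammer a relay that is still recovering. Exponential backoff spaces retries out instead; a sketch (these helpers are ours, not part of ConnectOnion):

```python
import time

def backoff_delays(max_retries=5, base=1.0, cap=30.0):
    """Exponential delays: base * 2**attempt seconds, capped at cap."""
    return [min(cap, base * 2 ** attempt) for attempt in range(max_retries)]

def retry_with_backoff(fn, max_retries=5):
    """Call fn(), sleeping an increasing delay between failed attempts."""
    delays = backoff_delays(max_retries)
    for attempt, delay in enumerate(delays):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise
            time.sleep(delay)

print(backoff_delays())  # [1.0, 2.0, 4.0, 8.0, 16.0]
```

Usage would look like `agent = retry_with_backoff(lambda: connect("0x7a8f..."))`; adding random jitter to each delay also helps when many clients reconnect at once.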

3. Agent Pool (Load Balancing)

Distribute load across multiple identical agents:

main.py
```python
from connectonion import connect

# Pool of identical agents
agent_addresses = [
    "0xaaa...",
    "0xbbb...",
    "0xccc..."
]

agents = [connect(addr) for addr in agent_addresses]

# Simple round-robin
def get_agent():
    agent = agents.pop(0)
    agents.append(agent)
    return agent

# Use a different agent each time
result1 = get_agent().input("Task 1")
result2 = get_agent().input("Task 2")
result3 = get_agent().input("Task 3")
```
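The pop/append rotation works; itertools.cycle from the standard library expresses the same round-robin more directly. The string values below are stand-ins for connect() proxies:

```python
from itertools import cycle

# Stand-ins for connect(...) proxies
agents = ["agent-a", "agent-b", "agent-c"]
rotation = cycle(agents)

def get_agent():
    """Return the next agent in round-robin order."""
    return next(rotation)

picked = [get_agent() for _ in range(4)]
print(picked)  # ['agent-a', 'agent-b', 'agent-c', 'agent-a']
```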

Multi-Turn Conversations

Remote agents maintain conversation state across multiple input() calls:

main.py
```python
from connectonion import connect

remote = connect("0x7a8f...")

# Turn 1
response1 = remote.input("Calculate 100 + 50")
print(response1)

# Turn 2 - remembers context
response2 = remote.input("Multiply that by 2")
print(response2)
```
Output:
The result is 150
 
The result is 300

Real-World: Distributed Workflow

Local orchestrator using remote specialized agents:

main.py
```python
from connectonion import Agent, connect

# Local orchestrator agent
def run_workflow(task: str) -> str:
    """Run distributed workflow."""

    # Connect to remote specialized agents
    researcher = connect("0xaaa...")
    analyst = connect("0xbbb...")
    writer = connect("0xccc...")

    # Step 1: Research
    research = researcher.input(f"Research: {task}")

    # Step 2: Analyze
    analysis = analyst.input(f"Analyze this data: {research}")

    # Step 3: Write report
    report = writer.input(f"Write report based on: {analysis}")

    return report

# Local agent with access to remote agents via tool
orchestrator = Agent("orchestrator", tools=[run_workflow])

# User just talks to local agent
result = orchestrator.input("Create a report on AI market trends")
print(result)
```

Configuration

Default Relay (Production)

main.py
```python
# Uses wss://oo.openonion.ai/ws/announce by default
agent = connect("0x7a8f...")
```

Local Relay (Development)

main.py
```python
# Connect to local relay server
agent = connect("0x7a8f...", relay_url="ws://localhost:8000/ws/announce")
```

Environment-Based

main.py
```python
import os

relay_url = os.getenv(
    "RELAY_URL",
    "wss://oo.openonion.ai/ws/announce"
)

agent = connect("0x7a8f...", relay_url=relay_url)
```

Local vs Remote Agents

Local Agent

main.py
```python
from connectonion import Agent

agent = Agent("local",
              tools=[search, calculate])  # tools defined locally

result = agent.input("task")
```

+ No network latency
+ Works offline
− Limited to one machine
− No sharing

Remote Agent

main.py
```python
from connectonion import connect

agent = connect("0x7a8f...")

result = agent.input("task")
```

+ Access from anywhere
+ Share across team
− Network latency
− Requires connectivity

TypeScript SDK

The connectonion npm package provides the same connect() interface for TypeScript and JavaScript:

terminal
```shell
npm install connectonion
```

Basic Usage

app.ts
```typescript
import { connect } from 'connectonion'

// Connect to a hosted agent
const agent = connect("0x3d4017c3e843...")

// Send a message and get a response
const response = await agent.input("What is Python?")
console.log(response.text)
```

Direct Connection (Deployed Agents)

direct.ts
```typescript
import { connect } from 'connectonion'

// Connect directly to a deployed agent (bypasses relay)
const agent = connect("my-agent", {
  directUrl: "https://my-agent.example.com"
})

const response = await agent.input("Hello!")
console.log(response.text)
```

Streaming Events

While the agent works, events stream in real-time via the ui property. Each event is a ChatItem:

| Event Type | Description |
| --- | --- |
| user | User message |
| agent | Agent response text |
| thinking | LLM thinking/reasoning |
| tool_call | Tool execution with name, args, result |
| ask_user | Agent asking a question (with options) |
| approval_needed | Tool requires user approval before running |
| plan_review | Agent presenting a plan for review |

React Hook: useAgent()

The SDK includes a React hook that wraps connect() with state management and localStorage persistence:

ChatPage.tsx
```tsx
import { useAgent } from 'connectonion/react'

function ChatPage() {
  const {
    ui,            // ChatItem[] — streaming events
    status,        // 'idle' | 'working' | 'waiting'
    isProcessing,  // true while agent is working
    mode,          // approval mode
    input,         // send a message
    respond,       // answer ask_user
    respondToApproval,
    reset,         // clear conversation
  } = useAgent("0x3d4017c3e843...", {
    sessionId: "my-session-123"  // auto-persisted to localStorage
  })

  return (
    <div>
      {/* Render streaming events */}
      {ui.map(item => {
        if (item.type === 'user') return <UserMsg key={item.id}>{item.content}</UserMsg>
        if (item.type === 'agent') return <AgentMsg key={item.id}>{item.content}</AgentMsg>
        if (item.type === 'thinking') return <Thinking key={item.id} />
        if (item.type === 'tool_call') return <ToolCall key={item.id} name={item.name} />
        if (item.type === 'ask_user') return (
          <AskUser
            key={item.id}
            question={item.text}
            options={item.options}
            onAnswer={(answer) => respond(answer)}
          />
        )
        return null
      })}

      {/* Input */}
      <input onSubmit={(msg) => input(msg)} disabled={isProcessing} />
    </div>
  )
}
```

Session persistence: The hook automatically saves conversation state to localStorage using the sessionId. Page refreshes restore the full conversation.

Interactive Features

Agents can ask questions, request approval for dangerous tools, and present plans for review. Here's how to handle each:

Ask User

Agent needs information from the user:

app.ts
```typescript
// Agent sends: { type: 'ask_user', text: 'Which city?', options: ['Sydney', 'Tokyo'] }

// Respond with:
respond("Sydney")

// Or multiple selections:
respond(["Sydney", "Tokyo"])
```

Tool Approval

Agent wants to run a tool that needs permission:

app.ts
```typescript
// Agent sends: { type: 'approval_needed', tool: 'shell', arguments: { cmd: 'rm -rf /tmp' } }

// Approve once:
respondToApproval(true, 'once')

// Approve for entire session:
respondToApproval(true, 'session')

// Reject with feedback:
respondToApproval(false, 'once', 'reject_explain', 'Too dangerous')
```

Plan Review

Agent presenting a plan before executing:

app.ts
```typescript
// Agent sends: { type: 'plan_review', plan_content: '1. Research\n2. Analyze\n3. Report' }

// Approve and continue:
respondToPlanReview("Looks good, proceed")

// Request changes:
respondToPlanReview("Skip step 2, go straight to report")
```

oo-chat: Open-Source Reference Client

oo-chat is an open-source Next.js chat client built on the TypeScript SDK. It's a complete working example of how to build a chat UI for ConnectOnion agents.

oo-chat/
├── app/[address]/[sessionId]/page.tsx   ← session page (uses useAgentSDK)
├── components/chat/
│   ├── chat.tsx                         ← main Chat component
│   ├── chat-input.tsx                   ← message input
│   ├── chat-messages.tsx                ← message list
│   ├── use-agent-sdk.ts                 ← wrapper hook around useAgent()
│   └── messages/
│       ├── tool-call.tsx                ← tool call rendering
│       └── tools/plan-card.tsx          ← plan review UI
└── package.json                         ← depends on connectonion

How oo-chat Connects

app/[address]/[sessionId]/page.tsx
```tsx
// app/[address]/[sessionId]/page.tsx
import { useAgentSDK } from '@/components/chat/use-agent-sdk'

export default function ChatSession({ params }) {
  const { address, sessionId } = params

  const {
    ui,
    isLoading,
    elapsedTime,
    pendingAskUser,
    pendingApproval,
    pendingPlanReview,
    mode,
    send,
    respondToAskUser,
    respondToApproval,
    respondToPlanReview,
    setMode,
    clear,
  } = useAgentSDK({ agentAddress: address, sessionId })

  return (
    <Chat
      ui={ui}
      isLoading={isLoading}
      elapsedTime={elapsedTime}
      onSend={(msg, images) => send(msg, images)}
      pendingAskUser={pendingAskUser}
      onAskUserResponse={respondToAskUser}
      pendingApproval={pendingApproval}
      onApprovalResponse={respondToApproval}
      pendingPlanReview={pendingPlanReview}
      onPlanReviewResponse={respondToPlanReview}
      mode={mode}
      onModeChange={setMode}
    />
  )
}
```

Architecture

┌──────────────────────────────────────────────────┐
│  oo-chat (Next.js)                               │
│                                                   │
│  page.tsx                                         │
│    └─ useAgentSDK()     ← elapsed time, pending   │
│         └─ useAgent()   ← connectonion/react      │
│              └─ connect()  ← WebSocket to agent   │
│                                                   │
│  <Chat />                                         │
│    ├─ <ChatMessages />  ← renders ui: ChatItem[]  │
│    ├─ <AskUser />       ← from pendingAskUser     │
│    ├─ <Approval />      ← from pendingApproval    │
│    └─ <ChatInput />     ← calls send()            │
└──────────────────────────────────────────────────┘
         │ WebSocket
         ▼
┌──────────────────────────────────────────────────┐
│  Hosted Agent (Python)                            │
│  host(agent)                                      │
└──────────────────────────────────────────────────┘

Ready to Use Remote Agents?

Python, TypeScript, or React — connect to any agent with one function call.

Enjoying ConnectOnion?

⭐ Star us on GitHub = ☕ Coffee chat with our founder. We love meeting builders.