
Connect to Agents

Use any agent, anywhere, as if it were local. connect() creates a proxy to a remote agent that exposes the same interface as a local Agent.

Why connect? Access specialized agents from anywhere, build distributed workflows, scale horizontally across multiple machines.

60-Second Quick Start

Connect to a remote agent with one function call:

use_remote.py
from connectonion import connect

# Connect to a remote agent
remote_agent = connect("0x3d4017c3e843895a92b70aa74d1b7ebc9c982ccf2ec4968cc0cd55f12af4660c")

# Use it like a local agent
result = remote_agent.input("Search for Python documentation")
print(result)
output
I found extensive Python documentation at docs.python.org covering tutorials,
library reference, and language specifications.

What Just Happened?

Created proxy agent → Acts like a local Agent instance
Connected to relay → WebSocket at wss://oo.openonion.ai/ws/announce
Sent INPUT message → Routed to the remote agent
Received OUTPUT → Got the result back
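
Under the hood, the proxy speaks a small WebSocket protocol with the relay. The sketch below illustrates that round trip; the message fields and handshake shown are simplifying assumptions, not the SDK's actual wire format, and connect() handles all of this (plus addressing, signing, and reconnects) for you.

sketch.py
# sketch.py (illustration only): the "type", "to", "prompt", and "result"
# fields are assumptions, not the SDK's real wire format.
import asyncio
import json
import websockets  # pip install websockets

RELAY = "wss://oo.openonion.ai/ws/announce"

async def ask_remote(address: str, prompt: str) -> str:
    async with websockets.connect(RELAY) as ws:
        # Send an INPUT message routed to the remote agent's address
        await ws.send(json.dumps({"type": "INPUT", "to": address, "prompt": prompt}))
        # Wait for the relay to route back the agent's OUTPUT message
        reply = json.loads(await ws.recv())
        return reply.get("result", "")

# asyncio.run(ask_remote("0x3d4017...", "Search for Python documentation"))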

Complete Example: Two Terminals

Terminal 1: Host an Agent

host_agent.py
# host_agent.py
from connectonion import Agent, host

def calculate(expression: str) -> str:
    """Perform calculations."""
    return str(eval(expression))

def get_weather(city: str) -> str:
    """Get weather information."""
    return f"Weather in {city}: Sunny, 72°F"

agent = Agent(
    "assistant",
    tools=[calculate, get_weather],
    system_prompt="You are a helpful assistant."
)

print("Starting agent...")
host(agent)
output
Starting agent...
Agent 'assistant' hosted at: http://localhost:8000
Address: 0x7a8f9d4c2b1e3f5a6c8d9e0f1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9a0b
P2P Relay: wss://oo.openonion.ai/ws/announce
Waiting for tasks...

Terminal 2: Connect and Use

use_agent.py
# use_agent.py
from connectonion import connect

# Connect using the agent's address
assistant = connect("0x7a8f9d4c2b1e3f5a6c8d9e0f1a2b3c4d5e6f7a8b9c0d1e2f3a4b5c6d7e8f9a0b")

# Use it
result1 = assistant.input("What is 42 * 17?")
print(result1)

result2 = assistant.input("What's the weather in Seattle?")
print(result2)
output
The result of 42 * 17 is 714.
 
Weather in Seattle: Sunny, 72°F

Common Patterns

1. Connect to Multiple Agents

Build workflows with specialized remote agents:

main.py
from connectonion import connect

# Connect to specialized agents
searcher = connect("0xaaa...")
writer = connect("0xbbb...")
reviewer = connect("0xccc...")

# Use them together
research = searcher.input("Research AI trends")
draft = writer.input(f"Write article about: {research}")
final = reviewer.input(f"Review and improve: {draft}")
print(final)

2. Retry on Connection Failure

Handle network failures gracefully:

main.py
import time
from connectonion import connect

def connect_with_retry(address, max_retries=3):
    for attempt in range(max_retries):
        try:
            return connect(address)
        except Exception:
            if attempt < max_retries - 1:
                print(f"Retrying... ({attempt + 1}/{max_retries})")
                time.sleep(2)
            else:
                raise

agent = connect_with_retry("0x7a8f...")

3. Agent Pool (Load Balancing)

Distribute load across multiple identical agents:

main.py
from connectonion import connect

# Pool of identical agents
agent_addresses = [
    "0xaaa...",
    "0xbbb...",
    "0xccc..."
]
agents = [connect(addr) for addr in agent_addresses]

# Simple round-robin
def get_agent():
    agent = agents.pop(0)
    agents.append(agent)
    return agent

# Use a different agent each time
result1 = get_agent().input("Task 1")
result2 = get_agent().input("Task 2")
result3 = get_agent().input("Task 3")

Multi-Turn Conversations

Remote agents maintain conversation state across multiple input() calls:

main.py
remote = connect("0x7a8f...")

# Turn 1
response1 = remote.input("Calculate 100 + 50")
print(response1)

# Turn 2 - remembers context
response2 = remote.input("Multiply that by 2")
print(response2)
output
The result is 150
 
The result is 300

Real-World: Distributed Workflow

Local orchestrator using remote specialized agents:

main.py
from connectonion import Agent, connect

# Local orchestrator agent
def run_workflow(task: str) -> str:
    """Run distributed workflow."""
    # Connect to remote specialized agents
    researcher = connect("0xaaa...")
    analyst = connect("0xbbb...")
    writer = connect("0xccc...")

    # Step 1: Research
    research = researcher.input(f"Research: {task}")

    # Step 2: Analyze
    analysis = analyst.input(f"Analyze this data: {research}")

    # Step 3: Write report
    report = writer.input(f"Write report based on: {analysis}")

    return report

# Local agent with access to remote agents via tool
orchestrator = Agent("orchestrator", tools=[run_workflow])

# User just talks to the local agent
result = orchestrator.input("Create a report on AI market trends")
print(result)

Configuration

Default Relay (Production)

main.py
# Uses wss://oo.openonion.ai/ws/announce by default
agent = connect("0x7a8f...")

Local Relay (Development)

main.py
# Connect to a local relay server
agent = connect("0x7a8f...", relay_url="ws://localhost:8000/ws/announce")

Environment-Based

main.py
import os

relay_url = os.getenv(
    "RELAY_URL",
    "wss://oo.openonion.ai/ws/announce"
)
agent = connect("0x7a8f...", relay_url=relay_url)

Local vs Remote Agents

Local Agent

main.py
from connectonion import Agent

agent = Agent("local", tools=[search, calculate])
result = agent.input("task")

+ No network latency

+ Works offline

- Limited to one machine

- No sharing

Remote Agent

main.py
from connectonion import connect

agent = connect("0x7a8f...")
result = agent.input("task")

+ Access from anywhere

+ Share across team

- Network latency

- Requires connectivity

TypeScript SDK

The connectonion npm package provides the same connect() interface for TypeScript and JavaScript:

terminal
npm install connectonion

Basic Usage

TS
app.ts
import { connect } from 'connectonion'

// Connect to a hosted agent
const agent = connect("0x3d4017c3e843...")

// Send a message and get a response
const response = await agent.input("What is Python?")
console.log(response.text)

Direct Connection (Deployed Agents)

TS
direct.ts
import { connect } from 'connectonion'

// Connect directly to a deployed agent (bypasses relay)
const agent = connect("my-agent", {
  directUrl: "https://my-agent.example.com"
})

const response = await agent.input("Hello!")
console.log(response.text)

Streaming Events

While the agent works, events stream in real-time via the ui property. Each event is a ChatItem:

Event Type        Description
user              User message
agent             Agent response text
thinking          LLM thinking/reasoning
tool_call         Tool execution with name, args, result
ask_user          Agent asking a question (with options)
approval_needed   Tool requires user approval before running
plan_review       Agent presenting a plan for review

React Hook: useAgentForHuman()

The SDK includes a React hook that wraps connect() with state management and localStorage persistence:

ChatPage.tsx
import { useAgentForHuman } from 'connectonion/react'

function ChatPage() {
  const {
    ui,                 // ChatItem[] — streaming events
    status,             // 'idle' | 'working' | 'waiting'
    isProcessing,       // true while agent is working
    mode,               // approval mode
    input,              // send a message
    respond,            // answer ask_user
    respondToApproval,
    reset,              // clear conversation
  } = useAgentForHuman("0x3d4017c3e843...", {
    sessionId: "my-session-123"  // auto-persisted to localStorage
  })

  return (
    <div>
      {/* Render streaming events */}
      {ui.map(item => {
        if (item.type === 'user') return <UserMsg key={item.id}>{item.content}</UserMsg>
        if (item.type === 'agent') return <AgentMsg key={item.id}>{item.content}</AgentMsg>
        if (item.type === 'thinking') return <Thinking key={item.id} />
        if (item.type === 'tool_call') return <ToolCall key={item.id} name={item.name} />
        if (item.type === 'ask_user') return (
          <AskUser
            key={item.id}
            question={item.text}
            options={item.options}
            onAnswer={(answer) => respond(answer)}
          />
        )
        return null
      })}

      {/* Input */}
      <input onSubmit={(msg) => input(msg)} disabled={isProcessing} />
    </div>
  )
}

Session persistence: The hook automatically saves conversation state to localStorage using the sessionId. Page refreshes restore the full conversation.

Interactive Features

Agents can ask questions, request approval for dangerous tools, and present plans for review. Here's how to handle each:

Ask User

Agent needs information from the user:

TS
app.ts
// Agent sends:
// { type: 'ask_user', text: 'Which city?', options: ['Sydney', 'Tokyo'] }

// Respond with:
respond("Sydney")

// Or multiple selections:
respond(["Sydney", "Tokyo"])

Tool Approval

Agent wants to run a tool that needs permission:

TS
app.ts
// Agent sends:
// { type: 'approval_needed', tool: 'shell', arguments: { cmd: 'rm -rf /tmp' } }

// Approve once:
respondToApproval(true, 'once')

// Approve for entire session:
respondToApproval(true, 'session')

// Reject with feedback:
respondToApproval(false, 'once', 'reject_explain', 'Too dangerous')

Plan Review

Agent presents a plan before executing it:

TS
app.ts
// Agent sends:
// { type: 'plan_review', plan_content: '1. Research\n2. Analyze\n3. Report' }

// Approve and continue:
respondToPlanReview("Looks good, proceed")

// Request changes:
respondToPlanReview("Skip step 2, go straight to report")

Sending Files

All SDKs support sending files alongside prompts. Files are base64-encoded and sent inline as data URLs.

Python

send_file.py
import base64
from connectonion import connect

agent = connect("0x...")

# Read file and convert to data URL
with open("report.pdf", "rb") as f:
    data_url = f"data:application/pdf;base64,{base64.b64encode(f.read()).decode()}"

response = agent.input("Summarize this document", files=[
    {"name": "report.pdf", "data": data_url}
])

TypeScript

TS
send_file.ts
import { connect, FileAttachment } from 'connectonion'

const agent = connect('0x...')

// From a File object (browser)
const file = fileInput.files[0]
const reader = new FileReader()

reader.onload = async () => {
  const attachment: FileAttachment = {
    name: file.name,
    type: file.type,
    size: file.size,
    dataUrl: reader.result as string,
  }

  const response = await agent.input('Summarize this', { files: [attachment] })
}

reader.readAsDataURL(file)

React (useAgentForHuman)

FileUpload.tsx
import { useAgentForHuman, FileAttachment } from 'connectonion/react'

function Chat() {
  const { input } = useAgentForHuman('0x...', { sessionId: 'my-session' })

  const handleFileUpload = (e: React.ChangeEvent<HTMLInputElement>) => {
    const file = e.target.files?.[0]
    if (!file) return

    const reader = new FileReader()
    reader.onload = () => {
      const attachment: FileAttachment = {
        name: file.name,
        type: file.type,
        size: file.size,
        dataUrl: reader.result as string,
      }
      input('Analyze this file', { files: [attachment] })
    }
    reader.readAsDataURL(file)
  }

  return <input type="file" onChange={handleFileUpload} />
}

FileAttachment Type

TS
types.ts
interface FileAttachment {
  name: string     // Filename (e.g. "report.pdf")
  type: string     // MIME type (e.g. "application/pdf")
  size: number     // File size in bytes
  dataUrl: string  // Base64 data URL: "data:<mime>;base64,..."
}

How It Works

Client                              Server
  │                                    │
  │  Convert file to base64 data URL   │
  │                                    │
  │── INPUT ──────────────────────────►│
  │  { prompt, files: [               │
  │    { name, data: "data:...;base64,│
  │      ..." }                        │
  │  ]}                                │
  │                                    │
  │                 Validate file limits│
  │                 Decode base64      │
  │                 Save to .co/uploads│
  │                 Tell agent paths   │
  │                                    │
  │                 Agent reads files  │
  │                 via read_file tool │
  │                                    │
  │◄── OUTPUT ────────────────────────│
  │  { result }                        │

File limits: Default 10MB per file, 10 files per request. Check an agent's limits via GET /info (the accepted_inputs.files field). On the server, files are saved to .co/uploads/ and the agent reads them via tools. See host() for server-side details.
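
Before encoding a large upload, you can check what the host accepts. Below is a minimal sketch, assuming the agent's HTTP endpoint printed by host() is reachable and that GET /info returns an accepted_inputs.files object; the exact fields inside it are not specified here, so treat them as assumptions.

check_limits.py
# check_limits.py (sketch): the fields inside accepted_inputs.files are assumptions.
import requests

AGENT_URL = "http://localhost:8000"  # URL printed by host(agent)

info = requests.get(f"{AGENT_URL}/info", timeout=5).json()
file_limits = info.get("accepted_inputs", {}).get("files", {})

print("File limits reported by the agent:", file_limits)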

oo-chat: Open-Source Reference Client

oo-chat is an open-source Next.js chat client built on the TypeScript SDK. It's a complete working example of how to build a chat UI for ConnectOnion agents.

oo-chat/
├── app/[address]/[sessionId]/page.tsx   ← session page (uses useAgentForHumanSDK)
├── components/chat/
│   ├── chat.tsx                         ← main Chat component
│   ├── chat-input.tsx                   ← message input
│   ├── chat-messages.tsx                ← message list
│   ├── use-agent-sdk.ts                 ← wrapper hook around useAgentForHuman()
│   └── messages/
│       ├── tool-call.tsx                ← tool call rendering
│       └── tools/plan-card.tsx          ← plan review UI
└── package.json                         ← depends on connectonion

How oo-chat Connects

app/[address]/[sessionId]/page.tsx
// app/[address]/[sessionId]/page.tsx
import { useAgentForHumanSDK } from '@/components/chat/use-agent-sdk'

export default function ChatSession({ params }) {
  const { address, sessionId } = params

  const {
    ui,
    isLoading,
    elapsedTime,
    pendingAskUser,
    pendingApproval,
    pendingPlanReview,
    mode,
    send,
    respondToAskUser,
    respondToApproval,
    respondToPlanReview,
    setMode,
    clear,
  } = useAgentForHumanSDK({ agentAddress: address, sessionId })

  return (
    <Chat
      ui={ui}
      isLoading={isLoading}
      elapsedTime={elapsedTime}
      onSend={(msg, images) => send(msg, images)}
      pendingAskUser={pendingAskUser}
      onAskUserResponse={respondToAskUser}
      pendingApproval={pendingApproval}
      onApprovalResponse={respondToApproval}
      pendingPlanReview={pendingPlanReview}
      onPlanReviewResponse={respondToPlanReview}
      mode={mode}
      onModeChange={setMode}
    />
  )
}

Architecture

┌──────────────────────────────────────────────────────────┐
│  oo-chat (Next.js)                                       │
│                                                          │
│  page.tsx                                                │
│    └─ useAgentForHumanSDK()     ← elapsed time, pending  │
│         └─ useAgentForHuman()   ← connectonion/react     │
│              └─ connect()       ← WebSocket to agent     │
│                                                          │
│  <Chat />                                                │
│    ├─ <ChatMessages />  ← renders ui: ChatItem[]         │
│    ├─ <AskUser />       ← from pendingAskUser            │
│    ├─ <Approval />      ← from pendingApproval           │
│    └─ <ChatInput />     ← calls send()                   │
└──────────────────────────────────────────────────────────┘
         │ WebSocket
         ▼
┌──────────────────────────────────────────────────────────┐
│  Hosted Agent (Python)                                   │
│  host(agent)                                             │
└──────────────────────────────────────────────────────────┘

Ready to Use Remote Agents?

Python, TypeScript, or React — connect to any agent with one function call.

Star us on GitHub

If ConnectOnion saves you time, a ⭐ goes a long way — and earns you a coffee chat with our founder.