Call your OpenClaw by phone: a quick look at ElevenLabs Agents’ demo
ElevenLabs’ recent post shows a neat way to turn an OpenClaw coding agent into something you can call, talk to, and ask to take actions on your behalf. The writeup keeps the idea simple: let ElevenLabs handle voice, telephony, and turn-taking, and let OpenClaw stay focused on tools, memory, and skills. It’s a clean split, and it makes the whole system feel more like a person you can actually pick up the phone and call.
The architecture is straightforward. ElevenLabs Agents handles speech synthesis, speech recognition, phone integration, and conversation flow, while OpenClaw exposes an OpenAI-style /v1/chat/completions endpoint and acts as the brain. The post lists the prerequisites: an ElevenLabs account, a running OpenClaw instance, ngrok to expose the local gateway, and a Twilio number if you want a phone line.
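To make the "OpenAI-style endpoint" part concrete, here is a small sketch of the kind of request body a Custom LLM integration would POST to /v1/chat/completions. The field names follow the standard OpenAI Chat Completions format; the model name and messages are placeholders, not values from the post:

```python
import json

# Hypothetical sketch of an OpenAI-style chat completions request body.
# "openclaw" is a placeholder model identifier, not a documented value.
payload = {
    "model": "openclaw",
    "messages": [
        {"role": "system", "content": "You are my voice assistant."},
        {"role": "user", "content": "Remember to water the plants tonight."},
    ],
    "stream": True,  # voice agents typically stream tokens to cut latency
}

# Serialize the payload the way an HTTP client would before POSTing it.
body = json.dumps(payload)
print(body)
```

Because the endpoint speaks this widely used format, ElevenLabs can treat OpenClaw like any other chat-completions backend, with no custom adapter in between.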
How the setup works, step by step:

1. Enable the chat completions endpoint in openclaw.json.
2. Start an ngrok tunnel so ElevenLabs can reach the gateway.
3. Configure a Custom LLM in ElevenLabs to point at the ngrok URL, using the OpenClaw token.
4. Attach a Twilio number in the agent’s Phone settings.

After that, the ElevenLabs agent routes each conversation turn through OpenClaw, so the assistant keeps full context. The thread even notes that a coding agent could automate these steps for you, making the whole process smoother.
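Once the tunnel is up, each turn is just an authenticated POST from ElevenLabs to the gateway. As a minimal sketch, with a placeholder ngrok URL and token (the post doesn’t show real values), the request would look roughly like this:

```python
import urllib.request

# Placeholders: substitute your actual ngrok tunnel URL and OpenClaw token.
NGROK_URL = "https://example.ngrok-free.app/v1/chat/completions"
OPENCLAW_TOKEN = "YOUR_OPENCLAW_TOKEN"

# Build the request ElevenLabs' Custom LLM integration would send per turn:
# a bearer-authenticated JSON POST in the chat completions format.
req = urllib.request.Request(
    NGROK_URL,
    data=b'{"model": "openclaw", "messages": [{"role": "user", "content": "ping"}]}',
    headers={
        "Authorization": f"Bearer {OPENCLAW_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(req) would actually send the turn; it is left out
# here because the URL above is a placeholder, not a live tunnel.
print(req.get_method(), req.full_url)
```

This is also the request a coding agent could fire off on your behalf when automating the setup, as the thread suggests.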
There’s something friendly about this idea, practical in ways that matter. Imagine asking your Claw to remember a note while driving, or to run a quick code snippet and report back, all over an actual phone call. For readers who want the original thread, see the demo here: https://x.com/ElevenLabsDevs/status/2018798792485880209.
Looking ahead, tying voice-first interfaces to capable agents like OpenClaw nudges development toward more natural, on-the-go workflows, and that’s exciting.


