GitHub – cloudflare/moltworker: Run Moltbot (formerly Clawdbot) on Cloudflare Workers


If you’ve ever tried to keep a personal AI assistant running reliably, you already know the pain. Servers need babysitting. Containers restart at the worst moments. And that “always on” promise somehow turns into weekend maintenance. That’s why this experimental project from Cloudflare is interesting in a very real, practical way.

The GitHub repository cloudflare/moltworker shows how Moltbot, formerly known as Clawdbot, can run inside a Cloudflare Workers Sandbox. In plain terms, it’s a proof of concept that lets Moltbot live in a managed environment, without you having to self-host or keep a server alive at 3 a.m.

Moltbot itself is a personal AI assistant built around a gateway architecture. That means it can connect to multiple chat platforms through one brain. The project packages this assistant into a Cloudflare Sandbox container, giving you a deployment that’s always available, automatically managed, and surprisingly lightweight. No traditional server setup. No constant patching.
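To make the gateway idea a bit more concrete, here is a minimal TypeScript sketch of one assistant core sitting behind adapters for several chat platforms. The names (ChatAdapter, AssistantGateway, and so on) are illustrative assumptions, not Moltbot’s actual API.

```typescript
// Hypothetical sketch of a gateway-style assistant: one core handles every
// platform, and thin adapters translate transport. None of these names come
// from Moltbot's codebase; they only illustrate the pattern.

interface IncomingMessage {
  platform: string; // e.g. "telegram" or "discord"
  chatId: string;
  text: string;
}

interface ChatAdapter {
  platform: string;
  send(chatId: string, text: string): Promise<void>;
}

class AssistantGateway {
  private adapters = new Map<string, ChatAdapter>();

  constructor(private respond: (text: string) => Promise<string>) {}

  register(adapter: ChatAdapter): void {
    this.adapters.set(adapter.platform, adapter);
  }

  // Every platform funnels into the same "brain" and gets its reply back
  // through whichever adapter the message arrived on.
  async handle(msg: IncomingMessage): Promise<void> {
    const adapter = this.adapters.get(msg.platform);
    if (!adapter) throw new Error(`No adapter registered for ${msg.platform}`);
    const reply = await this.respond(msg.text);
    await adapter.send(msg.chatId, reply);
  }
}
```

The point of the pattern is that adding another chat platform means writing one small adapter, not touching the assistant itself.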

There are tradeoffs, and the project is honest about them. This is experimental. It’s not officially supported and could break without notice. The first request can take a minute or two while the container wakes up, which feels long when you’re staring at a browser tab, but it’s manageable. If you’ve ever waited for a home lab to reboot, you’ve seen worse.

One detail I really appreciate is the optional use of Cloudflare R2 storage. Without it, everything resets when the container restarts. With it, your paired devices and conversation history stick around. That persistence matters more than you think, especially if you actually talk to your assistant daily.
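If you’re wondering what the R2 piece looks like in practice, a Worker with an R2 bucket binding can snapshot and restore state with a couple of calls. This is a simplified sketch assuming a binding named MOLTBOT_STATE and a single JSON key; the keys and layout the project actually uses will differ.

```typescript
// Simplified sketch: persisting assistant state in R2 so it survives
// container restarts. The binding name MOLTBOT_STATE and the key layout
// are assumptions for illustration, not the project's real schema.
// Types like R2Bucket come from @cloudflare/workers-types.

interface Env {
  MOLTBOT_STATE: R2Bucket; // declared as an r2_buckets binding in the wrangler config
}

const STATE_KEY = "moltbot/state.json";

export async function saveState(env: Env, state: unknown): Promise<void> {
  await env.MOLTBOT_STATE.put(STATE_KEY, JSON.stringify(state));
}

export async function loadState<T>(env: Env): Promise<T | null> {
  const obj = await env.MOLTBOT_STATE.get(STATE_KEY);
  if (!obj) return null; // first boot, or R2 not configured
  return (await obj.json()) as T;
}
```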

Security is handled thoughtfully too. Device pairing is required by default, and Cloudflare Access can protect the admin UI. It’s the kind of setup that feels a bit fiddly at first, then reassuring once it’s done.
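As a rough idea of what the Access layer adds: Cloudflare Access attaches a Cf-Access-Jwt-Assertion header to authenticated requests, so a Worker can refuse admin traffic that lacks it. The sketch below only checks for the header’s presence and is my own simplification; a real deployment verifies the JWT signature against your Access team’s public keys.

```typescript
// Minimal sketch of gating an admin route behind Cloudflare Access.
// Checking only for the header's presence is NOT full verification; a real
// setup must validate the JWT against the Access team's certificate endpoint.

export function requireAccess(request: Request): Response | null {
  const jwt = request.headers.get("Cf-Access-Jwt-Assertion");
  if (!jwt) {
    return new Response("Forbidden: missing Cloudflare Access token", { status: 403 });
  }
  return null; // caller continues on to the admin UI handler
}
```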

Looking ahead, this kind of deployment hints at where personal AI is going. Less infrastructure stress. More focus on how you actually use the assistant. If you’re curious and comfortable experimenting, this project is worth your time.
