Running Next.js inside ChatGPT: native apps with MCP and the Apps SDK
This post explains how a full Next.js application can run natively inside ChatGPT using the Apps SDK and the Model Context Protocol.
It covers what this enables for distribution, interactivity, and integration with model-driven workflows.
Why this matters to you.
- 📣 Immediate distribution: your web UI can appear inside conversations where it is contextually relevant.
- ⚙️ Full app behavior: client-side navigation, React Server Components, and dynamic routing remain available.
- 🔒 Tooling and standards: MCP provides a structured way for models to discover and use your app without hard-coded API calls.
What ChatGPT apps and MCP enable
ChatGPT apps are interactive components that appear inside a chat when the model decides they are relevant.
MCP is the protocol through which the model discovers your tools and the host fetches the HTML it renders inside an iframe.
The difference from a static iframe is that a native Next.js app preserves navigation and server-rendered components while running within the host environment.
Example prompt flow.
"Find me a hotel in Paris" → Chat model selects your app → app HTML fetched and rendered → user interacts with results inside chat.
How Next.js runs natively in the triple-iframe architecture
The approach adapts a Next.js app to the constraints of the host frame while keeping client routing and server components intact.
Patches handle routing, asset loading, and safe cross-frame communication.
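What those patches look like depends on the host's contract, but the general shape is a small shim around the browser APIs the Next.js router relies on. The snippet below is an illustrative sketch, not the Apps SDK's actual implementation: the `app:navigation` message name and payload are hypothetical, and a real integration would also configure something like `assetPrefix` so static assets resolve against your server rather than the host origin.

```ts
// navigation-bridge.ts — a hedged sketch of cross-frame communication.
// The message type and payload are hypothetical, not an Apps SDK contract.
export function installNavigationBridge() {
  // Only relevant in the browser, and only when embedded in another frame.
  if (typeof window === "undefined" || window.parent === window) return;

  // Wrap history.pushState so client-side route changes made by the
  // Next.js router are also announced to the host frame.
  const originalPushState = history.pushState.bind(history);
  history.pushState = (state, unused, url) => {
    originalPushState(state, unused, url);
    window.parent.postMessage(
      { type: "app:navigation", path: url ? String(url) : location.pathname },
      "*" // a real integration would restrict this to the host's origin
    );
  };
}
```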
Example MCP tool descriptor (simplified).
{ "tool": "hotel-search", "html_url": "https://your-mcp-server/app.html", "schema": { "query": "string" } }
Developer experience and deployment
For Next.js teams this means reuse of existing code and immediate reach into the chat ecosystem.
A starter template can be deployed to a platform that supports MCP endpoints and static or server-rendered pages.
Example action.
Deploy the starter template to your platform and expose an MCP endpoint that ChatGPT can discover and call.
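If your platform runs Next.js route handlers, the MCP endpoint can live alongside the app itself. The hand-rolled handler below is only a sketch of the endpoint's shape (the `app/api/mcp/route.ts` path and the inline hotel response are assumptions); a real deployment would use an MCP server library and implement the full initialize handshake and session handling.

```ts
// app/api/mcp/route.ts — an illustrative sketch of an MCP-style endpoint.
// A production server would use an MCP library and the full protocol handshake.
import { NextResponse } from "next/server";

const tools = [
  {
    name: "hotel-search",
    description: "Search hotels by free-text query",
    inputSchema: {
      type: "object",
      properties: { query: { type: "string" } },
      required: ["query"],
    },
  },
];

export async function POST(req: Request) {
  const rpc = await req.json();

  // Advertise the tools the model can discover.
  if (rpc.method === "tools/list") {
    return NextResponse.json({ jsonrpc: "2.0", id: rpc.id, result: { tools } });
  }

  // Handle an invocation of the hotel-search tool.
  if (rpc.method === "tools/call" && rpc.params?.name === "hotel-search") {
    const query: string = rpc.params.arguments?.query ?? "";
    return NextResponse.json({
      jsonrpc: "2.0",
      id: rpc.id,
      result: { content: [{ type: "text", text: `Results for: ${query}` }] },
    });
  }

  return NextResponse.json({
    jsonrpc: "2.0",
    id: rpc.id ?? null,
    error: { code: -32601, message: "Method not found" },
  });
}
```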
Final thoughts
Running Next.js inside ChatGPT narrows the gap between web apps and conversational interfaces.
The appeal is pragmatic and opportunistic: you get distribution and interactivity without rewriting your app.
For business leaders this can mean faster experiments, clearer ownership of user experience, and direct model-to-UI integrations that reduce friction.
I recently adapted a client dashboard into this pattern and we regained consistent navigation and telemetry inside the chat surface.
Would that approach help your team move features to users faster?


