How to Deploy an MCP Server to Production

Deploy your MCP server as a standard HTTP service with SSE (Server-Sent Events) or Streamable HTTP transport. Any deployment platform that supports persistent processes can host it. The server becomes a remote endpoint that any MCP-compatible AI client can connect to.

The problem

You built an MCP (Model Context Protocol) server that exposes tools for AI assistants like Claude, Cursor, or ChatGPT. It works locally via stdio. But to make it available to anyone, it needs to be deployed as a remote HTTP service with authentication, HTTPS, and persistent uptime.

MCP transports: stdio vs HTTP

MCP servers support two transports. Stdio runs locally on the user's machine, reading from stdin and writing to stdout. HTTP (SSE or Streamable HTTP) runs remotely as a web service. Local stdio is great for development but cannot be shared. Remote HTTP lets anyone connect, making your MCP server available to thousands of AI assistant users.
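The difference shows up in the client-side configuration. Here is a hedged sketch of the two shapes; the field names follow common MCP client configs (such as Claude Desktop's), but the exact schema varies by client, and the paths and URL are placeholders:

```typescript
// Hypothetical client configs for the two MCP transports.

// Local stdio: the client spawns the server as a subprocess and
// exchanges JSON-RPC over stdin/stdout.
const stdioConfig = {
  command: 'node',
  args: ['./build/server.js'],
};

// Remote HTTP/SSE: the client connects to a deployed HTTPS URL,
// passing an API key as a header.
const remoteConfig = {
  url: 'https://your-server.example.com/sse',
  headers: { 'X-API-Key': 'your-key' },
};
```

Only the remote form can be shared: anyone with the URL and a valid key can connect, without installing anything on their machine.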

What makes MCP servers different from regular APIs

MCP servers follow the Model Context Protocol specification. They expose tools, resources, and prompts in a standardized format that AI clients understand. Unlike REST APIs, MCP servers maintain persistent connections via SSE for real-time communication. This means your deployment needs to support long-lived HTTP connections, not just request-response.
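The long-lived connection carries Server-Sent Events, a plain-text wire format. A minimal sketch of what one SSE message looks like (`sseFrame` is a hypothetical helper for illustration, not SDK code):

```typescript
// Sketch of the SSE wire format an MCP server streams over a
// long-lived HTTP response that it deliberately never closes.

function sseFrame(event: string, data: unknown): string {
  // Each SSE message is plain text: an optional "event:" line,
  // a "data:" line, terminated by a blank line.
  return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}

// The server writes frames like this to the open response as
// JSON-RPC messages become available:
const frame = sseFrame('message', { jsonrpc: '2.0', method: 'ping' });
```

Because the response stays open indefinitely, hosting that kills idle or long-running connections (as most serverless platforms do) breaks the transport.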

Authentication for remote MCP servers

Local MCP servers run with the user's permissions. Remote servers need authentication. The standard approach is an API key passed via HTTP header. The MCP client includes the key in the connection config. Your server validates it on each request. Keep it simple: header-based API key auth is what most MCP clients support.
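A minimal sketch of that per-request check, assuming the key arrives in an `X-API-Key` header (Node lowercases incoming header names) and the expected key is loaded from an environment variable at startup:

```typescript
import { timingSafeEqual } from 'node:crypto';

// Hypothetical header check for a remote MCP server. In production,
// expectedKey would come from process.env.MCP_API_KEY.
function isAuthorized(
  headers: Record<string, string | undefined>,
  expectedKey: string
): boolean {
  const presented = headers['x-api-key'];
  if (!presented || !expectedKey) return false;
  const a = Buffer.from(presented);
  const b = Buffer.from(expectedKey);
  // Length guard first: timingSafeEqual throws on unequal lengths.
  // Constant-time comparison avoids leaking the key via timing.
  return a.length === b.length && timingSafeEqual(a, b);
}
```

Run this check before handling any MCP request and return a 401 response on failure; everything else about the server stays unchanged.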

Approaches compared

Platform CLI (CreateOS, Railway, Fly)

Pros

  • One-command deploy
  • Persistent process (supports SSE)
  • Environment variables for API keys
  • HTTPS included
  • Custom domains

Cons

  • Platform dependency

Best for: MCP servers that need to be publicly accessible with minimal setup

Cloudflare Workers

Pros

  • Edge deployment (low latency)
  • Native MCP SDK support
  • Generous free tier

Cons

  • Worker runtime limitations
  • No persistent connections in standard Workers
  • Durable Objects needed for state

Best for: Stateless MCP tools that do not need long-running connections

Docker on a VPS

Pros

  • Full control
  • Persistent connections work naturally
  • No platform dependency

Cons

  • Manual SSL, domain, and monitoring setup
  • Server maintenance

Best for: Teams with DevOps capability who need full control

Deploy an MCP server with CreateOS CLI

Here is how to do it, step by step, with the CreateOS CLI.

1

Install the CLI

$ brew install createos

Single binary for macOS and Linux.

2

Create your MCP server

// Example: Node.js MCP server with SSE transport
import { McpServer } from '@modelcontextprotocol/sdk/server/mcp.js';
import { z } from 'zod';

const server = new McpServer({ name: 'my-tools', version: '1.0.0' });

server.tool('hello', { name: z.string() }, async ({ name }) => {
  return { content: [{ type: 'text', text: `Hello ${name}` }] };
});

// Start an HTTP server with SSE transport on PORT

Build your MCP server using the official SDK. Expose it via HTTP with SSE transport.

3

Deploy

$ createos login && createos init && createos deploy

The CLI deploys your server and gives you a live HTTPS URL. This URL is what MCP clients will connect to.

4

Set environment variables

$ createos env set MCP_API_KEY=your-secret-key

Store authentication keys and any API keys your tools need (OpenAI, database URLs, etc.).
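It is worth failing fast at startup if a required variable is missing, rather than discovering it on the first request. A small sketch (`requireEnv` is a hypothetical helper, not part of any SDK):

```typescript
// Hypothetical startup check: read a required environment variable
// or fail immediately with a clear error.
function requireEnv(
  name: string,
  env: Record<string, string | undefined> = process.env
): string {
  const value = env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// At server startup:
// const apiKey = requireEnv('MCP_API_KEY');
```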

5

Connect from Claude or Cursor

$ # Claude Code:
$ claude mcp add my-tools --transport http https://your-server.nodeops.app \
    --header "X-API-Key: your-key"

Add your deployed MCP server to any compatible AI client using the live URL.

Frequently asked questions

Can I deploy an MCP server for free?
Most platforms offer free tiers sufficient for low-traffic MCP servers. CreateOS, Railway, and Render all have free tiers. Cloudflare Workers offers 100K free requests per day. For personal or development use, free tiers are usually enough.
Do MCP servers need persistent connections?
MCP over SSE uses long-lived HTTP connections for real-time communication. Your hosting needs to support this. Standard serverless functions (Lambda, Vercel Functions) time out after 30-60 seconds and are not suitable. Use a platform that supports persistent processes.
How do I secure my MCP server?
Use API key authentication via HTTP headers. Store the key as an environment variable on your server. MCP clients pass the key in the connection config. Always use HTTPS (which deployment platforms provide automatically).
Which AI clients can connect to a deployed MCP server?
Any MCP-compatible client: Claude Desktop, Claude Code, Cursor, Windsurf, VS Code with Copilot, ChatGPT (via plugins), and many more. The MCP protocol is an open standard supported by 40+ AI tools.

Try it yourself

$ brew install createos

