Large language models (LLMs) are revolutionizing the way we build software. But as the scope of intelligent applications grows, so does the need for structured, contextual communication between LLMs and real-world data, services, and business logic.
This is where the Model Context Protocol (MCP) comes in — a lightweight but powerful standard for exposing structured context and functional APIs to LLMs. Think of it as the REST for AI-native applications.
In this article, you’ll learn how to:
- Understand what MCP is and why it matters
- Build a custom MCP Server in TypeScript
- Host it in a Docker container
- Deploy it to Microsoft Azure using the Azure Developer CLI (azd)
- Extend the server with your own tools and data
We’ll use the excellent powergentic/azd-mcp-ts open-source template from Powergentic.ai as our base — a production-friendly scaffold for building MCP-compatible services.
Whether you’re building internal tools, AI copilots, or advanced chat workflows — this article will help you build the bridge between your data and the model.
What Is the Model Context Protocol (MCP)?
The Model Context Protocol (MCP) is an open, standardized way for large language models (LLMs) to interact with structured data, tools, and workflows provided by an external server. Instead of throwing everything into a massive text prompt, MCP gives LLMs well-defined interfaces — making it easier to build more powerful, predictable, and maintainable AI-driven applications.
MCP is used by clients like Claude Desktop, in-app LLM agents, and even automated orchestration systems. It enables composability between tools, safe access to structured data, and stronger interaction patterns between LLMs and apps.
In simple terms, MCP is like an API for LLMs, but purpose-built for their unique needs.
Why Was MCP Created?
Traditional LLM prompting relies on stuffing context (like documents, data, and instructions) into a single, unstructured input. As your app grows, this becomes brittle and hard to scale.
MCP solves this by:
- Separating content from control – your app manages what data/tools are exposed, and the LLM simply consumes them
- Encouraging composability – you can build reusable interfaces to structured information and actions
- Improving safety and auditability – every tool call and resource read is trackable
This model is already influencing advanced LLM platforms like Claude Desktop, multi-agent frameworks, and autonomous agents.
MCP Core Primitives
At the heart of MCP are three primitive building blocks that your server can implement:
| 🔹 Primitive | 📋 Description | 🧑‍💻 Analogy |
|---|---|---|
| Resource | Read-only, contextual data the model can query. Think files, config, schemas, user profiles, etc. | GET endpoint (REST) |
| Tool | Actionable functions that the model can invoke. They may trigger side effects, compute values, or call external services. | POST endpoint (REST) |
| Prompt | Reusable message templates that shape how the LLM responds. These can be invoked by the user or triggered programmatically. | Slash command / macro |
These primitives are designed to support LLM-native use cases, where understanding, decision-making, and interaction are central to the app’s functionality.
How MCP Works at Runtime
Here’s what a typical MCP interaction looks like:
- A client (like Claude Desktop or a custom frontend) connects to your MCP server via Server-Sent Events (SSE).
- The server advertises what resources, tools, and prompts it supports.
- The LLM (or client) requests a resource like `greeting://alice`, or calls a tool like `tools://calculate-bmi`.
- Your server returns the requested data or executes the tool, streaming the response back to the client.
This approach gives you:
- Real-time communication via SSE
- Declarative descriptions of what your server offers
- A clear separation of roles between server logic and LLM usage
Benefits of Using MCP
Here are a few of the benefits of using Model Context Protocol (MCP) servers with your LLM / AI Agent solution:
- ✅ Better control over what data and actions an LLM can access
- 🧱 Modular server design using tools, resources, and prompts
- 🛡️ Safer and more auditable than arbitrary code generation
- 🔄 Composable across clients — use the same MCP server in Claude Desktop, your internal chatbots, or custom LLM agents
- 🌐 Language-agnostic — servers can be written in Python, TypeScript, or any language with an SDK
Think of it like building an API for your LLM — but far more tailored to how language models consume information and execute tasks.
Why Use TypeScript to Build an MCP Server?
TypeScript is a natural choice for building MCP servers. Here’s why:
- ✅ Strong typing makes your tool/resource definitions safer and easier to maintain.
- ⚡ Fast iteration with familiar tooling (Node.js, npm, etc.)
- 🧩 The official @modelcontextprotocol/sdk is built for modern TypeScript workflows.
And with the powergentic/azd-mcp-ts template, you get a ready-to-run project scaffold that uses:
- Express.js for HTTP + Server-Sent Events (SSE)
- Docker for consistent builds
- azd for seamless Azure deployment
At the time of writing, the Model Context Protocol (MCP) and its SDKs include more examples and better documentation around using TypeScript and Node.js for building MCP servers. You can also check out the modelcontextprotocol/servers project on GitHub for a ton of great MCP server examples.
Inside the powergentic/azd-mcp-ts Template
Let’s take a look at the powergentic/azd-mcp-ts project structure you can use as the base foundation for building your own MCP servers using TypeScript, Docker, and Azure Container Apps (ACA):
```
azd-mcp-ts/
├── src/
│   └── mcpserver/
│       └── server.ts    # Main MCP server definition
├── infra/               # Infrastructure-as-code for Azure
├── Dockerfile           # Docker image for local + cloud use
├── azure.yaml           # azd metadata config
└── package.json
```
The heart of the server lives in `src/mcpserver/server.ts`, which uses the MCP SDK to expose a resource and wire up the transport layer.
Here’s a very simplified version of the server code (see the project for the full code):
```typescript
import { McpServer, ResourceTemplate } from "@modelcontextprotocol/sdk/server/mcp.js";
import { SSEServerTransport } from "@modelcontextprotocol/sdk/server/sse.js";
import express from "express";

const server = new McpServer({ name: "Demo", version: "1.0.0" });

server.resource(
  "greeting",
  new ResourceTemplate("greeting://{name}", { list: undefined }),
  async (uri, { name }) => ({
    contents: [{ uri: uri.href, text: `Hello, ${name}!` }]
  })
);

const app = express();

app.get("/sse", async (req, res) => {
  const transport = new SSEServerTransport("/messages", res);
  await server.connect(transport);
});
```
This defines a dynamic MCP resource that returns custom greetings when the LLM queries `greeting://your-name`.
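To make the `greeting://{name}` template concrete, here is a plain-TypeScript illustration of the matching idea (a hypothetical helper, not the SDK’s actual `ResourceTemplate` implementation):

```typescript
// Hypothetical helper: extract the template parameter from a resource URI.
// Illustrates the idea behind ResourceTemplate; the real SDK does this for you.
function matchGreeting(uri: string): { name: string } | null {
  const match = /^greeting:\/\/([^/]+)$/.exec(uri);
  return match ? { name: match[1] } : null;
}

// The handler logic from the server above, in isolation:
function greet(uri: string): string | null {
  const params = matchGreeting(uri);
  return params ? `Hello, ${params.name}!` : null;
}

console.log(greet("greeting://alice")); // → "Hello, alice!"
```

The point is simply that a templated resource URI resolves to named parameters that are passed into your handler.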
What Is SSE (Server-Sent Events) and Why Does MCP Use It?
One of the most important technical underpinnings of the Model Context Protocol is its use of Server-Sent Events (SSE) for real-time, event-driven communication between your MCP server and clients (like Claude Desktop or an LLM app).
Let’s break down what SSE is, how it compares to alternatives, and why it’s such a natural fit for MCP.
What Are Server-Sent Events (SSE)?
SSE is a web technology that allows a server to push updates to a client over a single, long-lived HTTP connection. It’s part of the HTML5 standard and works over plain HTTP/1.1, making it widely supported and easy to implement.
Key properties:
- One-way communication: server → client
- Streamed as `text/event-stream`
- Reconnection and heartbeat built in
- Works great for streaming logs, updates, or – in the case of MCP – model responses
In contrast, traditional HTTP is request-response. SSE lets the server proactively send new information as it becomes available.
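To see what this looks like at the HTTP level, here’s a minimal SSE endpoint sketch using only Node’s built-in `http` module (an illustration of the transport, not the MCP SDK’s `SSEServerTransport`):

```typescript
import http from "node:http";

// Format one SSE frame: an event name line, a data line, then a blank line.
function sseFrame(event: string, data: string): string {
  return `event: ${event}\ndata: ${data}\n\n`;
}

// Minimal SSE endpoint: set the text/event-stream content type,
// keep the response open, and write frames as events occur.
const server = http.createServer((req, res) => {
  if (req.url === "/sse") {
    res.writeHead(200, {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    });
    res.write(sseFrame("resourceUpdate", JSON.stringify({ text: "Hello, Alice!" })));
    // ...later writes push more events over the same open connection.
  } else {
    res.writeHead(404);
    res.end();
  }
});

// Start with: server.listen(3000)
```

Running it and hitting `/sse` with `curl -N` would show the stream of `event:`/`data:` frames arriving as they are written.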
1. The Client Connects

   The client (e.g. Claude Desktop) opens a persistent connection to the server’s `/sse` endpoint using HTTP and starts listening for events.

   ```
   GET /sse HTTP/1.1
   Accept: text/event-stream
   ```

   Your MCP server responds and begins streaming messages:

   ```
   Content-Type: text/event-stream

   event: resourceUpdate
   data: {"uri": "greeting://alice", "text": "Hello, Alice!"}

   event: toolResponse
   data: {"tool": "calculate-bmi", "result": "22.4"}
   ```

2. Server Streams Responses

   As your server receives and handles requests — like reading a resource or executing a tool — it streams back events over the open SSE channel. These events are structured using the MCP message protocol.

   That can include:

   - Results from a tool invocation
   - Errors or status updates
   - Output from a long-running task
   - Progress updates during file processing
   - Content from a streaming model output

3. Client Sends Requests Separately

   The client sends requests (like calling a tool or reading a resource) via a separate `/messages` endpoint, typically using HTTP POST.

   This separation of concerns — read (SSE) vs write (POST) — helps keep the protocol simple and reliable. It’s also well-suited for environments like Azure Container Apps, which support HTTP-based communication out of the box.
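On the client side, the `text/event-stream` body shown above is just line-oriented text. A hypothetical parser in plain TypeScript (standing in for what `EventSource` does for you in the browser) might look like:

```typescript
interface SseEvent {
  event: string;
  data: string;
}

// Parse a text/event-stream buffer into events.
// Frames are separated by a blank line; fields are "event:" and "data:".
function parseSseStream(stream: string): SseEvent[] {
  return stream
    .split("\n\n")
    .filter((frame) => frame.trim().length > 0)
    .map((frame) => {
      let event = "message"; // SSE default when no event name is given
      const data: string[] = [];
      for (const line of frame.split("\n")) {
        if (line.startsWith("event:")) event = line.slice(6).trim();
        else if (line.startsWith("data:")) data.push(line.slice(5).trim());
      }
      return { event, data: data.join("\n") };
    });
}

const body =
  'event: resourceUpdate\ndata: {"uri": "greeting://alice"}\n\n' +
  'event: toolResponse\ndata: {"tool": "calculate-bmi"}\n\n';

console.log(parseSseStream(body).map((e) => e.event)); // → [ 'resourceUpdate', 'toolResponse' ]
```

In practice, `EventSource` (and MCP client libraries) handle this parsing plus reconnection for you; the sketch just shows how little machinery the wire format needs.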
Why SSE instead of WebSockets?
At first glance, you might wonder why the Model Context Protocol doesn’t use WebSockets, which are a more common choice for real-time communication in modern web apps. After all, WebSockets offer full-duplex (two-way) messaging and are popular in chat apps, multiplayer games, and collaborative tools.
But MCP has a different set of priorities — simplicity, compatibility, and reliability in cloud-native environments. For the kinds of interactions that LLMs require, Server-Sent Events (SSE) offers a better balance of performance and practicality.
Here’s a closer comparison between SSE and WebSockets:
| Feature | SSE | WebSocket |
|---|---|---|
| Protocol | HTTP/1.1 (`text/event-stream`) | Custom TCP protocol |
| Direction | One-way (server → client) | Two-way |
| Complexity | Simple | More complex to manage |
| HTTP-compatible | ✅ Yes | ❌ Requires upgrade |
| Cloud support | ✅ Yes | ❌ Not always supported |
| Ideal for… | Real-time updates, streaming | Games, chat apps, full-duplex scenarios |
MCP’s use case is mostly server-push – streaming data and updates to LLM clients. So SSE is simpler, more compatible, and gets the job done.
Containerizing Your MCP Server with Docker
Before deploying to the cloud, we need a consistent runtime environment — enter Docker.
The powergentic/azd-mcp-ts template includes a preconfigured `Dockerfile` that packages the MCP server into a lean container. Here’s what it does at a high level:
```dockerfile
# 1. Use a minimal Node.js base image
FROM node:20-slim

# 2. Set working directory
WORKDIR /app

# 3. Copy dependencies and install
COPY package*.json ./
RUN npm install --production

# 4. Copy source code
COPY . .

# 5. Expose port and run the server
EXPOSE 3000
CMD ["npm", "start"]
```
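A common companion to a Dockerfile like this (not shown in the template listing above, so treat these contents as hypothetical) is a `.dockerignore` file that keeps local artifacts out of the build context, making builds faster and images smaller:

```
node_modules
npm-debug.log
dist
.git
.env
```

Without it, `COPY . .` would pull your local `node_modules` into the image on top of the freshly installed production dependencies.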
Here are the `docker` commands to build and test the container locally:

```shell
docker build -t mcp-server .
docker run -p 3000:3000 mcp-server
```
Once built, this image can be:
- ✅ Run locally for development
- 🚀 Deployed to a Docker host, like Azure Container Apps or Kubernetes
Deploying the MCP Server to Azure with the Azure Developer CLI (azd)
The Azure Developer CLI (`azd`) is a modern developer experience that simplifies deploying full-stack apps to Azure. It uses convention over configuration and supports Infra-as-Code out of the box.
With the powergentic/azd-mcp-ts Azure Developer CLI template, you can get started and deploy your own custom Model Context Protocol (MCP) server with just a few commands:
1. Create a new folder for your `azd` project:

   ```shell
   mkdir mcp-server
   cd mcp-server
   ```

2. Initialize the project from the template:

   ```shell
   azd init --template powergentic/azd-mcp-ts
   ```

3. Log in to Azure:

   ```shell
   azd auth login
   ```

4. Deploy the MCP Server:

   ```shell
   azd up
   ```
This step does the following:
- Builds the Docker image
- Provisions Azure Container Registry, Azure Container Apps, Log Analytics, and a managed environment
- Deploys your MCP server behind a public HTTPS endpoint
🎉 Your MCP Server is live and ready to connect with clients like Claude Desktop or a custom MCP client.
Extending the MCP Server with Tools and Prompts
With your server deployed, it’s time to make it your own.
Let’s add a new tool that calculates BMI (Body Mass Index):
Edit `src/mcpserver/server.ts`:
```typescript
import { z } from "zod";

// Add this after your greeting resource
server.tool(
  "calculate-bmi",
  {
    weightKg: z.number(),
    heightM: z.number()
  },
  async ({ weightKg, heightM }) => ({
    content: [{
      type: "text",
      text: `Your BMI is ${(weightKg / (heightM * heightM)).toFixed(2)}`
    }]
  })
);
```
This exposes a tool that can be called by an LLM or agent when integrated with your MCP server via: `tools://calculate-bmi`
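Since the handler body is plain TypeScript, the tool’s core computation can be factored out and sanity-checked in isolation (an illustrative refactor, not part of the template):

```typescript
// The same computation the calculate-bmi tool performs, in isolation.
function calculateBmi(weightKg: number, heightM: number): string {
  return `Your BMI is ${(weightKg / (heightM * heightM)).toFixed(2)}`;
}

console.log(calculateBmi(70, 1.75)); // → "Your BMI is 22.86"
```

Keeping tool logic in small pure functions like this makes it easy to unit-test without spinning up the MCP server or an SSE connection.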
Wrapping Up: The Power of MCP + TypeScript + Azure
In this article, you learned how to:
✅ Understand what Model Context Protocol (MCP) is and why it matters
✅ Use TypeScript and the official SDK to define MCP resources and tools
✅ Build a lightweight MCP Server with real-time communication via SSE
✅ Package it into a Docker container
✅ Deploy it to Azure using the Azure Developer CLI (azd)
✅ Customize the server with your own logic, endpoints, and context
This gives you the foundation for building your own production-grade Model Context Protocol (MCP) server to extend your AI Agent with more functionality!
Original Article Source: How to Build and Deploy an MCP Server with TypeScript and Azure Developer CLI (azd) using Azure Container Apps and Docker written by Chris Pietschmann (If you’re reading this somewhere other than Build5Nines.com, it was republished without permission.)