ModelContextProtocol 0.1.0-preview.1.25171.12
dotnet add package ModelContextProtocol --version 0.1.0-preview.1.25171.12
NuGet\Install-Package ModelContextProtocol -Version 0.1.0-preview.1.25171.12
<PackageReference Include="ModelContextProtocol" Version="0.1.0-preview.1.25171.12" />
<PackageVersion Include="ModelContextProtocol" Version="0.1.0-preview.1.25171.12" />
<PackageReference Include="ModelContextProtocol" />
paket add ModelContextProtocol --version 0.1.0-preview.1.25171.12
#r "nuget: ModelContextProtocol, 0.1.0-preview.1.25171.12"
#:package ModelContextProtocol@0.1.0-preview.1.25171.12
#addin nuget:?package=ModelContextProtocol&version=0.1.0-preview.1.25171.12&prerelease
#tool nuget:?package=ModelContextProtocol&version=0.1.0-preview.1.25171.12&prerelease
MCP C# SDK
The official C# SDK for the Model Context Protocol, enabling .NET applications to connect to and interact with MCP clients and servers.
This is a preview release. Breaking changes can be introduced without prior notice.
About MCP
The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to Large Language Models (LLMs). It enables secure integration between LLMs and various data sources and tools.
For more information about MCP, see the official documentation at https://modelcontextprotocol.io and the protocol specification at https://spec.modelcontextprotocol.io.
Getting Started (Client)
First install the package (see the commands above), then create a client and start using tools, or other capabilities, from the servers you configure:
McpClientOptions options = new()
{
    ClientInfo = new() { Name = "TestClient", Version = "1.0.0" }
};

McpServerConfig config = new()
{
    Id = "everything",
    Name = "Everything",
    TransportType = TransportTypes.StdIo,
    TransportOptions = new()
    {
        ["command"] = "npx",
        ["arguments"] = "-y @modelcontextprotocol/server-everything",
    }
};

var client = await McpClientFactory.CreateAsync(config, options);

// Print the list of tools available from the server.
await foreach (var tool in client.ListToolsAsync())
{
    Console.WriteLine($"{tool.Name} ({tool.Description})");
}

// Execute a tool (this would normally be driven by LLM tool invocations).
var result = await client.CallToolAsync(
    "echo",
    new() { ["message"] = "Hello MCP!" },
    CancellationToken.None);

// echo always returns one and only one text content object.
Console.WriteLine(result.Content.First(c => c.Type == "text").Text);
Note that you should pass CancellationToken objects suited to your use case so that operations can be cancelled and time out properly. This example also does not paginate the tools list, which may be necessary for large tool sets; see the IntegrationTests project for an example of pagination, as well as examples of how to handle Prompts and Resources.
It is also highly recommended that you pass a proper LoggerFactory instance to the factory method, to enable logging of MCP client operations.
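For example, a per-call timeout can be applied with a CancellationTokenSource. The sketch below reuses the echo call from above; the commented-out line assumes CreateAsync accepts an optional loggerFactory argument, as the note above suggests, so treat that parameter name as an assumption.

// A minimal sketch: cancel the tool call if it takes longer than 30 seconds.
using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));

var result = await client.CallToolAsync(
    "echo",
    new() { ["message"] = "Hello MCP!" },
    cts.Token);

// Assumed: an ILoggerFactory can be supplied when creating the client, per the note above.
// var client = await McpClientFactory.CreateAsync(config, options, loggerFactory: loggerFactory);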
You can find samples demonstrating how to use ModelContextProtocol with an LLM SDK in the samples directory, and also refer to the tests project for more examples.
Additional examples and documentation will be added in the near future.
Remember you can connect to any MCP server, not just ones created using ModelContextProtocol. The protocol is designed to be server-agnostic, so you can use this library to connect to any compliant server.
Tools can easily be exposed as AIFunction instances so that they are immediately usable with any IChatClient.
// Get available functions.
IList<AIFunction> tools = await client.GetAIFunctionsAsync();

// Call the chat client using the tools.
IChatClient chatClient = ...;
var response = await chatClient.GetResponseAsync(
    "your prompt here",
    new()
    {
        Tools = [.. tools],
    });
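If the underlying chat client does not invoke functions on its own, the function-invocation middleware from Microsoft.Extensions.AI (already a dependency of this package) can be layered on top. A minimal sketch, where underlyingClient is a placeholder for whatever concrete IChatClient implementation you use:

// A sketch: wrap an existing IChatClient so that tool calls requested by the model
// are automatically dispatched to the MCP tools retrieved above.
IChatClient underlyingClient = ...; // placeholder for your concrete chat client
IChatClient chatClient = new ChatClientBuilder(underlyingClient)
    .UseFunctionInvocation()
    .Build();

var response = await chatClient.GetResponseAsync(
    "your prompt here",
    new() { Tools = [.. tools] });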
Getting Started (Server)
Here is an example of how to create an MCP server and register all tools from the current application. It includes a simple echo tool as an example (the tool is included in the same file here for ease of copy and paste, but it needn't be in the same file): the employed overload of WithTools examines the current assembly for classes with the McpToolType attribute and registers all of their methods marked with the McpTool attribute as tools.
using ModelContextProtocol;
using ModelContextProtocol.Server;
using Microsoft.Extensions.Hosting;
using System.ComponentModel;

var builder = Host.CreateEmptyApplicationBuilder(settings: null);
builder.Services
    .AddMcpServer()
    .WithStdioServerTransport()
    .WithTools();
await builder.Build().RunAsync();

[McpToolType]
public static class EchoTool
{
    [McpTool, Description("Echoes the message back to the client.")]
    public static string Echo(string message) => $"hello {message}";
}
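As a further illustration (a hypothetical tool, not part of the SDK), parameters can carry their own Description attributes, which are expected to be surfaced in the generated tool schema:

[McpToolType]
public static class WeatherTool
{
    // A sketch: a hypothetical tool whose parameters are documented with [Description].
    [McpTool, Description("Returns a canned weather report for a city.")]
    public static string GetWeather(
        [Description("The city to report on.")] string city,
        [Description("Temperature unit, e.g. 'celsius' or 'fahrenheit'.")] string unit = "celsius")
        => $"The weather in {city} is mild and sunny ({unit}).";
}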
Finer-grained control is also available for configuring the server and how it handles client requests. For example:
using ModelContextProtocol.Protocol.Transport;
using ModelContextProtocol.Protocol.Types;
using ModelContextProtocol.Server;
using Microsoft.Extensions.Logging.Abstractions;

McpServerOptions options = new()
{
    ServerInfo = new() { Name = "MyServer", Version = "1.0.0" },
    Capabilities = new()
    {
        Tools = new()
        {
            ListToolsHandler = async (request, cancellationToken) =>
            {
                return new ListToolsResult()
                {
                    Tools =
                    [
                        new Tool()
                        {
                            Name = "echo",
                            Description = "Echoes the input back to the client.",
                            InputSchema = new JsonSchema()
                            {
                                Type = "object",
                                Properties = new Dictionary<string, JsonSchemaProperty>()
                                {
                                    ["message"] = new JsonSchemaProperty() { Type = "string", Description = "The input to echo back." }
                                }
                            },
                        }
                    ]
                };
            },

            CallToolHandler = async (request, cancellationToken) =>
            {
                if (request.Params?.Name == "echo")
                {
                    if (request.Params.Arguments?.TryGetValue("message", out var message) is not true)
                    {
                        throw new McpServerException("Missing required argument 'message'");
                    }

                    return new CallToolResponse()
                    {
                        Content = [new Content() { Text = $"Echo: {message}", Type = "text" }]
                    };
                }

                throw new McpServerException($"Unknown tool: '{request.Params?.Name}'");
            },
        }
    },
};

await using IMcpServer server = McpServerFactory.Create(new StdioServerTransport("MyServer"), options);
await server.StartAsync();

// Run until the process is stopped by the client (parent process).
await Task.Delay(Timeout.Infinite);
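To exercise this server, a client can connect over stdio using the same client API shown earlier. A sketch, assuming the server project above is launched via dotnet run (the Id, Name, and project path below are hypothetical):

// A sketch: launch the server above as a child process and call its echo tool.
McpServerConfig serverConfig = new()
{
    Id = "myserver",
    Name = "MyServer",
    TransportType = TransportTypes.StdIo,
    TransportOptions = new()
    {
        ["command"] = "dotnet",
        ["arguments"] = "run --project ./MyServer", // hypothetical project path
    }
};

var client = await McpClientFactory.CreateAsync(
    serverConfig,
    new McpClientOptions { ClientInfo = new() { Name = "TestClient", Version = "1.0.0" } });

var result = await client.CallToolAsync(
    "echo",
    new() { ["message"] = "Hello MCP!" },
    CancellationToken.None);

Console.WriteLine(result.Content.First(c => c.Type == "text").Text);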
License
This project is licensed under the MIT License.
Product | Versions (compatible and additional computed target framework versions) |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies
.NETStandard 2.0
- Microsoft.Bcl.Memory (>= 9.0.3)
- Microsoft.Extensions.AI (>= 9.3.0-preview.1.25161.3)
- Microsoft.Extensions.AI.Abstractions (>= 9.3.0-preview.1.25161.3)
- Microsoft.Extensions.Hosting.Abstractions (>= 9.0.3)
- Microsoft.Extensions.Logging.Abstractions (>= 9.0.3)
- System.Net.ServerSentEvents (>= 10.0.0-preview.2.25163.2)
- System.Text.Json (>= 9.0.3)
- System.Threading.Channels (>= 9.0.3)
net8.0
- Microsoft.Extensions.AI (>= 9.3.0-preview.1.25161.3)
- Microsoft.Extensions.AI.Abstractions (>= 9.3.0-preview.1.25161.3)
- Microsoft.Extensions.Hosting.Abstractions (>= 9.0.3)
- Microsoft.Extensions.Logging.Abstractions (>= 9.0.3)
- System.Net.ServerSentEvents (>= 10.0.0-preview.2.25163.2)
NuGet packages (50)
Showing the top 5 NuGet packages that depend on ModelContextProtocol:
- BotSharp.Core: Open source LLM application framework to build scalable, flexible, and robust AI systems.
- ModelContextProtocol.AspNetCore: ASP.NET Core extensions for the C# Model Context Protocol (MCP) SDK.
- Senparc.Weixin.AspNet: WeChat SDK - the Senparc.Weixin.AspNet module. Senparc.Weixin SDK open-source project: https://github.com/JeffreySu/WeiXinMPSDK
- PayTech.RestHost: Easy REST API development library.
- Senparc.Ncf.XncfBase: Senparc.Ncf.XncfBase
GitHub repositories (30)
Showing the top 20 popular GitHub repositories that depend on ModelContextProtocol:
- microsoft/semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps.
- JeffreySu/WeiXinMPSDK: WeChat full-platform .NET SDK (Senparc.Weixin for C#), supporting .NET Framework, .NET Core, and .NET 8.0. Covers WeChat Official Accounts, Mini Programs, Mini Games, WeChat Pay, Enterprise WeChat/Work, the Open Platform, JSSDK, WeChat hardware, and more. WeChat SDK for C#.
- dotnet/aspire: Tools, templates, and packages to accelerate building observable, production-ready apps.
- dotnet/extensions: This repository contains a suite of libraries that provide facilities commonly needed when creating production-ready applications.
- SciSharp/BotSharp: AI Multi-Agent Framework in .NET.
- microsoft/Generative-AI-for-beginners-dotnet: Five lessons, learn how to really apply AI to your .NET applications.
- microsoft/mcp: Catalog of official Microsoft MCP (Model Context Protocol) server implementations for AI-powered data access and tool integration.
- IoTSharp/IoTSharp: IoTSharp is an open-source IoT platform for data collection, processing, visualization, and device management.
- Azure/azure-mcp: The Azure MCP Server, bringing the power of Azure to your agents.
- awaescher/OllamaSharp: The easiest way to use Ollama in .NET.
- iioter/iotgateway: An industrial IoT gateway with B/S architecture that enables bidirectional communication between industrial devices (southbound connections) and IoT platforms (northbound connections). It supports numerous industrial protocols and can connect to various IoT cloud platforms.
- tryAGI/LangChain: C# implementation of LangChain. We try to be as close to the original as possible in terms of abstractions, but are open to new entities.
- getcellm/cellm: Use LLMs in Excel formulas.
- mixcore/mix.core: 🚀 A future-proof enterprise web CMS supporting both headless and decoupled approaches. Build any type of app with customizable APIs on ASP.NET Core/.NET Core. Completely open-source and designed for flexibility.
- CommunityToolkit/Aspire: A community project with additional components and extensions for .NET Aspire.
- IvanMurzak/Unity-MCP: MCP server + plugin for the Unity Editor and Unity games. The plugin allows connecting to MCP clients such as Claude Desktop and others.
- CervantesSec/cervantes: Cervantes is an open-source, collaborative platform designed specifically for pentesters and red teams. It serves as a comprehensive management tool, streamlining the organization of projects, clients, vulnerabilities, and reports in a single, centralized location.
- Senparc/Senparc.CO2NET: Base common library, supporting .NET Framework and .NET Core.
- axzxs2001/Asp.NetCoreExperiment: All earlier projects have been moved to the OleVersion directory for preservation. New samples are based mainly on .NET 5.0; some upgrade earlier samples, and some distill previous working experience for reference.
- lofcz/LlmTornado: The .NET library to build AI systems with 100+ LLM APIs: Anthropic, Azure, Cohere, DeepInfra, DeepSeek, Google, Groq, Mistral, Ollama, OpenAI, OpenRouter, Perplexity, vLLM, Voyage, xAI, and many more!
Version | Downloads | Last Updated |
---|---|---|
0.3.0-preview.4 | 81,309 | 8/20/2025 |
0.3.0-preview.3 | 141,526 | 7/16/2025 |
0.3.0-preview.2 | 69,789 | 7/3/2025 |
0.3.0-preview.1 | 81,358 | 6/20/2025 |
0.2.0-preview.3 | 92,530 | 6/3/2025 |
0.2.0-preview.2 | 30,678 | 5/29/2025 |
0.2.0-preview.1 | 70,244 | 5/16/2025 |
0.1.0-preview.14 | 11,733 | 5/15/2025 |
0.1.0-preview.13 | 20,416 | 5/10/2025 |
0.1.0-preview.12 | 14,523 | 5/5/2025 |
0.1.0-preview.11 | 51,464 | 4/24/2025 |
0.1.0-preview.10 | 31,046 | 4/19/2025 |
0.1.0-preview.9 | 20,426 | 4/15/2025 |
0.1.0-preview.8 | 13,592 | 4/11/2025 |
0.1.0-preview.7 | 6,948 | 4/9/2025 |
0.1.0-preview.6 | 9,392 | 4/4/2025 |
0.1.0-preview.5 | 2,170 | 4/3/2025 |
0.1.0-preview.4 | 9,970 | 3/31/2025 |
0.1.0-preview.3 | 296 | 3/31/2025 |
0.1.0-preview.2 | 6,106 | 3/27/2025 |