Encamina.Enmarcha.SemanticKernel.Plugins.Chat
8.1.8-preview-01
See the version list below for details.
dotnet add package Encamina.Enmarcha.SemanticKernel.Plugins.Chat --version 8.1.8-preview-01
NuGet\Install-Package Encamina.Enmarcha.SemanticKernel.Plugins.Chat -Version 8.1.8-preview-01
<PackageReference Include="Encamina.Enmarcha.SemanticKernel.Plugins.Chat" Version="8.1.8-preview-01" />
paket add Encamina.Enmarcha.SemanticKernel.Plugins.Chat --version 8.1.8-preview-01
#r "nuget: Encamina.Enmarcha.SemanticKernel.Plugins.Chat, 8.1.8-preview-01"
// Install Encamina.Enmarcha.SemanticKernel.Plugins.Chat as a Cake Addin
#addin nuget:?package=Encamina.Enmarcha.SemanticKernel.Plugins.Chat&version=8.1.8-preview-01&prerelease
// Install Encamina.Enmarcha.SemanticKernel.Plugins.Chat as a Cake Tool
#tool nuget:?package=Encamina.Enmarcha.SemanticKernel.Plugins.Chat&version=8.1.8-preview-01&prerelease
Semantic Kernel - Chat Plugin
Chat Plugin is a project that provides Chat functionality in the form of a Semantic Kernel Plugin. It allows users to chat with and ask questions to an Artificial Intelligence, usually a Large Language Model (LLM). Additionally, it stores the conversation history.
Setup
NuGet package
First, install NuGet. Then, install Encamina.Enmarcha.SemanticKernel.Plugins.Chat from the package manager console:
PM> Install-Package Encamina.Enmarcha.SemanticKernel.Plugins.Chat
.NET CLI:
First, install .NET CLI. Then, install Encamina.Enmarcha.SemanticKernel.Plugins.Chat from the .NET CLI:
dotnet add package Encamina.Enmarcha.SemanticKernel.Plugins.Chat
How to use
To use ChatWithHistoryPlugin, the usual approach is to import it as a plugin within Semantic Kernel. The simplest way to do this is the extension method ImportChatWithHistoryPlugin, which handles importing the plugin into Semantic Kernel. However, some configuration is required before importing it.
First, you need to add the SemanticKernelOptions, ChatWithHistoryPluginOptions and ChatHistoryProviderOptions to your project configuration. You can achieve this by using any configuration provider. The following code is an example of how the settings should look using the appsettings.json file:
{
  // ...
  "SemanticKernelOptions": {
    "ChatModelName": "gpt-35-turbo", // Name (sort of a unique identifier) of the model to use for chat
    "ChatModelDeploymentName": "gpt-35-turbo", // Model deployment name on the LLM (for example OpenAI) to use for chat
    "EmbeddingsModelName": "text-embedding-ada-002", // Name (sort of a unique identifier) of the model to use for embeddings
    "EmbeddingsModelDeploymentName": "text-embedding-ada-002", // Model deployment name on the LLM (for example OpenAI) to use for embeddings
    "Endpoint": "https://your-url.openai.azure.com/", // Uri for an LLM resource (like OpenAI). This should include protocol and hostname.
    "Key": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" // Key credential used to authenticate to an LLM resource
  },
  "ChatWithHistoryPluginOptions": {
    "ChatRequestSettings": {
      "MaxTokens": 1000, // Maximum number of tokens to generate in the completion
      "Temperature": 0.8, // Controls the randomness of the completion. The higher the temperature, the more random the completion
      "TopP": 0.5 // Controls the diversity of the completion. The higher the TopP, the more diverse the completion.
    }
  },
  "ChatHistoryProviderOptions": {
    "HistoryMaxMessages": 12
  }
  // ...
}
Next, in Program.cs or a similar entry point file in your project, add the following code:
// Entry point
var builder = WebApplication.CreateBuilder(new WebApplicationOptions
{
    // ...
});

// ...

var tokenLengthFunction = ILengthFunctions.LengthByTokenCount;
string cosmosContainer = "cosmosDbContainer"; // You probably want to save this in the appsettings or similar

// Or other configuration providers...
builder.Configuration.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true);

// Requires the Encamina.Enmarcha.SemanticKernel.Abstractions NuGet package
builder.Services.AddOptions<SemanticKernelOptions>().Bind(builder.Configuration.GetSection(nameof(SemanticKernelOptions)))
    .ValidateDataAnnotations()
    .ValidateOnStart();
builder.Services.AddOptions<ChatWithHistoryPluginOptions>().Bind(builder.Configuration.GetSection(nameof(ChatWithHistoryPluginOptions)))
    .ValidateDataAnnotations()
    .ValidateOnStart();
builder.Services.AddOptions<ChatHistoryProviderOptions>().Bind(builder.Configuration.GetSection(nameof(ChatHistoryProviderOptions)))
    .ValidateDataAnnotations()
    .ValidateOnStart();

// Requires the Encamina.Enmarcha.Data.Cosmos NuGet package
builder.Services.AddCosmos(builder.Configuration);
builder.Services.AddCosmosChatHistoryProvider(cosmosContainer, tokenLengthFunction);

builder.Services.AddScoped(sp =>
{
    var kernel = new KernelBuilder()
        .WithAzureChatCompletionService("<YOUR DEPLOYMENT NAME>", "<YOUR AZURE ENDPOINT>", "<YOUR API KEY>")
        //.WithOpenAIChatCompletionService("<YOUR MODEL ID>", "<YOUR API KEY>", "<YOUR ORG ID>")
        // ...
        .Build();

    // ...

    // 'openAIOptions' represents the LLM (for example, Azure OpenAI) options; resolve it from the configuration bound above (not shown here).
    kernel.ImportChatWithHistoryPlugin(sp, openAIOptions, tokenLengthFunction);

    return kernel;
});
Now you can inject the kernel via the constructor, and the chat capabilities are available.
public class MyClass
{
    private readonly Kernel kernel;

    public MyClass(Kernel kernel)
    {
        this.kernel = kernel;
    }

    public async Task TestChatAsync()
    {
        var contextVariables = new ContextVariables();
        contextVariables.Set(PluginsInfo.ChatWithHistoryPlugin.Functions.Chat.Parameters.Ask, "What is the weather like in Madrid?");
        contextVariables.Set(PluginsInfo.ChatWithHistoryPlugin.Functions.Chat.Parameters.UserId, "123456");
        contextVariables.Set(PluginsInfo.ChatWithHistoryPlugin.Functions.Chat.Parameters.UserName, "John Doe");
        contextVariables.Set(PluginsInfo.ChatWithHistoryPlugin.Functions.Chat.Parameters.Locale, "en");

        var functionChat = kernel.Func(PluginsInfo.ChatWithHistoryPlugin.Name, PluginsInfo.ChatWithHistoryPlugin.Functions.Chat.Name);

        var resultContext = await kernel.RunAsync(contextVariables, functionChat);
    }
}
Advanced configurations
If you want to disable chat history, simply set HistoryMaxMessages in ChatHistoryProviderOptions to 0, as in the snippet below.
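For example, using the same appsettings.json structure shown above, the relevant fragment would be:
"ChatHistoryProviderOptions": {
  "HistoryMaxMessages": 0 // Disables the chat history
}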
You can also inherit from the ChatWithHistoryPlugin class and add the customizations you need.
public class MyCustomChatWithHistoryPlugin : ChatWithHistoryPlugin
{
    public MyCustomChatWithHistoryPlugin(Kernel kernel, string chatModelName, Func<string, int> tokensLengthFunction, IChatHistoryProvider chatHistoryProvider, IOptionsMonitor<ChatWithHistoryPluginOptions> options)
        : base(kernel, chatModelName, tokensLengthFunction, chatHistoryProvider, options)
    {
    }

    protected override string SystemPrompt => "You are a Virtual Assistant who only talks about the weather.";

    // There are more overridable methods/properties
}
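To make the kernel use your custom class, you can import an instance of it instead of relying on the default plugin. The following is only a minimal sketch, assuming a Semantic Kernel version that provides the ImportPluginFromObject extension and that the chat history provider and options are resolved from the dependency injection container; adapt it to your actual registration code:
// Minimal sketch (not part of the package documentation): importing the custom plugin into the kernel.
// Assumes this runs inside the AddScoped factory shown earlier, where 'sp' is the IServiceProvider
// and 'kernel' is the Kernel instance that has just been built.
var chatHistoryProvider = sp.GetRequiredService<IChatHistoryProvider>();
var chatWithHistoryOptions = sp.GetRequiredService<IOptionsMonitor<ChatWithHistoryPluginOptions>>();

var customPlugin = new MyCustomChatWithHistoryPlugin(kernel, "gpt-35-turbo", ILengthFunctions.LengthByTokenCount, chatHistoryProvider, chatWithHistoryOptions);

kernel.ImportPluginFromObject(customPlugin, PluginsInfo.ChatWithHistoryPlugin.Name);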
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. |
Dependencies (net8.0)
- CommunityToolkit.Diagnostics (>= 8.2.2)
- Encamina.Enmarcha.AI.OpenAI.Abstractions (>= 8.1.8-preview-01)
- Encamina.Enmarcha.AI.OpenAI.Azure (>= 8.1.8-preview-01)
- Encamina.Enmarcha.Core (>= 8.1.8-preview-01)
- Encamina.Enmarcha.Data.Abstractions (>= 8.1.8-preview-01)
- Encamina.Enmarcha.Data.Cosmos (>= 8.1.8-preview-01)
- Encamina.Enmarcha.Entities.Abstractions (>= 8.1.8-preview-01)
- Encamina.Enmarcha.SemanticKernel.Abstractions (>= 8.1.8-preview-01)
- Microsoft.Extensions.Options (>= 8.0.2)
- Microsoft.SemanticKernel.Abstractions (>= 1.15.0)
- Newtonsoft.Json (>= 13.0.3)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last updated |
---|---|---|
8.2.0 | 271 | 10/22/2024 |
8.2.0-preview-01-m01 | 94 | 9/17/2024 |
8.1.9-preview-02 | 64 | 10/22/2024 |
8.1.9-preview-01 | 279 | 10/4/2024 |
8.1.8 | 220 | 9/23/2024 |
8.1.8-preview-07 | 514 | 9/12/2024 |
8.1.8-preview-06 | 216 | 9/11/2024 |
8.1.8-preview-05 | 76 | 9/10/2024 |
8.1.8-preview-04 | 354 | 8/16/2024 |
8.1.8-preview-03 | 173 | 8/13/2024 |
8.1.8-preview-02 | 97 | 8/13/2024 |
8.1.8-preview-01 | 97 | 8/12/2024 |
8.1.7 | 93 | 8/7/2024 |
8.1.7-preview-09 | 150 | 7/3/2024 |
8.1.7-preview-08 | 92 | 7/2/2024 |
8.1.7-preview-07 | 90 | 6/10/2024 |
8.1.7-preview-06 | 91 | 6/10/2024 |
8.1.7-preview-05 | 102 | 6/6/2024 |
8.1.7-preview-04 | 92 | 6/6/2024 |
8.1.7-preview-03 | 85 | 5/24/2024 |
8.1.7-preview-02 | 91 | 5/10/2024 |
8.1.7-preview-01 | 100 | 5/8/2024 |
8.1.6 | 1,137 | 5/7/2024 |
8.1.6-preview-08 | 59 | 5/2/2024 |
8.1.6-preview-07 | 89 | 4/29/2024 |
8.1.6-preview-06 | 366 | 4/26/2024 |
8.1.6-preview-05 | 86 | 4/24/2024 |
8.1.6-preview-04 | 103 | 4/22/2024 |
8.1.6-preview-03 | 91 | 4/22/2024 |
8.1.6-preview-02 | 122 | 4/17/2024 |
8.1.6-preview-01 | 181 | 4/15/2024 |
8.1.5 | 113 | 4/15/2024 |
8.1.5-preview-15 | 110 | 4/10/2024 |
8.1.5-preview-14 | 133 | 3/20/2024 |
8.1.5-preview-13 | 80 | 3/18/2024 |
8.1.5-preview-12 | 97 | 3/13/2024 |
8.1.5-preview-11 | 84 | 3/13/2024 |
8.1.5-preview-10 | 110 | 3/13/2024 |
8.1.5-preview-09 | 94 | 3/12/2024 |
8.1.5-preview-08 | 90 | 3/12/2024 |
8.1.5-preview-07 | 99 | 3/8/2024 |
8.1.5-preview-06 | 205 | 3/8/2024 |
8.1.5-preview-05 | 90 | 3/7/2024 |
8.1.5-preview-04 | 98 | 3/7/2024 |
8.1.5-preview-03 | 75 | 3/7/2024 |
8.1.5-preview-02 | 133 | 2/28/2024 |
8.1.5-preview-01 | 111 | 2/19/2024 |
8.1.4 | 150 | 2/15/2024 |
8.1.3 | 119 | 2/13/2024 |
8.1.3-preview-07 | 67 | 2/13/2024 |
8.1.3-preview-06 | 92 | 2/12/2024 |
8.1.3-preview-05 | 102 | 2/9/2024 |
8.1.3-preview-04 | 88 | 2/8/2024 |
8.1.3-preview-03 | 82 | 2/7/2024 |
8.1.3-preview-02 | 87 | 2/2/2024 |
8.1.3-preview-01 | 81 | 2/2/2024 |
8.1.2 | 121 | 2/1/2024 |
8.1.2-preview-9 | 104 | 1/22/2024 |
8.1.2-preview-8 | 86 | 1/19/2024 |
8.1.2-preview-7 | 87 | 1/19/2024 |
8.1.2-preview-6 | 83 | 1/19/2024 |
8.1.2-preview-5 | 94 | 1/19/2024 |
8.1.2-preview-4 | 84 | 1/19/2024 |
8.1.2-preview-3 | 88 | 1/18/2024 |
8.1.2-preview-2 | 85 | 1/18/2024 |
8.1.2-preview-16 | 85 | 1/31/2024 |
8.1.2-preview-15 | 82 | 1/31/2024 |
8.1.2-preview-14 | 188 | 1/25/2024 |
8.1.2-preview-13 | 85 | 1/25/2024 |
8.1.2-preview-12 | 96 | 1/23/2024 |
8.1.2-preview-11 | 105 | 1/23/2024 |
8.1.2-preview-10 | 93 | 1/22/2024 |
8.1.2-preview-1 | 78 | 1/18/2024 |
8.1.1 | 136 | 1/18/2024 |
8.1.0 | 102 | 1/18/2024 |
8.0.3 | 153 | 12/29/2023 |
8.0.1 | 129 | 12/14/2023 |
8.0.0 | 155 | 12/7/2023 |
6.0.4.3 | 153 | 12/29/2023 |
6.0.4.2 | 146 | 12/20/2023 |
6.0.4.1 | 213 | 12/19/2023 |
6.0.4 | 171 | 12/4/2023 |
6.0.3.20 | 144 | 11/27/2023 |
6.0.3.19 | 146 | 11/22/2023 |