Mythosia.AI 3.0.3
dotnet add package Mythosia.AI --version 3.0.3
Mythosia.AI
Package Summary
The Mythosia.AI library provides a unified interface for multiple AI providers, including OpenAI GPT-4o, Anthropic Claude 3, Google Gemini, DeepSeek, and Perplexity Sonar, with multimodal support, function calling, and advanced streaming.
📚 Documentation
- Basic Usage Guide - Getting started with text queries, streaming, image analysis, and more
- Advanced Features - Function calling, policies, and enhanced streaming (v3.0.0)
🚀 What's New in v3.0.0
Function Calling Support 🎯
- Universal Function Calling: Add custom functions that the AI can call during conversations
- Automatic Type Conversion: Seamless parameter marshalling between the AI and your code
- Policy-Based Execution: Control timeout, max rounds, and concurrency
- Attribute-Based Registration: Decorate methods with [AiFunction] for automatic discovery
- Fluent API: Chain function definitions with a builder pattern
Enhanced Streaming 🌊
- Structured Streaming: New StreamingContent type with metadata support
- Stream Options: Control what data you receive (text-only, metadata, function calls)
- Function Streaming: Real-time function execution during streaming
- Improved Performance: Optimized streaming with proper backpressure handling
Policy System ⚙️
- Execution Policies: Control function calling behavior with pre-defined or custom policies
- Timeout Management: Per-request timeout configuration
- Round Limiting: Prevent infinite loops in function calling
- Debug Support: Built-in logging for function execution flow
Breaking Changes ⚠️
- Function calling requires explicit enablement via WithFunctions() or WithFunction()
- Streaming now returns StreamingContent when using advanced options
- Some internal APIs have changed for better consistency
Installation
dotnet add package Mythosia.AI
For advanced LINQ operations with streams:
dotnet add package System.Linq.Async
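With System.Linq.Async installed, standard LINQ operators become available on IAsyncEnumerable streams such as those returned by StreamAsync. A minimal sketch, assuming a configured service instance as in the examples in this README (Take and ToListAsync are standard System.Linq.Async operators):

```csharp
using System;
using System.Linq;   // System.Linq.Async extends IAsyncEnumerable<T>
using Mythosia.AI;

// Materialize at most the first 100 streamed chunks into one string.
var chunks = await service
    .StreamAsync("Summarize the history of .NET")
    .Take(100)
    .ToListAsync();

Console.WriteLine(string.Concat(chunks));
```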
Function Calling (New in v3.0.0)
Quick Start with Functions
// Define a simple function
var service = new ChatGptService(apiKey, httpClient)
.WithFunction(
"get_weather",
"Gets the current weather for a location",
("location", "The city and country", required: true),
(string location) => $"The weather in {location} is sunny, 22°C"
);
// AI will automatically call the function when needed
var response = await service.GetCompletionAsync("What's the weather in Seoul?");
// Output: "The weather in Seoul is currently sunny with a temperature of 22°C."
Attribute-Based Function Registration
public class WeatherService
{
[AiFunction("get_current_weather", "Gets the current weather for a location")]
public string GetWeather(
[AiParameter("The city name", required: true)] string city,
[AiParameter("Temperature unit", required: false)] string unit = "celsius")
{
// Your implementation
return $"Weather in {city}: 22°{char.ToUpper(unit[0])}";
}
}
// Register all functions from a class
var weatherService = new WeatherService();
var service = new ChatGptService(apiKey, httpClient)
.WithFunctions(weatherService);
Advanced Function Builder
var service = new ChatGptService(apiKey, httpClient)
.WithFunction(FunctionBuilder.Create("calculate")
.WithDescription("Performs mathematical calculations")
.AddParameter("expression", "string", "The math expression", required: true)
.AddParameter("precision", "integer", "Decimal places", required: false, defaultValue: 2)
.WithHandler(async (args) =>
{
var expr = args["expression"].ToString();
var precision = Convert.ToInt32(args.GetValueOrDefault("precision", 2));
// Calculate and return result
return await CalculateAsync(expr, precision);
})
.Build());
Multiple Functions with Different Types
var service = new ChatGptService(apiKey, httpClient)
// Parameterless function
.WithFunction(
"get_time",
"Gets the current time",
() => DateTime.Now.ToString("HH:mm:ss")
)
// Two-parameter function
.WithFunction(
"add_numbers",
"Adds two numbers",
("a", "First number", true),
("b", "Second number", true),
(double a, double b) => $"The sum is {a + b}"
)
// Async function
.WithFunctionAsync(
"fetch_data",
"Fetches data from API",
("endpoint", "API endpoint", true),
async (string endpoint) => await httpClient.GetStringAsync(endpoint)
);
// The AI will automatically use the appropriate functions
var response = await service.GetCompletionAsync(
"What time is it? Also, what's 15 plus 27?"
);
Function Calling Policies
// Pre-defined policies
service.DefaultPolicy = FunctionCallingPolicy.Fast; // 30s timeout, 10 rounds
service.DefaultPolicy = FunctionCallingPolicy.Complex; // 300s timeout, 50 rounds
service.DefaultPolicy = FunctionCallingPolicy.Vision; // 200s timeout, for image analysis
// Custom policy
service.DefaultPolicy = new FunctionCallingPolicy
{
MaxRounds = 25,
TimeoutSeconds = 120,
MaxConcurrency = 5,
EnableLogging = true // Enable debug output
};
// Per-request policy override
var response = await service
.WithPolicy(FunctionCallingPolicy.Fast)
.GetCompletionAsync("Complex task requiring functions");
// Inline policy configuration
var response = await service
.BeginMessage()
.AddText("Analyze this data")
.WithMaxRounds(5)
.WithTimeout(60)
.SendAsync();
Function Calling with Streaming
// Stream with function calling support
await foreach (var content in service.StreamAsync(
"What's the weather in Seoul and calculate 15% tip on $85",
StreamOptions.WithFunctions))
{
if (content.Type == StreamingContentType.FunctionCall)
{
Console.WriteLine($"Calling function: {content.Metadata["function_name"]}");
}
else if (content.Type == StreamingContentType.FunctionResult)
{
Console.WriteLine($"Function completed: {content.Metadata["status"]}");
}
else if (content.Type == StreamingContentType.Text)
{
Console.Write(content.Content);
}
}
Disabling Functions Temporarily
// Disable functions for a single request
var response = await service
.WithoutFunctions()
.GetCompletionAsync("Don't use any functions for this");
// Or use the async helper
var response = await service.AskWithoutFunctionsAsync(
"Process this without calling functions"
);
Enhanced Streaming (v3.0.0)
Stream Options
// Text only - fastest, no overhead
await foreach (var chunk in service.StreamAsync("Hello", StreamOptions.TextOnlyOptions))
{
Console.Write(chunk.Content);
}
// With metadata - includes model info, timestamps, etc.
await foreach (var content in service.StreamAsync("Hello", StreamOptions.FullOptions))
{
if (content.Metadata != null)
{
Console.WriteLine($"Model: {content.Metadata["model"]}");
}
Console.Write(content.Content);
}
// Custom options
var options = new StreamOptions()
.WithMetadata(true)
.WithFunctionCalls(true)
.WithTokenInfo(false)
.AsTextOnly(false);
await foreach (var content in service.StreamAsync("Query", options))
{
// Process based on content.Type
switch (content.Type)
{
case StreamingContentType.Text:
Console.Write(content.Content);
break;
case StreamingContentType.FunctionCall:
Console.WriteLine($"Calling: {content.Metadata["function_name"]}");
break;
case StreamingContentType.Completion:
Console.WriteLine($"Total length: {content.Metadata["total_length"]}");
break;
}
}
Service-Specific Function Support
Service | Function Calling | Streaming Functions | Notes |
---|---|---|---|
OpenAI GPT-4o | ✅ Full | ✅ Full | Best support, all features |
OpenAI GPT-4.1 | ✅ Full | ✅ Full | Full function support |
OpenAI o3 | ✅ Full | ✅ Full | Advanced reasoning with functions |
OpenAI GPT-5 | 🔜 Coming Soon | 🔜 Coming Soon | Support planned for future update |
Claude 3/4 | ✅ Full | ✅ Full | Tool use via native API |
Gemini | 🔜 Coming Soon | 🔜 Coming Soon | Support planned for future update |
DeepSeek | ❌ | ❌ | Not yet available |
Perplexity | ❌ | ❌ | Web search focused |
Complete Examples
Building a Weather Assistant
public class WeatherAssistant
{
private readonly ChatGptService _service;
private readonly HttpClient _httpClient;
public WeatherAssistant(string apiKey)
{
_httpClient = new HttpClient();
_service = new ChatGptService(apiKey, _httpClient)
.WithSystemMessage("You are a helpful weather assistant.")
.WithFunction(
"get_weather",
"Gets current weather for a city",
("city", "City name", true),
GetWeatherData
)
.WithFunction(
"get_forecast",
"Gets weather forecast",
("city", "City name", true),
("days", "Number of days", false),
GetForecast
);
// Configure function calling behavior
_service.DefaultPolicy = new FunctionCallingPolicy
{
MaxRounds = 10,
TimeoutSeconds = 30,
EnableLogging = true
};
}
private string GetWeatherData(string city)
{
// In real implementation, call weather API
return $"{{\"city\":\"{city}\",\"temp\":22,\"condition\":\"sunny\"}}";
}
private string GetForecast(string city, int days = 3)
{
// In real implementation, call forecast API
return $"{{\"city\":\"{city}\",\"forecast\":\"{days} days of sun\"}}";
}
public async Task<string> AskAsync(string question)
{
return await _service.GetCompletionAsync(question);
}
public async IAsyncEnumerable<string> StreamAsync(string question)
{
await foreach (var content in _service.StreamAsync(question))
{
if (content.Type == StreamingContentType.Text && content.Content != null)
{
yield return content.Content;
}
}
}
}
// Usage
var assistant = new WeatherAssistant(apiKey);
// Functions are called automatically
var response = await assistant.AskAsync("What's the weather in Tokyo?");
// AI calls get_weather("Tokyo") and responds naturally
// Streaming also supports functions
await foreach (var chunk in assistant.StreamAsync(
"Compare weather in Seoul and Tokyo for the next 5 days"))
{
Console.Write(chunk);
}
Math Tutor with Step-by-Step Solutions
var mathTutor = new ChatGptService(apiKey, httpClient)
.WithSystemMessage("You are a math tutor. Always explain your reasoning.")
.WithFunction(
"calculate",
"Performs calculations",
("expression", "Math expression", true),
(string expr) => {
// Using a math expression evaluator
var result = EvaluateExpression(expr);
return $"Result: {result}";
}
)
.WithFunction(
"solve_equation",
"Solves equations step by step",
("equation", "Equation to solve", true),
(string equation) => {
var steps = SolveWithSteps(equation);
return JsonSerializer.Serialize(steps);
}
);
// The AI will use functions and explain the process
var response = await mathTutor.GetCompletionAsync(
"Solve the equation 2x + 5 = 13 and verify the answer"
);
// Output includes step-by-step solution with verification
Migration Guide from v2.x to v3.0.0
Function Calling (New Feature)
// v3.0.0 - Functions are now supported!
var service = new ChatGptService(apiKey, httpClient)
.WithFunction("my_function", "Description",
("param", "Param description", true),
(string param) => $"Result: {param}");
// AI will automatically use functions when appropriate
var response = await service.GetCompletionAsync("Use my function");
Streaming Changes
// v2.x - Returns string chunks
await foreach (var chunk in service.StreamAsync("Hello"))
{
Console.Write(chunk); // chunk is string
}
// v3.0.0 - Can return StreamingContent with metadata
await foreach (var content in service.StreamAsync("Hello", StreamOptions.FullOptions))
{
Console.Write(content.Content); // Access text via .Content
var metadata = content.Metadata; // Access metadata
}
// For backward compatibility, default behavior unchanged
await foreach (var chunk in service.StreamAsync("Hello"))
{
Console.Write(chunk); // Still works, chunk is string
}
Policy System (New)
// v3.0.0 - Control function execution behavior
service.DefaultPolicy = FunctionCallingPolicy.Fast;
// Per-request override
await service
.WithTimeout(60)
.WithMaxRounds(5)
.GetCompletionAsync("Complex task");
Best Practices
- Function Design: Keep functions focused and simple. Break complex logic into multiple functions.
- Error Handling: Functions should return meaningful error messages that the AI can understand.
- Performance: Use the appropriate policy for your use case (Fast for simple tasks, Complex for detailed analysis).
- Streaming: Use TextOnlyOptions for best performance when metadata isn't needed.
- Testing: Test function calling with various prompts to ensure robust behavior.
Troubleshooting
Q: Functions aren't being called when expected?
- Ensure functions are registered with clear, descriptive names and descriptions
- Check that EnableFunctions is true on the ChatBlock
- Verify the model supports function calling (see the Service-Specific Function Support table above)
Q: Function calling is too slow?
- Adjust the policy timeout: service.DefaultPolicy.TimeoutSeconds = 30
- Use FunctionCallingPolicy.Fast for simple operations
- Consider using streaming for better perceived performance
Q: How to debug function execution?
- Enable logging: service.DefaultPolicy.EnableLogging = true
- Check the console output for round-by-round execution details
- Use StreamOptions.FullOptions to see function call metadata
Q: Can I use functions with streaming?
- Yes! Functions work seamlessly with streaming in v3.0.0
- Use StreamOptions.WithFunctions to see function execution in real time
Product | Compatible and computed target frameworks |
---|---|
.NET | net5.0 through net10.0 were computed, including the platform-specific variants (windows, android, ios, maccatalyst, macos, tvos, browser). |
.NET Core | netcoreapp3.0 and netcoreapp3.1 were computed. |
.NET Standard | netstandard2.1 is compatible. |
Mono / Xamarin / Tizen | monoandroid, monomac, monotouch, tizen60, xamarinios, xamarinmac, xamarintvos, and xamarinwatchos were computed. |
Dependencies (.NETStandard 2.1)
- Azure.AI.OpenAI (>= 2.1.0)
- Mythosia (>= 1.4.0)
- Newtonsoft.Json (>= 13.0.3)
- System.Threading.Channels (>= 9.0.7)
- TiktokenSharp (>= 1.1.6)
Version | Downloads | Last Updated |
---|---|---|
3.0.3 | 122 | 9/7/2025 |
3.0.2 | 74 | 9/6/2025 |
3.0.1 | 146 | 9/1/2025 |
3.0.0 | 185 | 8/28/2025 |
2.2.1 | 138 | 8/18/2025 |
2.2.0 | 151 | 8/8/2025 |
2.1.0 | 112 | 7/18/2025 |
2.0.1 | 148 | 7/17/2025 |
2.0.0 | 148 | 7/17/2025 |
1.7.1 | 364 | 2/3/2025 |
1.7.0 | 97 | 2/2/2025 |
1.6.0 | 85 | 1/29/2025 |
1.5.0 | 86 | 1/26/2025 |
1.4.0 | 101 | 12/9/2024 |
1.3.2 | 98 | 11/25/2024 |
1.3.1 | 102 | 10/21/2024 |
1.3.0 | 92 | 10/8/2024 |
1.2.1 | 88 | 9/27/2024 |
1.2.0 | 93 | 9/26/2024 |
1.1.1 | 97 | 9/24/2024 |
1.1.0 | 96 | 9/22/2024 |
1.0.0 | 132 | 9/22/2024 |