How to Add AI Chat to a .NET MAUI App Using Microsoft.Extensions.AI


Adding AI chat to a mobile app used to mean picking an SDK, locking into a provider, and hoping the API didn’t change under you. Microsoft.Extensions.AI changes that — it provides a unified IChatClient abstraction that works with OpenAI, Azure OpenAI, Ollama, and any compatible provider. Switch models without touching your ViewModel or UI code.

This guide builds a complete AI chat screen in .NET MAUI: message history, streaming responses, loading states, and proper DI setup.

What is Microsoft.Extensions.AI?

Microsoft.Extensions.AI (often abbreviated MEAI) is a set of core abstractions for integrating AI services into .NET applications. The key interface is IChatClient — a simple contract for sending messages and receiving responses.

Why It Matters for MAUI

  • Provider portability — same ViewModel code works with OpenAI, Azure OpenAI, or a local Ollama instance
  • DI-native — integrates with the standard .NET DI container used in MAUI
  • Streaming support — built-in async enumerable streaming for real-time token output
  • Middleware pipeline — add caching, logging, rate limiting, or content filtering as pipeline middleware

IChatClient Interface

public interface IChatClient : IDisposable
{
    Task<ChatCompletion> CompleteAsync(
        IList<ChatMessage> chatMessages,
        ChatOptions? options = null,
        CancellationToken cancellationToken = default);

    IAsyncEnumerable<StreamingChatCompletionUpdate> CompleteStreamingAsync(
        IList<ChatMessage> chatMessages,
        ChatOptions? options = null,
        CancellationToken cancellationToken = default);
}

You call these two methods; everything else is implementation detail handled by the provider package.
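As a quick orientation, a one-shot (non-streaming) call looks like this — a minimal sketch using the preview-era Complete* method names shown above; note that later MEAI releases renamed these to GetResponseAsync and GetStreamingResponseAsync, so match your installed version:

```csharp
using Microsoft.Extensions.AI;

// Minimal one-shot completion — assumes `chatClient` came from DI or a provider package
async Task<string> AskOnceAsync(IChatClient chatClient, string question)
{
    var messages = new List<ChatMessage>
    {
        new(ChatRole.User, question)
    };

    // CompleteAsync returns the whole response at once (no streaming)
    var completion = await chatClient.CompleteAsync(messages);
    return completion.Message.Text ?? string.Empty;
}
```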

Project Setup and NuGet Packages

Create a new .NET MAUI project targeting .NET 8 or later, then add the required packages:

# Core MEAI abstractions (always required)
dotnet add package Microsoft.Extensions.AI

# Provider packages — add one or both of the following. OpenAI:
dotnet add package Microsoft.Extensions.AI.OpenAI

# Azure AI Inference provider (Azure OpenAI itself is typically reached via
# Azure.AI.OpenAI plus the OpenAI provider above)
dotnet add package Microsoft.Extensions.AI.AzureAIInference

# CommunityToolkit.Mvvm for ViewModel
dotnet add package CommunityToolkit.Mvvm

API Key Security

Never hardcode API keys in source code. For development, use local configuration or environment variables. For production apps, route requests through your own backend — don’t ship API keys in a mobile app binary.

# For local dev only — use .env or appsettings equivalent
# On device: retrieve key from your backend or secure enclave
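For on-device development, MAUI's SecureStorage keeps the key out of source control. A sketch, assuming the key was saved under the hypothetical name "openai_api_key" (for example from a settings page), with an environment-variable fallback for desktop runs:

```csharp
// Retrieve the API key from MAUI SecureStorage (hypothetical key name "openai_api_key")
async Task<string?> GetApiKeyAsync()
{
    var key = await SecureStorage.Default.GetAsync("openai_api_key");

    // Fall back to an environment variable for local desktop/dev runs
    return key ?? Environment.GetEnvironmentVariable("OPENAI_API_KEY");
}

// Storing it (e.g. after the user pastes it into a settings page):
// await SecureStorage.Default.SetAsync("openai_api_key", pastedKey);
```

This is still a development convenience — production apps should call your own backend rather than hold a provider key at all.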

Registering IChatClient in MauiProgram.cs

using Microsoft.Extensions.AI;
using OpenAI;
using System.ClientModel; // ApiKeyCredential

public static class MauiProgram
{
    public static MauiApp CreateMauiApp()
    {
        var builder = MauiApp.CreateBuilder();
        builder.UseMauiApp<App>();

        // Register IChatClient with OpenAI
        builder.Services.AddSingleton<IChatClient>(sp =>
        {
            var openAiClient = new OpenAIClient(
                new ApiKeyCredential("YOUR_API_KEY")); // Replace with secure retrieval
            return openAiClient
                .AsChatClient("gpt-4o-mini")
                .AsBuilder()
                .UseLogging()          // Optional middleware
                .Build(sp);        // Pass the provider so UseLogging can resolve ILoggerFactory
        });

        // Register ViewModel and Page
        builder.Services.AddTransient<ChatViewModel>();
        builder.Services.AddTransient<ChatPage>();

        return builder.Build();
    }
}

[INTERNAL_LINK: .NET MAUI dependency injection guide] — If you’re new to MAUI’s DI container, read this first before proceeding.
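For completeness, the page receives its ViewModel through constructor injection — a minimal code-behind sketch (it assumes the ChatPage XAML shown later in this post):

```csharp
public partial class ChatPage : ContentPage
{
    // MAUI's DI container supplies ChatViewModel because both types
    // were registered in MauiProgram.CreateMauiApp
    public ChatPage(ChatViewModel viewModel)
    {
        InitializeComponent();
        BindingContext = viewModel;
    }
}
```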

Data Models for Chat

// ChatMessageItem.cs — UI model (separate from MEAI's ChatMessage).
// Derives from ObservableObject so streaming updates to Text actually
// propagate to the bound Label.
public partial class ChatMessageItem : ObservableObject
{
    [ObservableProperty]
    private string _text = string.Empty;

    public bool IsUser { get; set; }
    public DateTime Timestamp { get; set; } = DateTime.Now;

    [ObservableProperty]
    private bool _isStreaming;
}

We keep a separate UI model rather than binding directly to MEAI’s ChatMessage type. This lets us add UI-specific state like IsStreaming without polluting the domain model.

Building the Chat ViewModel

using System.Collections.ObjectModel;
using CommunityToolkit.Mvvm.ComponentModel;
using CommunityToolkit.Mvvm.Input;
using Microsoft.Extensions.AI;

public partial class ChatViewModel : ObservableObject
{
    private readonly IChatClient _chatClient;
    private readonly List<ChatMessage> _conversationHistory = new();

    public ChatViewModel(IChatClient chatClient)
    {
        _chatClient = chatClient;

        // System prompt — set the AI's behavior
        _conversationHistory.Add(new ChatMessage(
            ChatRole.System,
            "You are a helpful .NET development assistant. Be concise and provide code examples when relevant."));
    }

    [ObservableProperty]
    private ObservableCollection<ChatMessageItem> _messages = new();

    // Re-evaluate CanSendMessage as the user types — without this the
    // Send button never enables
    [ObservableProperty]
    [NotifyCanExecuteChangedFor(nameof(SendMessageCommand))]
    private string _userInput = string.Empty;

    [ObservableProperty]
    private bool _isLoading;

    [ObservableProperty]
    [NotifyCanExecuteChangedFor(nameof(SendMessageCommand))]
    private bool _canSend = true;

    [RelayCommand(CanExecute = nameof(CanSendMessage))]
    private async Task SendMessageAsync(CancellationToken cancellationToken)
    {
        if (string.IsNullOrWhiteSpace(UserInput)) return;

        var userText = UserInput;
        UserInput = string.Empty;
        CanSend = false;
        IsLoading = true;

        // Add user message to UI
        Messages.Add(new ChatMessageItem { Text = userText, IsUser = true });

        // Add to conversation history for MEAI
        _conversationHistory.Add(new ChatMessage(ChatRole.User, userText));

        // Add placeholder for streaming response
        var aiMessage = new ChatMessageItem { Text = "", IsUser = false, IsStreaming = true };
        Messages.Add(aiMessage);

        try
        {
            await StreamResponseAsync(aiMessage, cancellationToken);
            _conversationHistory.Add(new ChatMessage(ChatRole.Assistant, aiMessage.Text));
        }
        catch (OperationCanceledException)
        {
            aiMessage.Text += " [cancelled]";
        }
        catch (Exception ex)
        {
            aiMessage.Text = $"Error: {ex.Message}";
        }
        finally
        {
            aiMessage.IsStreaming = false;
            IsLoading = false;
            CanSend = true;
            SendMessageCommand.NotifyCanExecuteChanged();
        }
    }

    private bool CanSendMessage() => CanSend && !string.IsNullOrWhiteSpace(UserInput);
}

Implementing Streaming Responses

private async Task StreamResponseAsync(ChatMessageItem aiMessage, CancellationToken cancellationToken)
{
    var responseText = new System.Text.StringBuilder();

    await foreach (var update in _chatClient.CompleteStreamingAsync(
        _conversationHistory,
        cancellationToken: cancellationToken))
    {
        if (update.Text is null) continue;

        responseText.Append(update.Text);

        // Update UI on main thread
        await MainThread.InvokeOnMainThreadAsync(() =>
        {
            aiMessage.Text = responseText.ToString();
        });
    }
}

Why Streaming Matters for Mobile UX

Without streaming, the user sees a blank loading state for several seconds, then the full response appears at once. With streaming, tokens appear progressively — exactly like ChatGPT’s interface. This dramatically improves perceived responsiveness, especially for longer responses.

Thread Safety

MAUI UI updates must happen on the main thread. Continuations inside the await foreach loop are not guaranteed to resume there, so wrap every aiMessage.Text update in MainThread.InvokeOnMainThreadAsync.

Building the Chat UI in XAML

<?xml version="1.0" encoding="utf-8" ?>
<ContentPage xmlns="http://schemas.microsoft.com/dotnet/2021/maui"
             xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"
             xmlns:vm="clr-namespace:MyAiApp.ViewModels"
             x:Class="MyAiApp.Views.ChatPage"
             x:DataType="vm:ChatViewModel"
             Title="AI Assistant">

    <Grid RowDefinitions="*, Auto" Padding="12">

        <!-- Message List -->
        <CollectionView Grid.Row="0"
                        ItemsSource="{Binding Messages}"
                        x:Name="MessageList"
                        ItemsUpdatingScrollMode="KeepLastItemInView">
            <CollectionView.ItemTemplate>
                <DataTemplate>
                    <Grid Padding="0,4">
                        <Frame CornerRadius="12"
                               Padding="12,8"
                               HorizontalOptions="{Binding IsUser, Converter={StaticResource BoolToLayoutOptions}}"
                               BackgroundColor="{Binding IsUser, Converter={StaticResource BoolToColor}}"
                               MaximumWidthRequest="280">
                            <Label Text="{Binding Text}"
                                   TextColor="{Binding IsUser, Converter={StaticResource BoolToTextColor}}"
                                   LineBreakMode="WordWrap" />
                        </Frame>
                    </Grid>
                </DataTemplate>
            </CollectionView.ItemTemplate>
        </CollectionView>

        <!-- Input Row -->
        <Grid Grid.Row="1" ColumnDefinitions="*, Auto" ColumnSpacing="8" Padding="0,8,0,0">
            <Entry Grid.Column="0"
                   Text="{Binding UserInput}"
                   Placeholder="Ask anything..."
                   ReturnCommand="{Binding SendMessageCommand}"
                   IsEnabled="{Binding CanSend}" />
            <Button Grid.Column="1"
                    Text="Send"
                    Command="{Binding SendMessageCommand}"
                    IsEnabled="{Binding CanSend}"
                    BackgroundColor="#6200EE"
                    TextColor="White" />
        </Grid>

    </Grid>

</ContentPage>

Value Converters for Chat Bubbles

using System.Globalization;

public class BoolToLayoutOptionsConverter : IValueConverter
{
    public object Convert(object? value, Type targetType, object? parameter, CultureInfo culture)
        => value is true ? LayoutOptions.End : LayoutOptions.Start;

    public object ConvertBack(object? value, Type targetType, object? parameter, CultureInfo culture)
        => throw new NotImplementedException();
}

public class BoolToColorConverter : IValueConverter
{
    public object Convert(object? value, Type targetType, object? parameter, CultureInfo culture)
        => value is true ? Color.FromArgb("#6200EE") : Color.FromArgb("#F0F0F0");

    public object ConvertBack(object? value, Type targetType, object? parameter, CultureInfo culture)
        => throw new NotImplementedException();
}

// Referenced by the BoolToTextColor binding in the XAML above
public class BoolToTextColorConverter : IValueConverter
{
    public object Convert(object? value, Type targetType, object? parameter, CultureInfo culture)
        => value is true ? Colors.White : Colors.Black;

    public object ConvertBack(object? value, Type targetType, object? parameter, CultureInfo culture)
        => throw new NotImplementedException();
}
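The converters only resolve if they're registered under the exact keys the XAML bindings use (BoolToLayoutOptions, BoolToColor, BoolToTextColor). One way is app-wide registration in App.xaml.cs — a sketch; App.xaml resource dictionary entries work equally well. BoolToTextColorConverter here follows the same IValueConverter pattern as the other two:

```csharp
public partial class App : Application
{
    public App()
    {
        InitializeComponent();

        // Keys must match the StaticResource names in ChatPage's bindings
        Resources.Add("BoolToLayoutOptions", new BoolToLayoutOptionsConverter());
        Resources.Add("BoolToColor", new BoolToColorConverter());
        Resources.Add("BoolToTextColor", new BoolToTextColorConverter());

        MainPage = new AppShell();
    }
}
```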

System Prompts and Conversation Context

Designing Effective System Prompts

// Focused assistant with specific behavior
_conversationHistory.Add(new ChatMessage(
    ChatRole.System,
    """
    You are a .NET MAUI development assistant.
    - Answer questions concisely (under 200 words unless code is needed)
    - Always include working C# code examples
    - Mention CommunityToolkit.Mvvm patterns where relevant
    - If you don't know something, say so
    """));

Managing History Length

Sending the full conversation history every request gets expensive as conversations grow. Implement a sliding window:

private IList<ChatMessage> GetContextWindow(int maxMessages = 20)
{
    var systemPrompt = _conversationHistory.First(); // Always include
    var recentMessages = _conversationHistory.Skip(1).TakeLast(maxMessages);
    return new[] { systemPrompt }.Concat(recentMessages).ToList();
}

// Use the trimmed window in CompleteStreamingAsync:
await foreach (var update in _chatClient.CompleteStreamingAsync(
    GetContextWindow(),
    cancellationToken: cancellationToken))
{
    // ...handle update as before
}

Switching AI Providers

The power of IChatClient is provider portability. Switch from OpenAI to Azure OpenAI by changing only the registration in MauiProgram.cs:

// Azure OpenAI
using Azure.AI.OpenAI;

builder.Services.AddSingleton<IChatClient>(sp =>
{
    var client = new AzureOpenAIClient(
        new Uri("https://YOUR_RESOURCE.openai.azure.com/"),
        new AzureKeyCredential("YOUR_AZURE_KEY"));
    return client.AsChatClient("gpt-4o");
});

// Local Ollama (no API key needed) — requires the Microsoft.Extensions.AI.Ollama package
using Microsoft.Extensions.AI;

builder.Services.AddSingleton<IChatClient>(sp =>
    new OllamaChatClient(new Uri("http://localhost:11434"), "llama3.2"));

The ViewModel, UI, and all application code remain unchanged. This is the key value proposition of MEAI for production apps.

Using Middleware

builder.Services.AddSingleton<IChatClient>(sp =>
{
    var openAiClient = new OpenAIClient(new ApiKeyCredential("KEY"));
    return openAiClient
        .AsChatClient("gpt-4o-mini")
        .AsBuilder()
        .UseLogging(sp.GetRequiredService<ILoggerFactory>())
        .UseDistributedCache(sp.GetRequiredService<IDistributedCache>()) // Cache identical requests
        .Build(sp);
});

FAQ

Is Microsoft.Extensions.AI production-ready in 2026?

Yes — after an extended preview it reached a stable release in 2025 and is the recommended Microsoft approach for integrating AI into .NET applications. Multiple Azure services and third-party providers implement IChatClient. It’s not a preview abstraction anymore.

How do I handle rate limiting from OpenAI in a mobile app?

Use the MEAI middleware pipeline to add a rate limiter, or implement a retry policy using Microsoft.Extensions.Http.Resilience. For production, route through your own API backend so you control rate limiting server-side and don’t expose the key in the app binary.
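A client-side throttle can be sketched as a DelegatingChatClient that gates in-flight requests — an illustration of the middleware approach, not a built-in library component. Method names follow the preview Complete* API used throughout this post; the one-at-a-time policy is an arbitrary example:

```csharp
using Microsoft.Extensions.AI;

// Allows at most one in-flight completion at a time (illustrative policy)
public sealed class ThrottlingChatClient : DelegatingChatClient
{
    private readonly SemaphoreSlim _gate = new(1, 1);

    public ThrottlingChatClient(IChatClient inner) : base(inner) { }

    public override async Task<ChatCompletion> CompleteAsync(
        IList<ChatMessage> chatMessages,
        ChatOptions? options = null,
        CancellationToken cancellationToken = default)
    {
        await _gate.WaitAsync(cancellationToken);
        try
        {
            return await base.CompleteAsync(chatMessages, options, cancellationToken);
        }
        finally
        {
            _gate.Release();
        }
    }
}

// Plug into the pipeline:
// .AsBuilder().Use(inner => new ThrottlingChatClient(inner)).Build();
```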

Can I use Microsoft.Extensions.AI with local on-device AI models?

Yes — Ollama provides an IChatClient implementation for locally running models like Llama and Phi. On-device inference via ONNX Runtime is also possible via separate packages, though the IChatClient abstraction may not cover all ONNX scenarios yet.

How do I add image support (multimodal) to the chat?

MEAI’s ChatMessage supports content parts including text and images via AIContent. Use ImageContent with a byte array or URL to send images:

var message = new ChatMessage(ChatRole.User, new AIContent[]
{
    new TextContent("What's in this image?"),
    new ImageContent(imageBytes, "image/png")
});

Does conversation history get stored on device?

By default, history lives only in memory for the app session. To persist it, serialize _conversationHistory to SQLite or the app’s local storage at appropriate lifecycle points (app backgrounding, page disappearing). Use System.Text.Json to serialize ChatMessage objects.
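A minimal persistence sketch for the UI model: serialize the message list to a JSON file with System.Text.Json. In a real MAUI app the path would come from FileSystem.AppDataDirectory; here it's a parameter so the logic stays plain .NET:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

// ChatMessageItem as defined earlier in this post
public class ChatMessageItem
{
    public string Text { get; set; } = string.Empty;
    public bool IsUser { get; set; }
    public DateTime Timestamp { get; set; } = DateTime.Now;
    public bool IsStreaming { get; set; }
}

public static class ChatHistoryStore
{
    // Call at lifecycle points such as app backgrounding or page disappearing
    public static void Save(List<ChatMessageItem> messages, string path)
        => File.WriteAllText(path, JsonSerializer.Serialize(messages));

    // Returns an empty list when no history file exists yet
    public static List<ChatMessageItem> Load(string path)
        => File.Exists(path)
            ? JsonSerializer.Deserialize<List<ChatMessageItem>>(File.ReadAllText(path)) ?? new()
            : new();
}
```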

How do I stop a streaming response mid-generation?

Pass a CancellationToken to CompleteStreamingAsync and call Cancel() on the token source. The async enumerable will stop iterating and throw OperationCanceledException — catch it in your ViewModel and update the UI accordingly. The [RelayCommand(IncludeCancelCommand = true)] attribute generates a cancel command automatically.
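Wiring that up with CommunityToolkit.Mvvm looks like this — a sketch; the toolkit generates the companion command following its <MethodName>CancelCommand naming convention, so verify the generated name in your project:

```csharp
// In ChatViewModel — IncludeCancelCommand generates a companion cancel command
[RelayCommand(IncludeCancelCommand = true)]
private async Task SendMessageAsync(CancellationToken cancellationToken)
{
    // ...streaming logic as shown earlier; the token is cancelled
    // when the generated cancel command executes
}

// Bind a Stop button in XAML:
// Command="{Binding SendMessageCancelCommand}"
```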

Conclusion

Microsoft.Extensions.AI makes adding AI chat to .NET MAUI apps straightforward and future-proof. The IChatClient abstraction keeps your ViewModel clean, streaming responses give users the modern interactive feel they expect, and the middleware pipeline handles the production concerns (logging, caching, resilience) without cluttering your application code.

The key architectural decision: don’t ship API keys in your app binary. Route AI requests through your own backend in production — it gives you rate limiting, cost control, and the ability to swap providers without a new app release.

[INTERNAL_LINK: MVVM in .NET MAUI with CommunityToolkit.Mvvm] — the ViewModel patterns in this post assume familiarity with CommunityToolkit.Mvvm source generators.

