DevExpress Blazor AI Chat — Agent2Agent (A2A) Protocol Integration

8 January 2026

Modern AI-powered applications increasingly automate work through specialized AI agents. Industry statistics show impressive growth: in 2025, 85% of organizations had integrated AI agents into at least one workflow. Users now expect conversational interfaces for a range of business tasks: customer service automation, personalized shopping assistance, financial transaction management, content creation, and streamlined workflow coordination.

The key distinction between an AI agent and a standard large language model comes down to a simple formula: an agent is an LLM plus instructions plus tools it can invoke, which turns it into an autonomous system.

What is an AI Agent — Diagram

Source/Credit: Microsoft Developer YouTube channel

While LLMs can only generate text, agents extend these capabilities into actionable intelligence by integrating external APIs, accessing databases, managing files, and connecting with enterprise systems. This transforms them from passive text generators into autonomous systems capable of executing complex, multi-step tasks. Among the first agents introduced in the industry were ChatGPT plugins, which added web browsing and code interpreter capabilities (allowing the LLM to search the web and execute code while replying to a user query).

In April 2025, Google introduced its Agent2Agent (A2A) protocol — an open standard for communication between agents. The protocol has gained support from over 150 organizations, including tech giants like Microsoft, Salesforce, SAP, ServiceNow, and Amazon Web Services. The protocol's advantage lies in enabling agents developed by different teams on various platforms to seamlessly exchange information and coordinate actions, creating essentially an "internet of agents" or "agentic web" — similar to how HTTP allowed websites to interact with each other.

The benefits notwithstanding, integrating agents with user interface components presents technical challenges. Agents use structured data exchange formats and support complex task lifecycles, while chat components expect simple text messages and streaming responses. In this article, I describe how to connect the DevExpress Blazor AI Chat component to A2A protocol-compliant agents with a small adapter. The implementation hosts four agents in separate ASP.NET Core servers and switches agents at runtime within a single chat UI. The sample also includes a Researcher agent from the official Microsoft A2A .NET SDK samples to show compatibility with existing agents.

This example focuses on agent connectivity rather than orchestration. The A2A protocol supports advanced inter-agent patterns such as routing, delegation, and task handoffs, but this sample does not implement them. Any A2A-compliant agent can connect to the Blazor AI Chat component with the approach documented here.

Architecture Overview

The solution uses a distributed layout. A shared library defines all agent classes and base types. Four ASP.NET Core projects host Poem, Shakespearean Style, Task, and Researcher agents. Each server runs on its own port and exposes a single A2A endpoint at /agent. A Blazor Server app embeds the DevExpress AI Chat component and talks to agents through IChatClient-based adapters.

Agent Server Apps

Each agent server follows the same pattern. The example below shows a minimal setup that maps the A2A pipeline to /agent and wires an agent to a task manager.

using A2A;
using A2A.AspNetCore;
using Azure;
using Azure.AI.OpenAI;
using A2AAgents.Shared;
using A2AAgentsServer.Agents;
using Microsoft.Extensions.AI;

var builder = WebApplication.CreateBuilder(args);

// Read Azure OpenAI settings and fail fast when configuration is incomplete.
var settings = builder.Configuration.GetSection("AzureOpenAISettings").Get<AzureOpenAIServiceSettings>();
if (settings == null || string.IsNullOrEmpty(settings.Endpoint) || string.IsNullOrEmpty(settings.Key) || string.IsNullOrEmpty(settings.DeploymentName))
    throw new InvalidOperationException("Specify Azure OpenAI endpoint, key, and deployment name in appsettings.json.");

var app = builder.Build();

// The task manager handles incoming A2A messages and task lifecycle events.
var taskManager = new TaskManager();

// Wrap the Azure OpenAI client in the Microsoft.Extensions.AI IChatClient abstraction.
var chatClient = new AzureOpenAIClient(new Uri(settings.Endpoint), new AzureKeyCredential(settings.Key))
    .GetChatClient(settings.DeploymentName)
    .AsIChatClient();

// The agent attaches its handlers to the task manager on construction.
var agent = new PoemAgent(chatClient, taskManager);

app.UseHttpsRedirection();
app.MapA2A(taskManager, "/agent");
app.Run();

Shakespearean Style, Task, and Researcher servers use the same structure. They differ by agent type and the port configured in each project's launch settings.
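
For context, an agent class in the shared library can register a message handler with the task manager it receives in its constructor. The sketch below is illustrative only: the OnMessageReceived delegate name and its signature follow the public a2a-dotnet samples, and the prompt text is a placeholder, not the repository's actual PoemAgent implementation.

```csharp
// Illustrative agent sketch; handler names/signatures follow a2a-dotnet samples
// and should be treated as assumptions, not the repository's exact code.
public class PoemAgent {
    private readonly IChatClient _chatClient;

    public PoemAgent(IChatClient chatClient, TaskManager taskManager) {
        _chatClient = chatClient;
        // Route incoming A2A messages to this agent.
        taskManager.OnMessageReceived = ProcessMessageAsync;
    }

    private async Task<A2AResponse> ProcessMessageAsync(
        MessageSendParams messageSendParams, CancellationToken cancellationToken) {
        // Extract the user's text and forward it to the LLM with the agent's instructions.
        string keywords = messageSendParams.Message.Parts.OfType<TextPart>().First().Text;
        var response = await _chatClient.GetResponseAsync(
            $"Write a short lyrical poem using these keywords: {keywords}",
            cancellationToken: cancellationToken);
        return new AgentMessage {
            Role = MessageRole.Agent,
            Parts = [new TextPart { Text = response.Text }]
        };
    }
}
```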

Protocol Compliance Validation with a2a-inspector

All agents pass validation with the a2a-inspector tool. The inspector checks Agent Card metadata, message format, streaming support, task state signaling for task agents, and protocol version compatibility.

a2a-inspector - The Poem Agent Validation Screenshot
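
For the inspector's Agent Card check to pass, the task manager must be able to answer card queries. A minimal card might look like the sketch below; the OnAgentCardQuery delegate and the AgentCard/AgentCapabilities property names follow the a2a-dotnet SDK and are assumptions, not the repository's exact code.

```csharp
// Sketch of an Agent Card handler; property names follow the a2a-dotnet
// AgentCard type and should be treated as an approximation.
taskManager.OnAgentCardQuery = (agentUrl, cancellationToken) =>
    Task.FromResult(new AgentCard {
        Name = "Poem Agent",
        Description = "Generates a short lyrical poem from 2-5 keywords.",
        Url = agentUrl,
        Version = "1.0.0",
        Capabilities = new AgentCapabilities { Streaming = true }
    });
```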

Bridging A2A to IChatClient

The key to this integration is a lightweight adapter that translates between A2A protocol messages and the IChatClient interface used by DevExpress Blazor AI Chat to interact with LLMs. This adapter maintains conversation context, converts chat messages to A2A protocol format, and streams responses back to the UI:

public sealed class MessageAgentChatClient : DelegatingChatClient {
    private readonly A2AClient _agentClient;
    private readonly string _contextId;

    public MessageAgentChatClient(IChatClient innerClient, A2AClient agentClient, string contextId) 
        : base(innerClient) {
        _agentClient = agentClient ?? throw new ArgumentNullException(nameof(agentClient));
        _contextId = $"{contextId}-{Guid.NewGuid()}";
    }

    public override async IAsyncEnumerable<ChatResponseUpdate> GetStreamingResponseAsync(
        IEnumerable<ChatMessage> messages, 
        ChatOptions? options = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default) {

        var sendParams = new MessageSendParams { 
            Message = CreateA2AMessage(messages) 
        };

        await foreach (var item in _agentClient.SendMessageStreamingAsync(sendParams, cancellationToken)) {
            if (item.Data is AgentMessage message && message.Parts.FirstOrDefault() is TextPart textPart) {
                yield return new ChatResponseUpdate(ChatRole.Assistant, textPart.Text);
            }
        }
    }
    
    private AgentMessage CreateA2AMessage(IEnumerable<ChatMessage> messages) {
        return new AgentMessage() {
            Role = MessageRole.User,
            ContextId = _contextId,
            // Send only the latest user message; the agent tracks conversation
            // history on its side via the shared ContextId.
            Parts = [new TextPart() { Text = messages.Last().Text }]
        };
    }
}
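
Note that the adapter only overrides the streaming path. With UseStreaming="false", DelegatingChatClient would forward non-streaming calls to the inner client and bypass the agent. One possible guard, assuming the ToChatResponseAsync helper from Microsoft.Extensions.AI, is to aggregate the streamed updates into a single response:

```csharp
// Aggregate the agent's streamed updates into one ChatResponse so that
// non-streaming callers also reach the A2A agent instead of the inner client.
// Sketch only; relies on the ToChatResponseAsync extension method.
public override async Task<ChatResponse> GetResponseAsync(
    IEnumerable<ChatMessage> messages,
    ChatOptions? options = null,
    CancellationToken cancellationToken = default) {
    return await GetStreamingResponseAsync(messages, options, cancellationToken)
        .ToChatResponseAsync(cancellationToken: cancellationToken);
}
```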

Runtime Agent Selection

The app registers each agent as a keyed IChatClient service and switches between them at runtime through a simple dropdown.

// Register each A2A agent as a keyed IChatClient service
builder.Services.AddKeyedScoped<IChatClient>(AgentsEndpoints.PoemAgent, (provider, key) =>
    new MessageAgentChatClient(chatClient, 
        new A2AClient(new Uri(AgentsEndpoints.PoemAgent)), 
        AgentsEndpoints.PoemAgent));

builder.Services.AddKeyedScoped<IChatClient>(AgentsEndpoints.ShakespeareanStyleAgent, (provider, key) =>
    new MessageAgentChatClient(chatClient, 
        new A2AClient(new Uri(AgentsEndpoints.ShakespeareanStyleAgent)), 
        AgentsEndpoints.ShakespeareanStyleAgent));

builder.Services.AddKeyedScoped<IChatClient>(AgentsEndpoints.TaskAgent, (provider, key) =>
    new TaskAgentChatClient(chatClient, 
        new A2AClient(new Uri(AgentsEndpoints.TaskAgent)), 
        AgentsEndpoints.TaskAgent));

builder.Services.AddKeyedScoped<IChatClient>(AgentsEndpoints.ResearcherAgent, (provider, key) =>
    new TaskAgentChatClient(chatClient, 
        new A2AClient(new Uri(AgentsEndpoints.ResearcherAgent)), 
        AgentsEndpoints.ResearcherAgent));

The ChatClientServiceKey property (available with DevExpress v25.1.4) enables runtime switching between multiple IChatClient services.

<DxAIChat @ref="_dxAIChat" 
          UseStreaming="true" 
          ChatClientServiceKey="@CurrentServiceKey"
          CssClass="chat-container" />

<DxComboBox Data="@AgentOptions"
            TextFieldName="@nameof(AgentOption.Text)"
            ValueFieldName="@nameof(AgentOption.Value)"
            @bind-Value="@CurrentServiceKey" />
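
The AgentOption type, AgentOptions collection, and CurrentServiceKey backing field are not shown above. A minimal sketch of what they might look like (names and display texts are illustrative, not the repository's exact code):

```razor
@code {
    // Hypothetical backing types for the combo box; names are illustrative.
    public record AgentOption(string Text, string Value);

    string CurrentServiceKey { get; set; } = AgentsEndpoints.PoemAgent;

    IReadOnlyList<AgentOption> AgentOptions { get; } = new List<AgentOption> {
        new("Poem Agent", AgentsEndpoints.PoemAgent),
        new("Shakespearean Style Agent", AgentsEndpoints.ShakespeareanStyleAgent),
        new("Task Agent", AgentsEndpoints.TaskAgent),
        new("Researcher Agent", AgentsEndpoints.ResearcherAgent)
    };
}
```

Binding the combo box value to CurrentServiceKey is what makes ChatClientServiceKey resolve a different keyed IChatClient at runtime.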

Welcome Chat Screen

The EmptyMessageAreaTemplate shows agent descriptions and instructions when the chat is empty. This helps users understand each agent and start a conversation quickly.

<EmptyMessageAreaTemplate>
    <div class="empty-message-area">
        <h4>🤖 Multi-Agent AI Chat</h4>
        <p>Welcome to the Agent-to-Agent communication platform! Choose your specialized AI agent and start chatting.</p>

        <div>
            <h5>Available Agents:</h5>
            <ul class="agents-list">
                <li><strong>🎭 Poem Agent:</strong> Enter 2-5 keywords and I'll generate a short lyrical poem</li>
                <li><strong>📜 Shakespearean Style Agent:</strong> Transform your text into William Shakespeare's style</li>
                <li><strong>📋 Task Agent:</strong> Help plan and execute task descriptions</li>
                <li><strong>Researcher Agent:</strong> Agent from https://github.com/a2aproject/a2a-dotnet/blob/main/samples/AgentServer/ResearcherAgent.cs</li>
                <li><strong>💬 Default Chat Client:</strong> Standard AI conversation</li>
            </ul>
        </div>

        <div>
            <p><strong>Current Agent:</strong> <span class="current-agent-badge">@AgentOptions.FirstOrDefault(x => x.Value == CurrentServiceKey)?.Text</span></p>
        </div>

        <p class="help-text">
            <small>💡 Select an agent from the dropdown above, then type your message to start the conversation!</small>
        </p>
    </div>
</EmptyMessageAreaTemplate>
DevExpress Blazor AI Chat — An Empty State Screen

User Experience

Users can select the Poem, Shakespearean Style, Task, or Researcher agent from a dropdown, copy output between agents to simulate manual handoff and composition workflows, and receive streaming responses that preserve the responsive feel of AI interactions. The video below demonstrates the project in action:

Download the Complete Example

To get started with this integration, clone the Blazor AI Chat — Communicate with Agents That Implement Agent2Agent (A2A) Protocol repository from GitHub and configure your AI service provider in appsettings.json:

{
  "AzureOpenAISettings": {
    "Endpoint": "https://your-instance.openai.azure.com/",
    "Key": "your-api-key",
    "DeploymentName": "your-model-name"
  }
}
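
The AzureOpenAIServiceSettings type bound in each server's Program.cs maps directly to this section. A minimal sketch of what it might look like (the actual class lives in the shared library):

```csharp
// Maps the "AzureOpenAISettings" configuration section; all three values
// are validated at startup before the chat client is created.
public class AzureOpenAIServiceSettings {
    public string? Endpoint { get; set; }
    public string? Key { get; set; }
    public string? DeploymentName { get; set; }
}
```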

Configure Agent Endpoints

The AgentsEndpoints class centralizes agent URLs. The app uses HTTP by default and includes HTTPS alternatives as comments.

public static class AgentsEndpoints {
    public const string PoemAgent = "http://localhost:5003/agent";
    public const string ShakespeareanStyleAgent = "http://localhost:5005/agent";
    public const string TaskAgent = "http://localhost:5007/agent";
    public const string ResearcherAgent = "http://localhost:5009/agent";
    // HTTPS alternatives:
    // public const string PoemAgent = "https://localhost:5004/agent";
    // public const string ShakespeareanStyleAgent = "https://localhost:5006/agent";
    // public const string TaskAgent = "https://localhost:5008/agent";
    // public const string ResearcherAgent = "https://localhost:5010/agent";
}

Start the four agent servers first (PoemAgentServer on http://localhost:5003, ShakespeareanStyleAgentServer on http://localhost:5005, TaskAgentServer on http://localhost:5007, ResearcherAgentServer on http://localhost:5009). Wait until each reports that it is listening on its port. Then start the Blazor Server app, select an agent in the UI, and start chatting. You can switch agents at runtime. The implementation supports Azure OpenAI, OpenAI API, and local runtimes like Ollama with minimal changes.

Your Feedback Matters

We're gathering feedback to prioritize future enhancements to our AI agent integration capabilities. Please take a moment to share your experience and help us shape future AI integration features:

Free DevExpress Products - Get Your Copy Today

The following free DevExpress product offers remain available. Should you have any questions about the free offers below, please submit a ticket via the DevExpress Support Center at your convenience. We'll be happy to follow up.