1. Basic Question Answering agent about insurance policies, built with no agent framework.
2. [Insurance Policy Agent] Turn the QA agent into an A2A agent server with the A2A SDK (no framework, to show how the SDK works).
3. Basic A2A client with the A2A SDK to show the communication flow (no framework, to show how the SDK works; a protocol-level sketch follows this list).
4. [Health Research Agent] ADK agent using Gemini with the Google Search tool to answer health-based questions, exposed over A2A via ADK.
5. [Sequential Agent] ADK `SequentialAgent` connecting to the Policy Agent and the Health Research Agent in sequence, consuming them over A2A via ADK.
6. [Healthcare Provider Agent] A2A agent that calls an MCP server, built with LangChain/LangGraph.
   - Uses `langgraph-a2a-server`.
7. A2A client using the Microsoft Agent Framework's built-in A2A client.
8. [Healthcare Concierge Agent] Full general healthcare agent built with a BeeAI Requirement Agent that calls all of the A2A agents in an agentic way.
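Lessons 2 and 3 revolve around the A2A wire format that every later lesson reuses. As a rough, protocol-level sketch (plain HTTP rather than the A2A SDK client), the snippet below sends a single `message/send` JSON-RPC request to the Policy Agent port from the diagram; the well-known agent-card path and the exact field names vary between A2A spec/SDK versions, so treat them as assumptions to check against the lesson code.

```python
# Protocol-level sketch of an A2A "message/send" call using plain HTTP.
# This is NOT the a2a-sdk client; it only illustrates the JSON-RPC shape the
# SDK produces. Port 9999 matches the Policy Agent in the architecture diagram;
# the agent-card path and response fields may differ across A2A versions.
import uuid

import httpx

AGENT_URL = "http://localhost:9999"


def ask_policy_agent(question: str) -> dict:
    # Discover the agent: the card is served from a well-known path.
    card = httpx.get(f"{AGENT_URL}/.well-known/agent.json").json()
    print(f"Talking to: {card.get('name', 'unknown agent')}")

    # JSON-RPC 2.0 envelope around an A2A Message with a single text part.
    payload = {
        "jsonrpc": "2.0",
        "id": str(uuid.uuid4()),
        "method": "message/send",
        "params": {
            "message": {
                "role": "user",
                "messageId": str(uuid.uuid4()),
                "parts": [{"kind": "text", "text": question}],
            }
        },
    }
    response = httpx.post(AGENT_URL, json=payload, timeout=60)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(ask_policy_agent("Is physiotherapy covered by my policy?"))
```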
```mermaid
graph LR
    %% User / Client Layer
    User([User / A2A Client])

    %% Main Orchestrator Layer (Lesson 8)
    subgraph OrchestratorLayer [Router/Requirement Agent]
        Concierge["<b>Healthcare Concierge Agent</b><br/>(BeeAI Framework)<br/><code>Port: 9996</code>"]
    end

    subgraph SubAgents [A2A Agent Servers]
        direction TB
        PolicyAgent["<b>Policy Agent</b><br/>(Gemini with A2A SDK)<br/><code>Port: 9999</code>"]
        ResearchAgent["<b>Research Agent</b><br/>(Google ADK)<br/><code>Port: 9998</code>"]
        ProviderAgent["<b>Provider Agent</b><br/>(LangGraph + LangChain)<br/><code>Port: 9997</code>"]
    end

    %% Data & Tools Layer
    subgraph DataLayer [Data Sources & Tools]
        PDF["Policy PDF"]
        Google[Google Search Tool]
        MCPServer["FastMCP Server<br/>(<code>doctors.json</code>)"]
    end

    Label_UA["Sends Query - A2A"]
    Label_CP["A2A"]
    Label_CR["A2A"]
    Label_CProv["A2A"]
    Label_MCP["MCP (stdio)"]

    %% -- CONNECTIONS --
    User --- Label_UA --> Concierge
    Concierge --- Label_CP --> PolicyAgent
    Concierge --- Label_CR --> ResearchAgent
    Concierge --- Label_CProv --> ProviderAgent
    PolicyAgent -- "Reads" --> PDF
    ResearchAgent -- "Calls" --> Google
    ProviderAgent --- Label_MCP --> MCPServer

    classDef orchestrator fill:#f9f,stroke:#333,stroke-width:2px;
    classDef agent fill:#e1f5fe,stroke:#0277bd,stroke-width:2px;
    classDef tool fill:#fff3e0,stroke:#ef6c00,stroke-width:1px,stroke-dasharray: 5 5;
    classDef protocolLabel fill:#ffffff,stroke:none,color:#000;

    class Concierge orchestrator;
    class PolicyAgent,ResearchAgent,ProviderAgent agent;
    class PDF,Google,MCPServer tool;
    class Label_UA,Label_CP,Label_CR,Label_CProv,Label_MCP protocolLabel;
```
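In the diagram, the Provider Agent reaches its data through a FastMCP server that serves `doctors.json` over stdio. The sketch below shows one plausible shape for such a server, assuming the `fastmcp` package (the official `mcp` SDK's `mcp.server.fastmcp` module follows the same pattern); the tool name `find_doctors` and the JSON fields are illustrative, not the lesson's actual code.

```python
# Hypothetical sketch of the "FastMCP Server (doctors.json)" box in the diagram.
# Assumes the `fastmcp` package; the tool name, file path, and JSON fields are
# illustrative only -- check the lesson code for the real definitions.
import json
from pathlib import Path

from fastmcp import FastMCP

mcp = FastMCP("healthcare-providers")


@mcp.tool()
def find_doctors(specialty: str) -> list[dict]:
    """Return doctors from doctors.json whose specialty matches (case-insensitive)."""
    doctors = json.loads(Path("doctors.json").read_text())
    return [d for d in doctors if specialty.lower() in d.get("specialty", "").lower()]


if __name__ == "__main__":
    # Runs over stdio by default, matching the "MCP (stdio)" edge in the diagram.
    mcp.run()
```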
Follow these steps to set up your environment and run the example agents. Each numbered module (1. ..., 2. ..., etc.) is designed to be run in sequence.
Before running the examples, complete the following setup steps:
- Create a Gemini API Key or configure your environment for Vertex AI.
- Configure environment variables:
  - In the project root, make a copy of `example.env` and rename it to `.env`:

    ```bash
    cp example.env .env
    ```

  - Replace `"YOUR_GEMINI_API_KEY"` with your actual API key (a quick verification sketch follows this setup list).
- Install dependencies:
  - Locally: if you have `uv` installed, run:

    ```bash
    uv sync
    ```

  - Notebooks / Google Colab: if running in a notebook environment, you can install the dependencies by running the following in a cell:

    ```
    %pip install .
    ```
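If you want to confirm the `.env` file is actually picked up before starting any servers, a quick check like the one below can help. It assumes `python-dotenv` is installed and that `example.env` names the key something like `GEMINI_API_KEY` or `GOOGLE_API_KEY`; adjust the variable name to match your file.

```python
# Quick sanity check that the .env file is being loaded.
# Assumes python-dotenv is available and that example.env uses a key name like
# GEMINI_API_KEY or GOOGLE_API_KEY -- adjust to match your actual file.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

for name in ("GEMINI_API_KEY", "GOOGLE_API_KEY"):
    value = os.getenv(name)
    status = "set" if value and value != "YOUR_GEMINI_API_KEY" else "missing or placeholder"
    print(f"{name}: {status}")
```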
You can run the agent servers with `uv run`. Ensure you are in the project root. A small reachability check sketch follows the list.

- Policy Agent (Lesson 2):

  ```bash
  uv run a2a_policy_agent.py
  ```

- Research Agent (Lesson 4):

  ```bash
  uv run a2a_research_agent.py
  ```

- Provider Agent (Lesson 6):

  ```bash
  uv run a2a_provider_agent.py
  ```

- Healthcare Concierge Agent (Lesson 8):

  ```bash
  uv run a2a_healthcare_agent.py
  ```
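Once the servers are running, one way to confirm they are all reachable is to poll each agent card at the ports shown in the architecture diagram. This is a hedged sketch: it assumes all four servers are on localhost and that the agent card is served at `/.well-known/agent.json` (newer A2A SDK versions may use `/.well-known/agent-card.json` instead).

```python
# Optional sanity check: fetch each agent card at the ports from the diagram.
# Assumes localhost and the /.well-known/agent.json path used by earlier A2A
# SDK versions; adjust the path if your SDK serves agent-card.json.
import httpx

AGENTS = {
    "Policy Agent": 9999,
    "Research Agent": 9998,
    "Provider Agent": 9997,
    "Healthcare Concierge Agent": 9996,
}

for name, port in AGENTS.items():
    url = f"http://localhost:{port}/.well-known/agent.json"
    try:
        card = httpx.get(url, timeout=5).json()
        print(f"[up]   {name}: {card.get('name', 'unnamed')} (v{card.get('version', '?')})")
    except httpx.HTTPError as exc:
        print(f"[down] {name} on port {port}: {exc}")
```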