- `POST /api/chat` — Chat endpoint (same as Ollama API, but with MCP tool support)
- **This is the only endpoint where MCP server tools are integrated.** All tool calls are handled and responses are merged transparently for the client.
- **All other endpoints** (except `/api/chat` and `/health`) are fully proxied to the underlying Ollama server with no modification. You can use your existing Ollama clients and libraries as usual.
- **Health check:**
  - `GET /health` — Specific to `ollama-mcp-bridge` (not proxied)

> [!IMPORTANT]
> `/api/chat` is the only endpoint with MCP tool integration. All other endpoints are transparently proxied to Ollama. `/health` is specific to the bridge.

This bridge acts as a drop-in proxy for the Ollama API, but with all MCP tools from all connected servers available to every `/api/chat` request. You can use your existing Ollama clients and libraries; just point them to this bridge instead of your Ollama server.
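
For example, the official Ollama Python client just needs the bridge's URL as its host. This is a minimal sketch; the model name is a placeholder and the port matches the examples in this README:

```python
# Minimal sketch: reuse an existing Ollama client, pointed at the bridge.
# The model name below is a placeholder; use any model available on your Ollama server.
from ollama import Client

client = Client(host="http://localhost:8000")  # the bridge, instead of http://localhost:11434

response = client.chat(
    model="qwen3:0.6b",
    messages=[{"role": "user", "content": "What tools do you have available?"}],
)
print(response["message"]["content"])
```

Everything else (tool discovery, tool calls, merging results) happens inside the bridge.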
### Example: Chat
```bash
# Example request; the model name and prompt are placeholders for your own setup.
curl -N -X POST http://localhost:8000/api/chat \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen3:0.6b",
    "messages": [
      {"role": "user", "content": "What tools do you have available?"}
    ]
  }'
```
> [!TIP]
> Use `/docs` for interactive API exploration and testing.
## Architecture
The application is structured into the following modules:
### `main.py` - CLI Entry Point
- Uses Typer for command-line interface
- Handles configuration and server startup
- Passes configuration to FastAPI app
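
As a rough sketch (not the actual `main.py`; the option names, defaults, and import string here are illustrative), such a Typer entry point can look like this:

```python
# Illustrative sketch of a Typer-based entry point; not the project's actual main.py.
import os

import typer
import uvicorn

cli = typer.Typer()

@cli.command()
def serve(host: str = "0.0.0.0", port: int = 8000, config: str = "mcp-config.json"):
    """Start the bridge and hand the MCP configuration to the FastAPI app."""
    # Hypothetical hand-off: expose the config path so the FastAPI app can read it on startup.
    os.environ["MCP_CONFIG_PATH"] = config
    uvicorn.run("api:app", host=host, port=port)  # import string is illustrative

if __name__ == "__main__":
    cli()
```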
### `api.py` - FastAPI Application
- Defines API endpoints (`/api/chat`, `/health`)
- Manages application lifespan (startup/shutdown)
- Handles HTTP request/response processing
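
The lifespan pattern it relies on is standard FastAPI; a generic sketch (not the actual `api.py`) looks like this:

```python
# Generic FastAPI lifespan + health-check sketch; not the project's actual api.py.
from contextlib import asynccontextmanager

from fastapi import FastAPI

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Startup: e.g. connect to the configured MCP servers and collect their tools.
    yield
    # Shutdown: e.g. close MCP server connections cleanly.

app = FastAPI(lifespan=lifespan)

@app.get("/health")
async def health():
    return {"status": "ok"}
```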
### `mcp_manager.py` - MCP Management
- Loads and manages MCP servers
- Collects and exposes all available tools
- Handles tool calls and integrates results into Ollama responses
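
Conceptually, the tool-call handling boils down to a loop like the sketch below, where `call_mcp_tool` is a placeholder for the real MCP client call and the message shapes follow Ollama's chat format:

```python
# Simplified sketch of the tool-call loop; call_mcp_tool() is a placeholder,
# not a real function from this project or the MCP SDK.
# ollama_client: e.g. an ollama.AsyncClient pointed at the local Ollama server.
async def chat_with_tools(ollama_client, model, messages, tools):
    while True:
        response = await ollama_client.chat(model=model, messages=messages, tools=tools)
        tool_calls = response["message"].get("tool_calls") or []
        if not tool_calls:
            return response  # no tool use requested; return the answer unchanged

        # Keep the assistant turn, execute each requested tool, and feed results back.
        messages.append(response["message"])
        for call in tool_calls:
            result = await call_mcp_tool(
                call["function"]["name"], call["function"]["arguments"]
            )
            messages.append({"role": "tool", "content": str(result)})
```

The real implementation also has to merge tool output into streamed NDJSON responses, which this sketch leaves out.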
### `utils.py` - Utility Functions
- NDJSON parsing, health checks, and other helper functions
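
For instance, NDJSON (the newline-delimited JSON that Ollama streams) can be parsed line by line; a generic helper (not the actual `utils.py`) is just:

```python
# Generic NDJSON helper for illustration; not the project's actual utils.py.
import json
from typing import Iterable, Iterator

def parse_ndjson(lines: Iterable[str]) -> Iterator[dict]:
    """Yield one JSON object per non-empty line of an NDJSON stream."""
    for line in lines:
        line = line.strip()
        if line:
            yield json.loads(line)
```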
## Development
> [!NOTE]
> Tests require the server to be running on `localhost:8000`. Make sure to start the server before running pytest.

This creates a seamless experience where Ollama can use any tool from any connected MCP server without the client needing to know about the underlying MCP infrastructure.
## Inspiration and Credits
This project is based on the basic MCP client from my Medium article: [Build an MCP Client in Minutes: Local AI Agents Just Got Real](https://medium.com/@jonigl/build-an-mcp-client-in-minutes-local-ai-agents-just-got-real-a10e186a560f).