Commit ab5355d

chore: updating main readme with the new pypi package. Fixing publish job name (#12)
1 parent: 97856db

File tree: .github/workflows/publish.yml · README.md

2 files changed (+39, -42 lines)


.github/workflows/publish.yml

Lines changed: 1 addition & 1 deletion

````diff
@@ -6,7 +6,7 @@ on:
     types:
       - created
 jobs:
-  test-publish:
+  publish:
     runs-on: ubuntu-latest
     permissions:
       contents: write
````
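Since the workflow triggers on release `created` events (visible in the hunk context above), the renamed `publish` job runs whenever a GitHub release is created. A minimal sketch of triggering it, assuming the GitHub CLI (`gh`) is installed and authenticated; the tag and notes are placeholders, not taken from this commit:

```bash
# Creating a release fires the renamed `publish` job, because the
# workflow listens for release "created" events.
# Tag and notes below are illustrative placeholders.
gh release create v0.1.0 --title "v0.1.0" --notes "Publish to PyPI"
```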

README.md

Lines changed: 38 additions & 41 deletions
````diff
@@ -8,24 +8,29 @@
 
 # Ollama MCP Bridge
 
+[![PyPI - Python Version](https://img.shields.io/pypi/v/ollama-mcp-bridge?label=ollama-mcp-bridge-pypi)](https://pypi.org/project/ollama-mcp-bridge/)
 [![Tests](https://github.com/jonigl/ollama-mcp-bridge/actions/workflows/test.yml/badge.svg)](https://github.com/jonigl/ollama-mcp-bridge/actions/workflows/test.yml)
+[![Test Publish](https://github.com/jonigl/ollama-mcp-bridge/actions/workflows/test-publish.yml/badge.svg)](https://github.com/jonigl/ollama-mcp-bridge/actions/workflows/test-publish.yml)
+[![Publish](https://github.com/jonigl/ollama-mcp-bridge/actions/workflows/publish.yml/badge.svg)](https://github.com/jonigl/ollama-mcp-bridge/actions/workflows/publish.yml)
 [![Python 3.10+](https://img.shields.io/badge/Python-3.10+-blue.svg)](https://www.python.org/downloads/)
 ![License](https://img.shields.io/badge/License-MIT-green.svg)
 
+
 ## Features
 
-- 🏗️ **Modular Architecture**: Clean separation into CLI, API, and MCP management modules
 - 🚀 **Pre-loaded Servers**: All MCP servers are connected at startup from JSON configuration
+- 📝 **JSON Configuration**: Configure multiple servers with complex commands and environments
+- 🔗 **Tool Integration**: Automatic tool call processing and response integration
 - 🛠️ **All Tools Available**: Ollama can use any tool from any connected server simultaneously
 - 🔄 **Complete API Compatibility**: `/api/chat` adds tools while all other Ollama API endpoints are transparently proxied
-- ⚡️ **FastAPI Backend**: Modern async API with automatic documentation
-- 💻 **Typer CLI**: Clean command-line interface with configurable options
-- 📊 **Structured Logging**: Uses loguru for comprehensive logging
 - 🔧 **Configurable Ollama**: Specify custom Ollama server URL via CLI
-- 🔗 **Tool Integration**: Automatic tool call processing and response integration
-- 📝 **JSON Configuration**: Configure multiple servers with complex commands and environments
 - 🌊 **Streaming Responses**: Supports incremental streaming of responses to clients
 - 🤔 **Thinking Mode**: Proxies intermediate "thinking" messages from Ollama and MCP tools
+- ⚡️ **FastAPI Backend**: Modern async API with automatic documentation
+- 🏗️ **Modular Architecture**: Clean separation into CLI, API, and MCP management modules
+- 💻 **Typer CLI**: Clean command-line interface with configurable options
+- 📊 **Structured Logging**: Uses loguru for comprehensive logging
+- 📦 **PyPI Package**: Easily installable via pip or uv from PyPI
 
 
 ## Requirements
````

````diff
@@ -36,6 +41,21 @@
 
 ## Installation
 
+You can install `ollama-mcp-bridge` in two main ways:
+
+### Quick Start
+Install instantly with [uvx](https://github.com/astral-sh/uv):
+```bash
+uvx ollama-mcp-bridge
+```
+
+### Or, install from PyPI with pip
+```bash
+pip install --upgrade ollama-mcp-bridge
+```
+
+### Or, install from source
+
 ```bash
 # Clone the repository
 git clone https://github.com/jonigl/ollama-mcp-bridge.git
````

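After installing by any of the routes added above, a quick sanity check is possible. The README's feature list mentions a Typer CLI, and Typer apps expose a standard `--help` flag, so the following sketch should print the available options (assuming the entry point is on your PATH):

```bash
# Verify the install and list CLI options (--help is standard for Typer apps)
ollama-mcp-bridge --help

# Or run the bridge directly without installing, via uvx
uvx ollama-mcp-bridge
```
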
````diff
@@ -60,17 +80,18 @@ uv tool install --editable .
 ollama-mcp-bridge
 ```
 
-
 ## How It Works
 
 1. **Startup**: All MCP servers defined in the configuration are loaded and connected
 2. **Tool Collection**: Tools from all servers are collected and made available to Ollama
-3. **Chat Completion Request**: When a chat completion request is received:
-   - The request is forwarded to Ollama along with the list of all available tools
-   - If Ollama chooses to invoke any tools, those tool calls are executed through the corresponding MCP servers
+3. **Chat Completion Request (`/api/chat` endpoint only)**: When a chat completion request is received on `/api/chat`:
+   - The request is forwarded to Ollama along with the list of all available tools
+   - If Ollama chooses to invoke any tools, those tool calls are executed through the corresponding MCP servers
   - Tool responses are fed back to Ollama
   - The final response (with tool results integrated) is returned to the client
-4. **Logging**: All operations are logged using loguru for debugging and monitoring
+   - **This is the only endpoint where MCP server tools are integrated.**
+4. **Other Endpoints**: All other endpoints (except `/api/chat` and `/health`) are fully proxied to the underlying Ollama server with no modification.
+5. **Logging**: All operations are logged using loguru for debugging and monitoring
 
 ## Configuration
 
````

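To make the reworded step 3 concrete: a hedged example of calling the one tool-integrated endpoint once the bridge is running on its default port. The model name and prompt are illustrative assumptions; the payload shape follows the standard Ollama `/api/chat` schema:

```bash
# Only /api/chat gets MCP tool integration; all other endpoints are proxied.
# "llama3.2" and the prompt are placeholders; use any model you have pulled.
curl -X POST http://localhost:8000/api/chat \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "What time is it in Tokyo?"}],
    "stream": false
  }'
```
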
````diff
@@ -148,14 +169,15 @@ The API is available at `http://localhost:8000`.
 - **Swagger UI docs:** [http://localhost:8000/docs](http://localhost:8000/docs)
 - **Ollama-compatible endpoints:**
   - `POST /api/chat` — Chat endpoint (same as Ollama API, but with MCP tool support)
+    - **This is the only endpoint where MCP server tools are integrated.** All tool calls are handled and responses are merged transparently for the client.
+  - **All other endpoints** (except `/api/chat` and `/health`) are fully proxied to the underlying Ollama server with no modification. You can use your existing Ollama clients and libraries as usual.
+- **Health check:**
+  - `GET /health` — Specific to `ollama-mcp-bridge` (not proxied)
 
 > [!IMPORTANT]
-> **All other standard Ollama endpoints are also transparently proxied by the bridge.**
-
-- **Health check:**
-  - `GET /health`
+> `/api/chat` is the only endpoint with MCP tool integration. All other endpoints are transparently proxied to Ollama. `/health` is specific to the bridge.
 
-This bridge acts as a drop-in proxy for the Ollama API, but with all MCP tools from all connected servers available to every request. You can use your existing Ollama clients and libraries, just point them to this bridge instead of your Ollama server.
+This bridge acts as a drop-in proxy for the Ollama API, but with all MCP tools from all connected servers available to every `/api/chat` request. You can use your existing Ollama clients and libraries, just point them to this bridge instead of your Ollama server.
 
 ### Example: Chat
 ```bash
````

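The newly documented health endpoint can be checked the same way. Per the diff it is served by the bridge itself rather than proxied, and the diff does not specify a response body, so none is assumed here:

```bash
# /health is specific to the bridge (not proxied to Ollama)
curl http://localhost:8000/health
```
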
````diff
@@ -186,27 +208,6 @@ curl -N -X POST http://localhost:8000/api/chat \
 > [!TIP]
 > Use `/docs` for interactive API exploration and testing.
 
-## Architecture
-
-The application is structured into three main modules:
-
-### `main.py` - CLI Entry Point
-- Uses Typer for command-line interface
-- Handles configuration and server startup
-- Passes configuration to FastAPI app
-
-### `api.py` - FastAPI Application
-- Defines API endpoints (`/api/chat`, `/health`)
-- Manages application lifespan (startup/shutdown)
-- Handles HTTP request/response processing
-
-### `mcp_manager.py` - MCP Management
-- Loads and manages MCP servers
-- Collects and exposes all available tools
-- Handles tool calls and integrates results into Ollama responses
-
-### `utils.py` - Utility Functions
-- NDJSON parsing, health checks, and other helper functions
 
 ## Development
 
````

````diff
@@ -264,10 +265,6 @@ curl -X POST "http://localhost:8000/api/chat" \
 > [!NOTE]
 > Tests require the server to be running on localhost:8000. Make sure to start the server before running pytest.
 
-
-
-This creates a seamless experience where Ollama can use any tool from any connected MCP server without the client needing to know about the underlying MCP infrastructure.
-
 ## Inspiration and Credits
 
 This project is based on the basic MCP client from my Medium article: [Build an MCP Client in Minutes: Local AI Agents Just Got Real](https://medium.com/@jonigl/build-an-mcp-client-in-minutes-local-ai-agents-just-got-real-a10e186a560f).
````
