
Commit a05828c

docs: updating main README.md (#13)
1 parent ab5355d commit a05828c

File tree

1 file changed: +2 −5 lines changed


README.md

Lines changed: 2 additions & 5 deletions
````diff
@@ -37,7 +37,7 @@
 
 - Python >= 3.10.15
 - Ollama server running (local or remote)
-- MCP server scripts configured in `mcp-servers-config/mcp-config.json`
+- MCP server configuration file with at least one MCP server defined (see below for example)
 
 ## Installation
 
````
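For context, the configuration file the new wording refers to typically follows the standard `mcpServers` JSON shape used by MCP clients. A minimal sketch with one server defined (the server name, command, and arguments below are hypothetical illustrations, not taken from this repository):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "uvx",
      "args": ["mcp-server-filesystem", "/tmp"]
    }
  }
}
```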

````diff
@@ -148,10 +148,7 @@ ollama-mcp-bridge --config custom.json --host 0.0.0.0 --port 8080 --ollama-url h
 ```
 
 > [!TIP]
-> If installing with `uv`, you can run the bridge directly using:
-> ```bash
-> ollama-mcp-bridge --config /path/to/custom-config.json --host 0.0.0.0 --port 8080 --ollama-url http://remote-ollama:11434
-> ```
+> If using `uvx` to run the bridge, you have to specify the command as `uvx ollama-mcp-bridge` instead of just `ollama-mcp-bridge`.
 
 > [!NOTE]
 > This bridge supports both streaming responses and thinking mode. You receive incremental responses as they are generated, with tool calls and intermediate thinking messages automatically proxied between Ollama and all connected MCP tools.
````
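The replacement tip is terse; as a sketch, the two invocation styles it contrasts would look like this (flags and URL are copied from the example removed in this commit; that `uvx` is available on your PATH is an assumption):

```bash
# Installed into the current environment (e.g. via uv or pip):
ollama-mcp-bridge --config /path/to/custom-config.json --host 0.0.0.0 --port 8080 --ollama-url http://remote-ollama:11434

# Run ad hoc through uvx -- the package name must be spelled out explicitly:
uvx ollama-mcp-bridge --config /path/to/custom-config.json --host 0.0.0.0 --port 8080 --ollama-url http://remote-ollama:11434
```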
