
Adds Ollama service#250

Open
RychidM wants to merge 3 commits into tailscale-dev:main from RychidM:add-ollama-service

Conversation


@RychidM RychidM commented Apr 7, 2026

Pull Request Title: Add Ollama Service

Description

Adds a Tailscale sidecar configuration for Ollama, a tool for running large language models (LLMs) locally. This lets users access their local models securely from any device on their Tailnet — including phones and remote machines — without exposing the Ollama API to the public internet.

Includes:

  • compose.yaml following the ScaleTail sidecar pattern (network_mode: service:tailscale, health checks, depends_on with health condition)
  • .env template with SERVICE, IMAGE_URL, TS_AUTHKEY, and an optional OLLAMA_API_KEY
  • README.md covering prerequisites, volumes, MagicDNS/HTTPS setup, optional LAN port exposure, and first-run model pull instructions
  • Optional yourNetwork external network on the Tailscale container, enabling other containers (e.g. Open WebUI) to reach Ollama via inter-container networking
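The sidecar pattern named in the bullets above can be sketched roughly as follows. This is an illustrative assumption, not the PR's actual compose.yaml: service names, volume paths, image tags, and healthcheck details are placeholders, while `network_mode: service:tailscale` and the `depends_on` health condition are the pattern the PR describes.

```yaml
# Sketch of the ScaleTail sidecar pattern (illustrative, not the PR's exact file)
services:
  tailscale:
    image: tailscale/tailscale:latest
    container_name: tailscale-${SERVICE}
    hostname: ${SERVICE}
    environment:
      - TS_AUTHKEY=${TS_AUTHKEY}
      - TS_STATE_DIR=/var/lib/tailscale
    volumes:
      - ./ts/state:/var/lib/tailscale
    healthcheck:
      test: ["CMD", "tailscale", "status"]
      interval: 10s
      retries: 5

  ollama:
    image: ${IMAGE_URL}
    container_name: app-${SERVICE}
    network_mode: service:tailscale   # share the Tailscale container's network namespace
    volumes:
      - ./ollama-data:/root/.ollama   # model storage (models can be several GB)
    depends_on:
      tailscale:
        condition: service_healthy    # start Ollama only after the sidecar is healthy
```

With this layout, Ollama has no published ports of its own; it is reachable only through the Tailscale container's network identity.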

Related Issues

N/A

Type of Change

  • Bug fix
  • New feature
  • Documentation update
  • Refactoring

How Has This Been Tested?

  1. Ran docker compose config — no errors or missing variable warnings.
  2. Pre-created bind-mount directories (config, ts/state, ollama-data) and started the stack with docker compose up -d.
  3. Confirmed both containers reached healthy status via docker compose ps.
  4. Verified Tailscale node registration and Tailnet IP with docker exec tailscale-ollama tailscale status.
  5. Pulled tinyllama and sent a test generation request via curl to the Tailnet IP on port 11434 — received a valid JSON response.
  6. Repeated the curl test from a second device on the same Tailnet to confirm remote access works end-to-end.
  7. Verified tailscale-ollama appears in docker network inspect yourNetwork.
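The test steps above correspond roughly to this command sequence. The Tailnet IP `100.x.y.z` is a placeholder, and `/api/generate` is Ollama's standard generation endpoint; the container name `tailscale-ollama` is taken from step 4.

```shell
docker compose config                           # 1. validate interpolation, no warnings
mkdir -p config ts/state ollama-data            # 2. pre-create bind-mount directories
docker compose up -d
docker compose ps                               # 3. both containers report "healthy"
docker exec tailscale-ollama tailscale status   # 4. node registered; note its 100.x.y.z IP
curl http://100.x.y.z:11434/api/generate \
  -d '{"model": "tinyllama", "prompt": "Hello", "stream": false}'   # 5. valid JSON response
```

Repeating the final `curl` from a second Tailnet device (step 6) exercises the full remote path.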

Checklist

  • I have performed a self-review of my code
  • I have added tests that prove my fix or feature works
  • I have updated necessary documentation (e.g. frontpage README.md)
  • Any dependent changes have been merged and published in downstream modules

Screenshots (if applicable)

N/A — no visual UI changes.

Additional Notes

  • The yourNetwork network is optional. Users who don't need inter-container communication can remove the networks: sections from compose.yaml entirely.
  • Ollama models can be large (several GB each). The README notes this under the Volumes section so users are aware before first pull.
  • OLLAMA_KEEP_ALIVE is set to 24h by default to keep models warm; users can adjust or remove this to suit their hardware.
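For reference, `OLLAMA_KEEP_ALIVE` would sit alongside the other environment variables in the Ollama service; the placement here is assumed, not quoted from the PR:

```yaml
environment:
  - OLLAMA_KEEP_ALIVE=24h # keep loaded models in memory; lower (e.g. 5m) or remove on constrained hardware
```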

@RychidM RychidM changed the title from "Adds Ollama service configuration with Docker Compose" to "Adds Ollama service" Apr 9, 2026
@jackspiering jackspiering added the "new service" label (request to add a new service) Apr 11, 2026

@jackspiering jackspiering left a comment


Please comment the requested lines out. After that, it can be merged.

Here is a screenshot of the working container 💯

Image

TS_AUTHKEY= # Auth key from https://tailscale.com/admin/authkeys. See: https://tailscale.com/kb/1085/auth-keys#generate-an-auth-key for instructions.

# Ollama-specific variables
OLLAMA_API_KEY= # Optional: set a secret key to restrict API access (leave blank to disable auth)

Please comment this line out so users can uncomment it if necessary.
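Assuming the requested line is `OLLAMA_API_KEY` (the inline anchor isn't preserved in this export), the commented-out form would look like:

```shell
# Ollama-specific variables
# OLLAMA_API_KEY= # Optional: uncomment and set a secret key to restrict API access
```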

Comment on lines +44 to +45
networks:
  - yourNetwork # Optional: connect to an existing proxy network so other containers can reach Ollama via its Tailscale IP

Please comment this line out so users can uncomment it if necessary.
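The requested change would presumably leave the service-level list in this shape:

```yaml
# networks:
#   - yourNetwork # Optional: uncomment to connect to an existing proxy network
```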

container_name: app-${SERVICE} # Name for local container management
environment:
  - OLLAMA_HOST=0.0.0.0:11434
  - OLLAMA_API_KEY=${OLLAMA_API_KEY} # Optional: set an API key to restrict access

Please comment this line out so users can uncomment it if necessary.
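Presumably the `OLLAMA_API_KEY` line is meant; commented out, the block would read:

```yaml
environment:
  - OLLAMA_HOST=0.0.0.0:11434
  # - OLLAMA_API_KEY=${OLLAMA_API_KEY} # Optional: uncomment to restrict access with an API key
```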

Comment on lines +76 to +78
networks:
  yourNetwork:
    external: true # Assumes an existing external Docker network named "yourNetwork"

Please comment this line out so users can uncomment it if necessary.
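The requested form of the top-level network definition would presumably be:

```yaml
# networks:
#   yourNetwork:
#     external: true # Uncomment if you use an existing external network named "yourNetwork"
```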
