End-to-end documentation to set up your own local & fully private LLM server on Debian. Equipped with chat, web search, RAG, model management, MCP servers, image generation, and TTS.
Updated Jan 30, 2026
A unified Model Context Protocol server implementation that aggregates multiple MCP servers into one.
A centralized reverse-proxy platform for MCP servers — manage, group, and export as Skills from a single endpoint.
Authentication, analytics, and prompt visibility for MCP servers with zero code changes. Supports OAuth 2.1, DCR, real-time logs, and client onboarding out of the box.
This server acts as a central hub for Model Context Protocol (MCP) resource servers.
Supercharge AI Agents, Safely
An MCP OAuth proxy with dynamic client registration (DCR), MCP prompt analytics, and an MCP firewall for building enterprise-grade MCP servers.
A centralized platform for managing and connecting MCP servers. MCP Center provides a high-performance proxy service that enables seamless communication between MCP clients and multiple MCP servers.
A comprehensive platform for managing and proxying Model Context Protocol (MCP) servers, providing scalable AI service orchestration across multiple microservices.
🚀 Enterprise-grade API Gateway for MCP Protocol, built with Java Spring Boot. Supports authentication, proxy, and traffic management for AI tool calling.
A universal MCP client with a proxying feature for interacting with MCP servers that support the STDIO transport.
An AI chat proxy with universal tool access, protocol conversion, load balancing, key isolation, prompt enhancement, a centralized MCP hub, and built-in WebSearch & WebFetch: more than an AI assistant, it also covers chat, translation, mind maps, flowcharts, and search.
MCP tool management and workflow proxy
Use as many MCP servers as you want while minimizing context usage. A code-mode MCP server gateway driven by Lua 🌙
Automatically generate an MCP Server from existing source code, service classes, helper methods, and external MCP tools. The MCP Mediator aggregates various sources and tools into a unified system, enabling seamless automatic generation of a complete MCP Server.
Preloop is the safety layer for AI agents: MCP firewall, human approvals, and event-driven flows.
Access control for AI agents. MCP proxy with RBAC, CEL policies, and full audit trail.
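Many of the projects above share one core pattern: aggregating several MCP servers behind a single proxy endpoint and namespacing their tools. The sketch below illustrates that pattern in plain Python; all class and method names are illustrative assumptions, not the API of any project listed here.

```python
# Hypothetical sketch of the MCP-aggregation pattern: several tool
# servers exposed through one proxy with "server.tool" namespacing.
# No real project's API is used; names are illustrative only.

class ToolServer:
    """Stand-in for a single MCP server exposing named tools."""
    def __init__(self, name, tools):
        self.name = name    # server identifier, e.g. "search"
        self.tools = tools  # dict: tool name -> callable

class AggregatingProxy:
    """Routes namespaced tool calls ("server.tool") to the right backend."""
    def __init__(self):
        self.servers = {}

    def register(self, server):
        self.servers[server.name] = server

    def list_tools(self):
        # Flatten every backend's tools under a "server.tool" namespace,
        # so clients see one unified tool surface.
        return [f"{s.name}.{t}" for s in self.servers.values() for t in s.tools]

    def call(self, qualified_name, **kwargs):
        # Split "math.add" into backend "math" and tool "add", then dispatch.
        server_name, _, tool_name = qualified_name.partition(".")
        return self.servers[server_name].tools[tool_name](**kwargs)

# Usage: two backends appear to the client as one endpoint.
proxy = AggregatingProxy()
proxy.register(ToolServer("math", {"add": lambda a, b: a + b}))
proxy.register(ToolServer("text", {"upper": lambda s: s.upper()}))
print(proxy.list_tools())                # ['math.add', 'text.upper']
print(proxy.call("math.add", a=2, b=3))  # 5
```

Namespacing is what lets these gateways avoid tool-name collisions across backends while presenting clients with a single catalog.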