# Simple LLM API

A simple and easy-to-use Python wrapper for popular LLM APIs (OpenAI, Anthropic, and more).

## Installation

uv is recommended for managing and installing packages in isolated environments:
```bash
uv add simple-llm-api
```

You can also install it using pip:

```bash
pip install simple-llm-api
```

## Features

- 🎯 Simple and consistent interface for multiple LLM providers
- 🤖 Support for OpenAI, Anthropic, Google Gemini, Mistral, DeepSeek, and local LLMs
- 🏠 Local LLM support (run locally hosted models on your own machine)
- 🚀 Easy to use with minimal configuration
- ⚙️ Customizable parameters for each provider
- 🔧 `**kwargs` support for additional API parameters
## Usage

### OpenAI

```python
from simple_llm_api import OpenAIAPI

openai = OpenAIAPI("YOUR_API_KEY")
response = openai.simple_request("Hi!")
print(response)
```
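To avoid hardcoding credentials, the key can also be read from the environment. This is a minimal sketch; the variable name `OPENAI_API_KEY` is a common convention, not something the library requires:

```python
import os

from simple_llm_api import OpenAIAPI

# Read the key from the environment rather than committing it to source control.
# OPENAI_API_KEY is an assumed variable name; use whatever your setup defines.
openai = OpenAIAPI(os.environ["OPENAI_API_KEY"])
print(openai.simple_request("Hi!"))
```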
### Anthropic

```python
from simple_llm_api import AnthropicAPI

anthropic = AnthropicAPI("YOUR_API_KEY")
response = anthropic.simple_request("Hi!")
print(response)
```

### Google Gemini

```python
from simple_llm_api import GeminiAPI

gemini = GeminiAPI("YOUR_API_KEY")
response = gemini.simple_request("Hi!")
print(response)
```

### Mistral

```python
from simple_llm_api import MistralAPI

mistral = MistralAPI("YOUR_API_KEY")
response = mistral.simple_request("Hi!")
print(response)
```

### DeepSeek

```python
from simple_llm_api import DeepSeekAPI

deepseek = DeepSeekAPI("YOUR_API_KEY")
response = deepseek.simple_request("Hi!")
print(response)
```

### Local LLMs

Use locally hosted models that expose an OpenAI-compatible API (such as LM Studio or Ollama):

```python
from simple_llm_api import OpenAIAPI
openai = OpenAIAPI(model="MODEL_NAME")
openai._openai_endpoint = "http://localhost:8080/v1/chat/completions"
response = openai.simple_request("Hi!")
print(response)
```
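As a concrete case, Ollama serves an OpenAI-compatible endpoint on port 11434 by default, so pointing the wrapper at it should look roughly like this (the model name is an assumption about what you have pulled locally):

```python
from simple_llm_api import OpenAIAPI

# Ollama's OpenAI-compatible chat endpoint; 11434 is its default port.
# "llama3.2" is a placeholder for whichever model you have pulled.
local = OpenAIAPI(model="llama3.2")
local._openai_endpoint = "http://localhost:11434/v1/chat/completions"
print(local.simple_request("Hi!"))
```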
## Parameters

Each API wrapper supports various parameters for customizing the response, plus `**kwargs` for additional API-specific parameters (see the example after the listings below):

### OpenAI

```python
openai.simple_request(
    user_prompt="Your prompt here",
    system_prompt="Custom system prompt",
    temperature=1,
    top_p=1,
    max_completion_tokens=2048
)
```

### Anthropic

```python
anthropic.simple_request(
    user_prompt="Your prompt here",
    system_prompt="Custom system prompt",
    temperature=1,
    max_tokens=2048
)
```

### Google Gemini

```python
gemini.simple_request(
    user_prompt="Your prompt here",
    system_prompt="Custom system prompt",
    temperature=1,
    top_k=40,
    top_p=0.95,
    max_output_tokens=2048
)
```

### Mistral

```python
mistral.simple_request(
    user_prompt="Your prompt here",
    system_prompt="Custom system prompt",
    temperature=0.7,
    top_p=1,
    max_tokens=2048
)
```

### DeepSeek

```python
deepseek.simple_request(
    user_prompt="Your prompt here",
    system_prompt="Custom system prompt",
    temperature=1,
    top_p=1,
    max_tokens=2048
)
```
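Anything extra passed as a keyword argument is forwarded to the underlying API. A minimal sketch, assuming the extra names are parameters the provider actually accepts (`frequency_penalty` and `stop` are standard OpenAI chat-completion parameters):

```python
# Extra keyword arguments are passed through to the provider's API unchanged.
response = openai.simple_request(
    "Hi!",
    frequency_penalty=0.5,  # OpenAI chat-completions parameter
    stop=["\n\n"],          # stop sequences, also provider-specific
)
print(response)
```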
## Error Handling

The library includes custom exceptions for each API:
- `OpenAIError`: OpenAIAPI Error
- `AnthropicError`: AnthropicAPI Error
- `GeminiError`: GeminiAPI Error
- `MistralError`: MistralAPI Error
- `DeepSeekError`: DeepSeekAPI Error
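For illustration, catching a provider-specific error might look like this (this sketch assumes the exception classes are importable from the package root, which the README does not spell out):

```python
from simple_llm_api import OpenAIAPI, OpenAIError  # import path assumed

openai = OpenAIAPI("YOUR_API_KEY")
try:
    print(openai.simple_request("Hi!"))
except OpenAIError as err:
    # Raised by the wrapper when the OpenAI request fails.
    print(f"OpenAI request failed: {err}")
```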
## Disclaimer

This software is provided "as is", without warranty of any kind. The authors are not responsible for any problems arising from its use.

This library connects to third-party LLM APIs (OpenAI, Anthropic, Google Gemini, Mistral, and DeepSeek). You are responsible for complying with each provider's terms of service and for any costs you incur.

You are responsible for how you use this software and what you do with its output.

By using this software, you accept these terms.
## License

This project is licensed under the MIT License.