A lightweight API proxy service that reveals DeepSeek model's complete thinking process in Chatbox.
SiliconFlow recently upgraded their API to match DeepSeek's official format, splitting responses into two fields:
- `reasoning_content`: the model's thinking process
- `content`: the final response

Because Chatbox only renders `content`, the model's reasoning was no longer displayed. This project merges the two fields through API forwarding, so you can follow the model's complete thinking process.
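The core idea can be sketched in a few lines; the function name below is illustrative, and the actual server.py may separate or format the two fields differently:

```python
def merge_delta(delta: dict) -> str:
    """Combine a streaming delta's reasoning_content and content
    into one text fragment that Chatbox can render."""
    parts = []
    reasoning = delta.get("reasoning_content")
    if reasoning:
        parts.append(reasoning)  # thinking text, normally hidden by Chatbox
    content = delta.get("content")
    if content:
        parts.append(content)  # final-answer text
    return "".join(parts)

# Example deltas as they might arrive from the upstream API:
chunks = [
    {"reasoning_content": "Let me think... "},
    {"content": "Hello!"},
]
merged = "".join(merge_delta(d) for d in chunks)
# merged == "Let me think... Hello!"
```

In a streaming proxy this transformation is applied to each chunk as it passes through, which is why the forwarding adds no buffering delay.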
- Open Chatbox and add a new chat model
- Fill in the configuration:
  - Name: any (e.g., DeepSeek-R1)
  - Model: `deepseek-ai/DeepSeek-R1`
  - API URL: `https://deepseek2chatbox.dawne.cn/v1`
  - API Key: your SiliconFlow API key (starts with `sk-`)
- Clone the repository
- Install dependencies:

  ```bash
  pip install flask requests
  ```

- Start the service:

  ```bash
  python server.py
  ```
- Deploy to a server and add SSL
- Use in Chatbox:

  `http://<your-domain>:5000/v1`
You'll see output like this:
> The user sent a simple greeting. Let me think about how to respond: I should be friendly while maintaining professionalism. Adding appropriate emojis can increase approachability. The response should be concise yet warm, and leave room for further conversation.
Hello! Nice to meet you 👋 How may I help you today?
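For a non-streaming response, the same merge applies to the whole message object; a minimal sketch (field names follow the upstream format described above; the blank-line separator between thinking and answer is an assumption):

```python
def merge_message(message: dict) -> str:
    """Join the model's thinking process and final answer into one string."""
    reasoning = message.get("reasoning_content") or ""
    content = message.get("content") or ""
    if reasoning and content:
        # Separate the two parts so the reasoning reads as its own paragraph.
        return f"{reasoning}\n\n{content}"
    return reasoning or content

message = {
    "reasoning_content": "The user sent a simple greeting. ...",
    "content": "Hello! Nice to meet you 👋",
}
print(merge_message(message))
```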
- 🔄 Real-time Forwarding: streams responses through as they arrive, with no extra buffering
- 🧠 Visible Thinking: AI's reasoning process is no longer a black box
- 🎯 Plug and Play: Fully compatible with Chatbox
- 🪶 Lightweight: No complex configuration needed
Having issues? Feel free to submit an Issue or contact the author.