This tutorial shows how to deploy Dynasor locally with Ollama and DeepSeek-R1.
Install Ollama, then pull and start a DeepSeek-R1 model:

```bash
ollama run deepseek-r1
```
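Ollama also exposes an OpenAI-compatible API on port 11434 by default, which is what the Dynasor proxy below will talk to. As a quick sanity check that the model is being served, you can query its `/v1/models` endpoint; this is a stdlib-only sketch, and the `model_names`/`list_served_models` helpers are illustrative, not part of Dynasor:

```python
import json
import urllib.request

# Ollama serves an OpenAI-compatible API here by default (port 11434).
OLLAMA_URL = "http://localhost:11434/v1"

def model_names(models_json):
    """Extract model ids from an OpenAI-style /v1/models response."""
    return [m["id"] for m in models_json["data"]]

def list_served_models(base_url=OLLAMA_URL):
    """Ask the running Ollama server which models it is serving."""
    with urllib.request.urlopen(base_url + "/models") as resp:
        return model_names(json.load(resp))

# With `ollama run deepseek-r1` active, list_served_models() should
# include a deepseek-r1 entry.
```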
Install Dynasor from source:

```bash
git clone https://github.com/hao-ai-lab/Dynasor.git
cd Dynasor
pip install .
```
We also provide `dynasor-openai`, a proxy server for Dynasor that works with any OpenAI-compatible API endpoint. Point it at the Ollama server:

```bash
dynasor-openai --base-url http://localhost:11434/v1 --model deepseek-r1 --port 8001
```
You can interact with the proxy server by running `dynasor-chat` with the base URL `http://localhost:8001/v1`:

```bash
dynasor-chat --base-url http://localhost:8001/v1
```
Or simply run one of our example scripts to verify the proxy server is working:

```bash
python examples/client.py --prompt "2+2=?" --base-url http://localhost:8001/v1
```
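Because the proxy speaks the OpenAI chat-completions protocol, any OpenAI-style client can talk to it. Here is a minimal stdlib-only sketch of such a client; the `ask` and `extract_reply` helpers are illustrative and not part of Dynasor:

```python
import json
import urllib.request

PROXY_URL = "http://localhost:8001/v1"  # the dynasor-openai proxy above

def extract_reply(response_json):
    """Pull the assistant message out of an OpenAI-style chat response."""
    return response_json["choices"][0]["message"]["content"]

def ask(prompt, model="deepseek-r1", base_url=PROXY_URL):
    """Send one chat-completion request to the proxy and return the reply."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    req = urllib.request.Request(
        base_url + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_reply(json.load(resp))

# With the proxy running, ask("2+2=?") returns the model's answer.
```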
Install Open WebUI and run the server:

```bash
pip install open-webui
open-webui serve
```
Then follow these instructions to add the proxy as a custom API connection in Open WebUI:
- URL: `http://localhost:8001/v1` (the base URL of the proxy server)
- Key: `EMPTY` (optional)
- Prefix ID: `dynasor` (optional)