# Run Dynasor locally with ollama + OpenWebUI

This tutorial shows how to deploy Dynasor locally with ollama and deepseek-r1.

## Install ollama

Install ollama and pull a deepseek-r1 model:

```shell
ollama run deepseek-r1
```
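Before moving on, you can confirm that ollama is serving and the model was pulled by querying its `/api/tags` endpoint, which lists locally available models. A minimal stdlib sketch (the default port `11434` and the helper name are assumptions):

```python
import json
import urllib.request

# ollama's default local endpoint (assumption: you did not change the port).
OLLAMA_URL = "http://localhost:11434"

def build_tags_request(base_url=OLLAMA_URL):
    """Build a GET request for ollama's /api/tags endpoint (lists pulled models)."""
    return urllib.request.Request(f"{base_url}/api/tags")

if __name__ == "__main__":
    with urllib.request.urlopen(build_tags_request()) as resp:
        models = [m["name"] for m in json.load(resp)["models"]]
        # True once `ollama run deepseek-r1` has finished pulling the model
        print(any(name.startswith("deepseek-r1") for name in models))
```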

## Install Dynasor

```shell
git clone https://github.com/hao-ai-lab/Dynasor.git
cd Dynasor
pip install .
```

## Run dynasor-openai

We also provide `dynasor-openai`, a proxy server for Dynasor that is compatible with the OpenAI API. Point it at the local ollama endpoint:

```shell
dynasor-openai --base-url http://localhost:11434/v1 --model deepseek-r1 --port 8001
```

You can then interact with the proxy server by running `dynasor-chat` against the base URL `http://localhost:8001/v1`:

```shell
dynasor-chat --base-url http://localhost:8001/v1
```

Or simply run one of our example scripts to verify that the proxy server is working:

```shell
python examples/client.py --prompt "2+2=?" --base-url http://localhost:8001/v1
```
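Because the proxy speaks the OpenAI API, any OpenAI-compatible client can talk to it. The stdlib sketch below builds a `/v1/chat/completions` request against the proxy; the helper name and payload shape are illustrative, and the dummy `EMPTY` key mirrors the optional key used in the Open WebUI setup below:

```python
import json
import urllib.request

# Base URL of the dynasor-openai proxy started above
# (assumption: default --port 8001 from this tutorial).
BASE_URL = "http://localhost:8001/v1"

def build_chat_request(prompt, model="deepseek-r1", base_url=BASE_URL):
    """Build an OpenAI-style chat completion request for the proxy."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer EMPTY",  # the proxy does not require a real key
        },
    )
    return req, payload

if __name__ == "__main__":
    req, _ = build_chat_request("2+2=?")
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
        print(body["choices"][0]["message"]["content"])
```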

## Use Dynasor with OpenWebUI

Install Open WebUI and run the server:

```shell
pip install open-webui
open-webui serve
```

Then add the proxy server as a custom API connection in Open WebUI:

- **URL**: `http://localhost:8001/v1` (the base URL of the proxy server)
- **Key**: `EMPTY` (optional)
- **Prefix ID**: `dynasor` (optional)