
Using Python to call the open-interpreter library raises litellm.exceptions.BadRequestError #1603


Open
jqsl2012 opened this issue Mar 16, 2025 · 0 comments


Describe the bug

  1. The interpreter command-line mode succeeded (with both OpenAI and non-OpenAI models).
  2. Python mode failed: it succeeded with the OpenAI model, but a litellm.exceptions.BadRequestError always occurred with other models.

I tried both. My input was very simple: What's 34/24?

Reproduce

  1. For both command-line and Python mode, I followed Does it support Qwen series hosted model? #1572, but with stream=True (see the sketch below).
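
The Python-mode call is roughly the sketch below, following the configuration style from #1572; the model name, api_base, and api_key here are placeholders rather than my exact values:

```python
from interpreter import interpreter

# Point open-interpreter at a non-OpenAI model; requests are routed
# through litellm. All three values are illustrative placeholders.
interpreter.llm.model = "deepseek/deepseek-chat"
interpreter.llm.api_base = "https://example-gateway/v1"
interpreter.llm.api_key = "sk-..."

# stream=True yields incremental chunks instead of one final message list.
for chunk in interpreter.chat("What's 34/24?", stream=True, display=False):
    print(chunk)
```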

Expected behavior

I expect the Python-mode request to succeed, just as it does on the command line.

Screenshots

Command-line mode requesting a non-OpenAI model, which succeeded:

[Screenshot: command-line mode request to a non-OpenAI model succeeding]

A Python-mode request to a non-OpenAI model failed with:

litellm.llms.openai.common_utils.OpenAIError: {"error":{"message":"deepseek-chat is not a valid model ID","code":400},"user_id":"user_2dzIMQc2DXdaKhwdKz81nPS4sqH"}

litellm.exceptions.BadRequestError: litellm.BadRequestError: DeepseekException - {"error":{"message":"deepseek-chat is not a valid model ID","code":400},"user_id":"user_2dzIMQc2DXdaKhwdKz81nPS4sqH"}
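
Since open-interpreter delegates the request to litellm, the failure is presumably equivalent to a direct litellm call along these lines (again, all values are placeholders):

```python
import litellm

# Hypothetical direct reproduction of the request open-interpreter
# makes internally; model, endpoint, and key are placeholders.
response = litellm.completion(
    model="deepseek/deepseek-chat",
    api_base="https://example-gateway/v1",
    api_key="sk-...",
    messages=[{"role": "user", "content": "What's 34/24?"}],
    stream=True,
)
for chunk in response:
    print(chunk)
```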

Open Interpreter version

0.4.3

Python version

Python 3.9.13

Operating System name and version

CentOS 7

Additional context

No response
