
fix: accept OpenAI-compatible local LLM #1671

Status: Open · wants to merge 2 commits into base: main
17 changes: 16 additions & 1 deletion extensions/llms/openai/pandasai_openai/openai.py
@@ -2,6 +2,7 @@
 from typing import Any, Dict, Optional

 import openai
+import requests

 from pandasai.exceptions import APIKeyNotFoundError, UnsupportedModelError
 from pandasai.helpers import load_dotenv
@@ -81,7 +82,21 @@ def __init__(
             self._is_chat_model = False
             self.client = openai.OpenAI(**self._client_params).completions
         else:
-            raise UnsupportedModelError(self.model)
+            self._is_chat_model = kwargs.get("is_chat_model", True)
Review comment (Contributor): Consider documenting the new is_chat_model kwarg in the __init__ docstring; this makes the default behavior clearer.
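A possible docstring entry (hypothetical wording; the surrounding docstring layout is assumed, not taken from this PR):

    Args:
        **kwargs: Extended parameters inferred from the base OpenAI wrapper.
            is_chat_model (bool, optional): Whether a model that is not in the
                known OpenAI model lists should be called through the chat
                completions endpoint instead of the completions endpoint.
                Defaults to True.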

+            model_names = [
+                model.get("id")
+                for model in requests.get(f"{self.api_base}/models")
Review comment (Contributor): Add error handling (e.g., try/except and a timeout) for the requests.get call so network/API failures are handled gracefully.
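For illustration, a minimal sketch of that suggestion, meant to sit inside __init__ in place of the lookup below (the 10-second timeout and the fallback to an empty list, which then ends in UnsupportedModelError, are assumptions, not part of this PR):

# Hypothetical hardening of the model lookup: bound the request with a timeout
# and treat any network or JSON failure as "no models found".
try:
    response = requests.get(f"{self.api_base}/models", timeout=10)
    response.raise_for_status()
    model_names = [
        model.get("id") for model in response.json().get("data", [])
    ]
except (requests.exceptions.RequestException, ValueError):
    model_names = []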

+                .json()
+                .get("data", [])
+            ]
+            if self.model in model_names:
+                self.client = (
+                    openai.OpenAI(**self._client_params).chat.completions
+                    if self._is_chat_model
+                    else openai.OpenAI(**self._client_params).completions
+                )
+            else:
+                raise UnsupportedModelError(self.model)

     @property
     def _default_params(self) -> Dict[str, Any]:
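For context, this change lets pandasai users point the extension at a local OpenAI-compatible server, since any model listed by GET {api_base}/models is now accepted. A hypothetical usage sketch (the server URL and model name are placeholders, and api_token/api_base are assumed to be accepted keyword arguments, as the diff's use of self.api_base suggests):

from pandasai_openai import OpenAI

# Point the LLM at a local OpenAI-compatible server (e.g. vLLM, LM Studio,
# llama.cpp). The model id must appear in the server's /models listing,
# otherwise UnsupportedModelError is raised.
llm = OpenAI(
    api_token="not-needed",             # local servers usually ignore the key
    api_base="http://localhost:8000/v1",
    model="my-local-model",             # placeholder model id
    is_chat_model=True,                 # new kwarg from this PR; defaults to True
)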