Fixes for OpenAI / Azure OpenAI compatibility with LiteLLM router #1760

Merged
merged 4 commits into stanfordnlp:main on Nov 5, 2024

Conversation

dbczumar (Collaborator) commented Nov 5, 2024

Fixes for OpenAI / Azure OpenAI compatibility with LiteLLM router

Signed-off-by: dbczumar <[email protected]>
Signed-off-by: dbczumar <[email protected]>
Signed-off-by: dbczumar <[email protected]>
@dbczumar requested a review from okhat on November 5, 2024 at 21:44
Comment on lines +268 to +275
if api_config.api_key is not None:
    litellm_params["api_key"] = api_config.api_key
if api_config.api_base is not None:
    litellm_params["api_base"] = api_config.api_base
if api_config.api_version is not None:
    litellm_params["api_version"] = api_config.api_version
if api_config.azure_ad_token is not None:
    litellm_params["azure_ad_token"] = api_config.azure_ad_token
dbczumar (Collaborator, Author)

@okhat For OpenAI and Azure OpenAI, API keys must be defined directly in the Router's `model_list`, due to the initialization logic in https://github.com/BerriAI/litellm/blob/0fe8cde7c78d6b975b936f7802a4124b58bd253a/litellm/router.py#L3999-L4001
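
For context, here is a minimal sketch of what that looks like on the LiteLLM side. The model alias, Azure deployment name, API version, and environment variable names are placeholders for illustration, not values from this PR:

```python
import os

from litellm import Router

# Credentials live inside the model_list entry itself, because the Router's
# initialization reads them from litellm_params rather than from per-call kwargs.
router = Router(
    model_list=[
        {
            "model_name": "gpt-4o",  # alias used when calling the router
            "litellm_params": {
                "model": "azure/my-gpt-4o-deployment",  # hypothetical Azure deployment
                "api_key": os.getenv("AZURE_OPENAI_API_KEY"),
                "api_base": os.getenv("AZURE_OPENAI_API_BASE"),
                "api_version": "2024-02-01",
            },
        }
    ]
)
```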

Comment on lines -245 to -247
# Use the API key and base from the kwargs, or from the environment.
api_key = kwargs.pop("api_key", None) or os.getenv(f"{provider}_API_KEY")
api_base = kwargs.pop("api_base", None) or os.getenv(f"{provider}_API_BASE")
dbczumar (Collaborator, Author)

These are now part of the router definition, so they don't need to be passed to the `text_completion()` API.
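
As an illustration (not the PR's exact code): a caller that previously popped `api_key` / `api_base` out of its kwargs and forwarded them can now rely on the router's `model_list` entry. The `text_completion` method name and the `gpt-4o` alias below are assumptions made for the sketch:

```python
import litellm

def call_text_completion(router: litellm.Router, prompt: str, **kwargs):
    # Previously the caller popped api_key / api_base out of kwargs (or read
    # them from the environment) and forwarded them explicitly. With the
    # credentials embedded in the router's model_list, only the model alias
    # and prompt are needed here.
    return router.text_completion(model="gpt-4o", prompt=prompt, **kwargs)
```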

Signed-off-by: dbczumar <[email protected]>
Comment on lines +187 to +188
Note: The API configurations are removed from the specified `llm_kwargs`, if present, mutating
the input dictionary.
dbczumar (Collaborator, Author)

This avoids duplicating code in `litellm_text_completion`, which pops the API key and base from the user-specified kwargs.
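
To make the mutation behavior concrete, here is a hypothetical sketch of that shared extraction step; the helper and class names are illustrative, not the exact ones added in this PR:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class APIConfig:
    api_key: Optional[str] = None
    api_base: Optional[str] = None
    api_version: Optional[str] = None
    azure_ad_token: Optional[str] = None

def extract_api_config(llm_kwargs: dict) -> APIConfig:
    # Pop the API configuration out of llm_kwargs so the caller does not
    # forward it twice; this intentionally mutates the input dictionary,
    # matching the docstring note above.
    return APIConfig(
        api_key=llm_kwargs.pop("api_key", None),
        api_base=llm_kwargs.pop("api_base", None),
        api_version=llm_kwargs.pop("api_version", None),
        azure_ad_token=llm_kwargs.pop("azure_ad_token", None),
    )
```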

@okhat merged commit d7d6fae into stanfordnlp:main on Nov 5, 2024
4 checks passed