Fixes for OpenAI / Azure OpenAI compatibility with LiteLLM router #1760
Conversation
if api_config.api_key is not None:
    litellm_params["api_key"] = api_config.api_key
if api_config.api_base is not None:
    litellm_params["api_base"] = api_config.api_base
if api_config.api_version is not None:
    litellm_params["api_version"] = api_config.api_version
if api_config.azure_ad_token is not None:
    litellm_params["azure_ad_token"] = api_config.azure_ad_token
@okhat For OpenAI and Azure OpenAI, API keys must be defined directly in the Router's model_list, due to initialization logic in https://github.com/BerriAI/litellm/blob/0fe8cde7c78d6b975b936f7802a4124b58bd253a/litellm/router.py#L3999-L4001
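For illustration, here is a minimal sketch of a router definition with the credentials inlined in `litellm_params` (the model names, keys, and endpoints below are placeholders, not values from this PR):

```python
import litellm

# Sketch only: credentials live inside `litellm_params` for each model_list
# entry, so the Router's initialization logic (linked above) can pick them up
# when it constructs the OpenAI / Azure OpenAI clients.
router = litellm.Router(
    model_list=[
        {
            "model_name": "gpt-4o",  # alias used at call time
            "litellm_params": {
                "model": "azure/my-gpt-4o-deployment",  # hypothetical Azure deployment
                "api_key": "<azure-openai-api-key>",
                "api_base": "https://my-resource.openai.azure.com",
                "api_version": "2024-02-01",
            },
        },
    ],
)

# The call site no longer supplies credentials; the router resolves them
# from model_list.
response = router.completion(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Say hello."}],
)
```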
# Use the API key and base from the kwargs, or from the environment.
api_key = kwargs.pop("api_key", None) or os.getenv(f"{provider}_API_KEY")
api_base = kwargs.pop("api_base", None) or os.getenv(f"{provider}_API_BASE")
These are now part of the router definition, so they don't need to be passed to the text_completion() API.
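As a rough sketch of the simplified call shape (model and prompt are placeholders, and in practice the credentials would be resolved from the router's model_list or the environment rather than from kwargs):

```python
import litellm

# Sketch only: no api_key / api_base kwargs on the call itself.
response = litellm.text_completion(
    model="gpt-3.5-turbo-instruct",
    prompt="Write one sentence about LiteLLM routing.",
)
print(response.choices[0].text)
```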
Note: The API configurations are removed from the specified `llm_kwargs`, if present, mutating the input dictionary.
This avoids duplicating code in litellm_text_completion, which pops the API key and base from the user-specified kwargs.
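For illustration, a hypothetical sketch of such a shared extraction helper (the name and field list are assumed, not taken from this PR):

```python
# Hypothetical helper: pop the API configuration out of llm_kwargs in one
# place, mutating the dict so downstream code never has to pop these keys.
def extract_api_config(llm_kwargs: dict) -> dict:
    api_config = {}
    for field in ("api_key", "api_base", "api_version", "azure_ad_token"):
        value = llm_kwargs.pop(field, None)  # mutates the caller's dict, as the docstring notes
        if value is not None:
            api_config[field] = value
    return api_config


# Usage: the credential kwargs are consumed here and absent downstream.
llm_kwargs = {"api_key": "sk-...", "temperature": 0.7}
config = extract_api_config(llm_kwargs)
assert "api_key" not in llm_kwargs  # only temperature remains
```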