We have not been able to figure out which API key LLM wants.
It seems like you set up the LiteLLM Proxy models under extra-openai-models.yaml. We've successfully gotten it working with all LiteLLM API key checks disabled, but that's where it stops: we cannot get LLM to stop sending DUMMY_KEY. We have set both the litellm and openai API keys using:
llm keys set openai
llm keys set litellm
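
For reference, here is a minimal sketch of the kind of extra-openai-models.yaml entry we are working with; the model IDs and proxy URL below are placeholders, not our exact values:

```yaml
# extra-openai-models.yaml — points LLM at the LiteLLM Proxy's
# OpenAI-compatible endpoint (placeholder model names and URL)
- model_id: litellm-gpt-4o
  model_name: gpt-4o
  api_base: "http://localhost:4000/v1"
  # We have not set api_key_name here — is that the option that tells
  # LLM which stored key to send instead of DUMMY_KEY?
```

Prompting that model (e.g. `llm -m litellm-gpt-4o "hello"`) is what produces the DUMMY_KEY request on the proxy side.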
LLM ignores these keys and sends DUMMY_KEY instead. Can someone tell us which key it expects to be set for this configuration?
Thank you.