Throw UserError instead of Traceback error, when api-key is empty/incorrect #209
base: 18.0
Conversation
…orrect, instead of throwing raw traceback
@moctarjallo Thanks for your contribution. I will check it now.
adar2378
left a comment
The PR cannot be merged as-is because it breaks llm_ollama. The API key validation needs to be moved to provider-specific code.
```python
self.ensure_one()

# Check if API key is configured
if not self.api_key:
```
Not all providers require an API key; Ollama, for example, may not need one when run locally.
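To illustrate the reviewer's point, here is a minimal sketch of how the API-key check could be gated on a per-provider flag so that local providers such as Ollama are not blocked. All class and method names here are hypothetical, not part of the actual module:

```python
# Hypothetical sketch: gate the API-key check per provider so local
# providers (e.g. Ollama) are not forced to configure a key.
class Provider:
    # Providers that can run locally without credentials (illustrative)
    NO_KEY_REQUIRED = {"ollama"}

    def __init__(self, service, api_key=None):
        self.service = service
        self.api_key = api_key

    def requires_api_key(self):
        # Only remote providers need a configured key
        return self.service not in self.NO_KEY_REQUIRED

    def check_api_key(self):
        # Raise only when a key is both required and missing
        if self.requires_api_key() and not self.api_key:
            raise ValueError(
                "No API key configured for provider %r" % self.service
            )

Provider("ollama").check_api_key()   # OK: no key needed locally
```

With a gate like this, a blanket `if not self.api_key` check in shared code would no longer break `llm_ollama`.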
```python
    "Error details: %s"
) % error_msg.split('\n')[0])  # Show only the first line for brevity
# Re-raise other exceptions with a user-friendly message
raise UserError(_(
```
We probably should not mask all exceptions under one UserError. It would be better to let them propagate and handle provider-specific errors at the provider level rather than here.
@adar2378 If llm_ollama is the only module that breaks, then maybe we could detect whether the provider is Ollama or not?
The reason I wanted to throw a UserError is that not everyone has the reflex to read an entire traceback, so they may not know what the issue is (some may think the module simply does not work, because a raw traceback is too generic an error).
@moctarjallo I think you are right about the user experience. Still, it should be handled in the provider-specific modules, such as llm_openai or llm_ollama, and not here, because a blanket UserError can sweep bugs under the carpet.
For example, we could do something like this in llm_openai:
```python
# In llm_openai provider
from openai import AuthenticationError, APIConnectionError, RateLimitError

def openai_models(self, model_id=None):
    try:
        ...  # fetch models
    except AuthenticationError as e:
        raise UserError(_("Invalid API key. Please check your OpenAI API key configuration.\n\nDetails: %s") % e)
    except APIConnectionError as e:
        raise UserError(_("Could not connect to OpenAI API. Please check your network connection.\n\nDetails: %s") % e)
    except RateLimitError as e:
        raise UserError(_("OpenAI rate limit exceeded. Please try again later.\n\nDetails: %s") % e)
```
fix(llm): return a comprehensible UserError when api-key is empty/incorrect, instead of throwing raw traceback