Conversation

@moctarjallo opened this pull request:

fix(llm): return a comprehensible UserError when the API key is empty/incorrect, instead of throwing a raw traceback

@adar2378 (Contributor) commented Dec 8, 2025

@moctarjallo Thanks for your contribution. I will check it now.

@adar2378 (Contributor) left a comment

The PR cannot be merged as-is because it breaks llm_ollama. The API key validation needs to be moved to provider-specific code.

self.ensure_one()

# Check if API key is configured
if not self.api_key:

@adar2378 (Contributor):

Not all providers require an API key; Ollama, for example, may not need one if you run it locally.
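
A minimal sketch of how a provider-aware check could look (purely illustrative: the `service` field, the `KEYLESS_PROVIDERS` set and the `_check_api_key` helper are assumed names, not the actual llm module schema):

from odoo import _
from odoo.exceptions import UserError

# Hypothetical: providers that can run without an API key (local back-ends).
KEYLESS_PROVIDERS = {"ollama"}

def _check_api_key(self):
    """Raise a readable error only when a key is required but missing."""
    self.ensure_one()
    if self.service not in KEYLESS_PROVIDERS and not self.api_key:
        raise UserError(_(
            "No API key is configured for the %s provider. "
            "Please set one on the provider form."
        ) % self.service)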

"Error details: %s"
) % error_msg.split('\n')[0]) # Show only the first line for brevity
# Re-raise other exceptions with a user-friendly message
raise UserError(_(

@adar2378 (Contributor):

We probably should not mask all exceptions under one UserError. It would be better to let them propagate and handle provider-specific errors at the provider level rather than here.
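
To make "let them propagate" concrete, here is a hedged sketch of the shared layer (the `fetch_models` name and the dispatch via a `service` field are assumptions; the real llm code may be structured differently):

def fetch_models(self, model_id=None):
    # Generic layer: dispatch to the provider-specific implementation
    # (e.g. openai_models) and let any exception it raises propagate,
    # so each provider module decides what becomes a UserError.
    self.ensure_one()
    handler = getattr(self, "%s_models" % self.service)
    return handler(model_id=model_id)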

@moctarjallo (Author):

@adar2378 if llm_ollama is the only module that breaks, then maybe we could detect whether the provider is Ollama or not?

The reason I wanted to raise a UserError is that not everyone has the reflex to read a traceback in full, so they may not know what the issue actually is (some may think the module simply does not work, because a raw traceback is too generic an error).

@adar2378 (Contributor) commented Dec 9, 2025:

@moctarjallo I think you are right about the user experience. Still, it should be handled in the provider-specific modules, like llm_openai or llm_ollama, and not here, because a blanket handler can hide bugs under the carpet of a UserError.

For example, we could do something like this in llm_openai:

# In the llm_openai provider
from openai import AuthenticationError, APIConnectionError, RateLimitError

from odoo import _
from odoo.exceptions import UserError

def openai_models(self, model_id=None):
    try:
        ...  # fetch models from the OpenAI API
    except AuthenticationError as e:
        raise UserError(_(
            "Invalid API key. Please check your OpenAI API key configuration."
            "\n\nDetails: %s"
        ) % e)
    except APIConnectionError as e:
        raise UserError(_(
            "Could not connect to the OpenAI API. "
            "Please check your network connection.\n\nDetails: %s"
        ) % e)
    except RateLimitError as e:
        raise UserError(_(
            "OpenAI rate limit exceeded. Please try again later.\n\nDetails: %s"
        ) % e)
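
A similar, purely illustrative counterpart for llm_ollama could skip the key check entirely and only translate connection problems (the use of `requests` and the `api_base` field are assumptions about how llm_ollama reaches the local server, not the actual implementation):

import requests

from odoo import _
from odoo.exceptions import UserError

def ollama_models(self, model_id=None):
    # No API key check: a locally running Ollama server may not need one.
    try:
        ...  # fetch the model list from the local Ollama server
    except requests.exceptions.ConnectionError as e:
        raise UserError(_(
            "Could not reach the Ollama server at %s. Is Ollama running?"
            "\n\nDetails: %s"
        ) % (self.api_base, e))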
