Open
Labels: 🐞 Bug (Something isn't working) · 🩺 Needs Triage (Needs attention of maintainers)
Description
crawl4ai version
0.7.6
Expected Behavior
When using the LLMExtractionStrategy (model: Gemini), the extraction should either return the content or surface the LLM's finish reason (e.g. "MAX_TOKENS") to the caller when the output is truncated.
Current Behavior
When using the LLMExtractionStrategy, if max_tokens is set too small, the context window cannot hold the LLM's output, so no content is returned. The LLM's response does indicate why (e.g. "finish_reason": "MAX_TOKENS"), but this reason is never propagated to the outermost method call, so the caller cannot determine the root cause of the problem.
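A minimal sketch of the suspected failure mode. The response shape and the `extract_blocks` helper below are assumptions for illustration (modeled loosely on the litellm/Gemini response format), not taken from crawl4ai's source:

```python
# Hypothetical sketch: when the model hits its token limit, the returned
# message content is None, and downstream string handling raises an
# AttributeError instead of reporting the finish_reason.

truncated_response = {
    "choices": [
        {
            "finish_reason": "MAX_TOKENS",  # the real cause of the failure
            "message": {"content": None},   # no text came back
        }
    ]
}

def extract_blocks(response: dict) -> list[dict]:
    content = response["choices"][0]["message"]["content"]
    try:
        # Mirrors the reported error: calling a str method on None.
        if content.startswith("["):
            return [{"index": 0, "error": False, "content": content}]
        return []
    except AttributeError as exc:
        # The finish_reason is lost at this point; only the
        # AttributeError text survives to the caller.
        return [{"index": 0, "error": True, "tags": ["error"], "content": str(exc)}]

print(extract_blocks(truncated_response))
```

Running this reproduces the same opaque error block the caller sees, with no mention of MAX_TOKENS.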
Is this reproducible?
Yes
Inputs Causing the Bug
Steps to Reproduce
Code snippets
result = await crawler.arun(
    url="https://example.com", config=crawler_config
)
print(result.extracted_content)
# output:
[
  {
    "index": 0,
    "error": true,
    "tags": [
      "error"
    ],
    "content": "'NoneType' object has no attribute 'startswith'"
  }
]

OS
Windows
Python version
3.12.9
Browser
No response
Browser version
No response
Error logs & Screenshots (if applicable)
[
{
"index": 0,
"error": true,
"tags": [
"error"
],
"content": "'NoneType' object has no attribute 'startswith'"
}
]
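Until the finish reason is propagated, a caller can at least detect this error shape defensively. The `check_extraction` helper below is hypothetical and not part of crawl4ai's API:

```python
import json

def check_extraction(extracted_content: str) -> list[dict]:
    """Parse extracted_content and raise a descriptive error if the
    extraction strategy returned error blocks instead of data."""
    blocks = json.loads(extracted_content)
    errors = [b for b in blocks if b.get("error")]
    if errors:
        # Without the propagated finish_reason, this message is the best
        # diagnostic available; a MAX_TOKENS hint would belong here.
        raise RuntimeError(
            f"LLM extraction failed: {errors[0].get('content')!r} "
            "(if max_tokens is small, the output may have been truncated)"
        )
    return blocks

# Feed it the exact error payload from the logs above:
raw = """[{"index": 0, "error": true, "tags": ["error"],
           "content": "'NoneType' object has no attribute 'startswith'"}]"""
try:
    check_extraction(raw)
except RuntimeError as exc:
    print(exc)
```

This turns the silent error block into an explicit exception at the call site, which is roughly the behavior the issue asks the library to provide itself.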