2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -1,6 +1,6 @@
 repos:
   - repo: https://github.com/psf/black
-    rev: 24.8.0
+    rev: 25.1.0
     hooks:
       - id: black
         name: Format Python code with black
@@ -1,7 +1,7 @@
 """
 This module handles question generation within the Co-STORM framework, specifically designed to support the Moderator role.
 
-The Moderator generates insightful, thought-provoking questions that introduce new directions into the conversation. 
+The Moderator generates insightful, thought-provoking questions that introduce new directions into the conversation.
 By leveraging uncited or unused snippets of information retrieved during the discussion, the Moderator ensures the conversation remains dynamic and avoids repetitive or overly niche topics.
 
 For more detailed information, refer to Section 3.5 of the Co-STORM paper: https://www.arxiv.org/pdf/2408.15232.
@@ -1,10 +1,10 @@
 """
 Warm starts the Co-STORM system by conducting a background information search to establish a shared conceptual space with the user.
 
-This stage functions as a mini-STORM, where multiple LLM agents are spawned with different perspectives to engage in multi-round conversations. 
+This stage functions as a mini-STORM, where multiple LLM agents are spawned with different perspectives to engage in multi-round conversations.
 The knowledge base (represented as a mind map) is initialized using the information gathered during these exchanges.
 
-Additionally, the system generates a first draft of the report, which is then used to create a concise and engaging conversation. 
+Additionally, the system generates a first draft of the report, which is then used to create a concise and engaging conversation.
 The synthesized conversation is presented to the user to help them quickly catch up on the system's current knowledge about the topic.
 """
 
4 changes: 2 additions & 2 deletions knowledge_storm/storm_wiki/modules/storm_dataclass.py
@@ -64,7 +64,7 @@ def __init__(self, conversations=List[Tuple[str, List[DialogueTurn]]]):
 
     @staticmethod
     def construct_url_to_info(
-        conversations: List[Tuple[str, List[DialogueTurn]]]
+        conversations: List[Tuple[str, List[DialogueTurn]]],
     ) -> Dict[str, Information]:
        url_to_info = {}

@@ -81,7 +81,7 @@ def construct_url_to_info(
 
     @staticmethod
     def construct_log_dict(
-        conversations: List[Tuple[str, List[DialogueTurn]]]
+        conversations: List[Tuple[str, List[DialogueTurn]]],
     ) -> List[Dict[str, Union[str, Any]]]:
         conversation_log = []
         for persona, conv in conversations:
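The trailing commas added in the two hunks above match black's "magic trailing comma" convention: when an argument list is exploded across multiple lines, black appends a comma after the last element. A minimal sketch, using a simplified stand-in for `construct_url_to_info` (not the real implementation), shows the comma is a no-op for Python:

```python
from typing import Any, Dict, List, Tuple


def construct_url_to_info(
    conversations: List[Tuple[str, List[Any]]],  # trailing comma as black formats it
) -> Dict[str, Any]:
    # Python ignores a trailing comma in a parameter list, so the
    # reformatted signature is behavior-identical to the old one.
    url_to_info: Dict[str, Any] = {}
    for persona, conv in conversations:
        url_to_info[persona] = conv
    return url_to_info


print(construct_url_to_info([("Moderator", [])]))  # {'Moderator': []}
```

Because the change is whitespace/punctuation only, black's reformat cannot alter runtime behavior here.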
6 changes: 3 additions & 3 deletions knowledge_storm/utils.py
@@ -166,7 +166,7 @@ def create_or_update_vector_store(
     embedding_model: str = "BAAI/bge-m3",
     device: str = "mps",
 ):
-    from qdrant_client import Document
+    from langchain_core.documents.base import Document
 
     """
     Takes a CSV file and adds each row in the CSV file to the Qdrant collection.
@@ -278,7 +278,7 @@ def create_or_update_vector_store(
             "\uff0c", # Fullwidth comma
             "\u3001", # Ideographic comma
             " ",
-            "\u200B", # Zero-width space
+            "\u200b", # Zero-width space
             "",
         ],
     )
@@ -666,7 +666,7 @@ def __init__(
             "\uff0c", # Fullwidth comma
             "\u3001", # Ideographic comma
             " ",
-            "\u200B", # Zero-width space
+            "\u200b", # Zero-width space
             "",
         ],
     )
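The `"\u200B"` → `"\u200b"` edits in the separator lists above only lowercase the hex digits of a Unicode escape. Escape hex digits are case-insensitive in Python source, and black normalizes them to lowercase, so both spellings denote the same zero-width space character and the text-splitter configuration is unchanged. A quick check:

```python
# Both escapes produce the identical one-character string U+200B
# (ZERO WIDTH SPACE); the diff is purely a style normalization.
zwsp_upper = "\u200B"
zwsp_lower = "\u200b"
assert zwsp_upper == zwsp_lower
assert ord(zwsp_lower) == 0x200B  # same code point either way
print(zwsp_upper == zwsp_lower)  # True
```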