ComfyUI native API integration with Spawn #130
Draft
BuffMcBigHuge wants to merge 27 commits into livepeer:main from BuffMcBigHuge:comfy-native-local
Conversation
…ced unnecessary base64 input frame operations, prep for multi-instance, cleanup.
…dded config for server management.
…ame size handling, commented out some logging.
Co-authored-by: John | Elite Encoder <[email protected]>
…the ui, cleanup of tensor code.
…ediate step, moved prompt execution strategy to `execution_start` event, moved buffer to self variable to avoid reinitialization.
…to improve frame buffer, modified comfy arg handling.
Author: Add two additional options to spawn mode. Cleaned up PR documentation and readability.
…f app, pipeline and config files.
Author: Updates:
…cations to spawning instances, better handling of misconfigured workspace.
Author: Updates:
Author: We have tested spawn on Runpod with varied results on frame management with > 2 workers. More to come.
Introduction:
ComfyStream now supports a new client-mode called "spawn" which automatically launches and manages ComfyUI instances directly from the server. This approach eliminates the dependency on the Hidden Switch fork while maintaining parallel processing capability for video frames.
Key Features:
1. Client-Mode Spawn
The server can now dynamically spawn and manage ComfyUI instances, eliminating the need for manual instance setup. This is controlled via the `--client-mode spawn` command-line argument, with an additional parameter to control the number of worker instances (`--workers`). This is set manually for now (e.g. `--workers 2`), but in the future it can be workflow-dependent.
2. Improved Frame Management
3. Native ComfyUI API Integration
Usage:
Key arguments:
- `--max-frame-wait`: Maximum milliseconds to wait for a frame before dropping it
- `--client-mode`: Choose between "toml" (use a config file) or "spawn" (spawn processes)
- `--log-level`: Default log level: "DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"
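A minimal launch might look like the sketch below; the server entry point (`server/app.py`) and the flag values are assumptions for illustration, not part of this PR.

```bash
# Illustrative only: adjust the entry point and values to your setup.
python server/app.py \
  --client-mode spawn \
  --max-frame-wait 100 \
  --log-level INFO
```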
Spawn Mode
This will auto-start subprocess ComfyUI native instances on a UI workflow request, similar to EmbeddedClient. This assumes your comfystream folder exists in `ComfyUI/custom_nodes/comfystream`.
Spawn mode key arguments:
- `--workspace`: Path to the ComfyUI installation directory
- `--workers`: Number of ComfyUI instances to spawn per CUDA device
- `--workers-start-port`: The starting port that the ComfyUI workers will use
- `--cuda-devices`: The available CUDA devices to spawn workers on
- `--comfyui-log-level`: Spawn instances can now return their logs to the console; only one setting: `DEBUG`
Spawn configuration:
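For example, a spawn-mode launch could look like this sketch; the entry point, workspace path, port, and device list are placeholder assumptions.

```bash
# Spawn mode: the server launches and manages the ComfyUI instances itself.
# Paths, ports, and device IDs are examples only.
python server/app.py \
  --client-mode spawn \
  --workspace /path/to/ComfyUI \
  --workers 2 \
  --workers-start-port 8195 \
  --cuda-devices 0,1 \
  --comfyui-log-level DEBUG
```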
With the spawn command above, this will create 4 workers in total:
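As an illustration (the per-worker port assignment below is an assumption derived from `--workers-start-port`, not output from this PR):

```
cuda:0 -> worker 1 (port 8195), worker 2 (port 8196)
cuda:1 -> worker 3 (port 8197), worker 4 (port 8198)
```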
Server (toml) Mode
This will connect to servers that are already running and defined in `comfy.toml`.
Server (toml) mode key arguments:
- `--config-file`: TOML config file that defines the available ComfyUI instances
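A toml-mode launch might look like the following; the entry point is an assumed example.

```bash
# TOML mode: connect to externally managed ComfyUI instances listed in comfy.toml.
python server/app.py \
  --client-mode toml \
  --config-file comfy.toml
```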
Server (toml) configuration:
When using TOML mode, define your server instances in the config file:
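The snippet below is only a sketch of the general shape of such a file; the key names and table layout are assumptions, not the schema defined by this PR.

```toml
# Hypothetical comfy.toml: key names are illustrative, not authoritative.
[[servers]]
host = "127.0.0.1"
port = 8188

[[servers]]
host = "127.0.0.1"
port = 8189
```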
Benefits:
Limitations and Future Work:
Known Issues
- Refreshing the UI during an active stream will not properly continue inference when a new workflow is loaded unless the server is restarted (this is now solved).

This implementation opens up new possibilities for advanced parallelization strategies while simplifying the overall architecture.