
Feature/fix: Add Play/Pause control to inference generation queue#1623

Open
xnexuzx wants to merge 2 commits into LykosAI:main from xnexuzx:feature/inference-queue-refactor

Conversation


@xnexuzx xnexuzx commented May 1, 2026

This commit introduces an explicit Play/Pause toggle to control the execution flow of the generation queue, allowing users to safely stack multiple jobs without inadvertently overwriting their active UI state.

Key changes:

  • Added IsQueuePaused state and ToggleQueueStateCommand to InferenceGenerationViewModelBase.
  • Modified ProcessQueueAsync to respect the paused state and automatically pause when the queue naturally empties.
  • Inserted a dynamic Play/Pause button in InferenceTextToImageView that reacts to the queue's execution state.
  • Updated ClearQueueCommand and QueueGenerationCommand to enforce the default paused state behavior.
[Screenshot: Play/Pause control in the generation queue UI]

xnexuzx added 2 commits April 30, 2026 20:49
This commit introduces a queuing system for inference generations, allowing users to queue multiple tasks sequentially without blocking the UI.

Key changes:
- Added `QueueGenerationCommand` to save the current ViewModel state into a queue.
- Implemented `ProcessQueueAsync` background loop to execute queued generations sequentially while respecting active processes and user cancellations.
- Updated `InferenceTextToImageView` to include a dynamic 'Queue Generation' button that expands with the count.
- Added a 'Clear Queue' command with a trash icon to allow users to empty pending tasks.
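
The background loop described above might be sketched roughly as follows. This is a minimal illustration only, not the PR's actual code: `GenerationQueueSketch`, the `string` snapshot type, and `RunGenerationAsync` are hypothetical stand-ins for `InferenceGenerationViewModelBase`, `InferenceProjectDocument`, and the real generation command.

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical sketch of a sequential generation queue. The real view model
// holds InferenceProjectDocument snapshots; a string stands in here.
public class GenerationQueueSketch
{
    private readonly Queue<string> _generationQueue = new();
    private bool _isProcessing;

    public int QueueCount => _generationQueue.Count;

    // Save a snapshot of the current state into the queue and start the
    // background loop if it is not already running (UI-thread only in this sketch).
    public void QueueGeneration(string documentSnapshot)
    {
        _generationQueue.Enqueue(documentSnapshot);
        if (!_isProcessing)
        {
            _ = ProcessQueueAsync();
        }
    }

    // Empty all pending tasks without touching the one currently running.
    public void ClearQueue() => _generationQueue.Clear();

    // Background loop: dequeue and run one job at a time so queued work
    // executes sequentially without blocking the UI.
    private async Task ProcessQueueAsync()
    {
        _isProcessing = true;
        try
        {
            while (_generationQueue.TryDequeue(out var document))
            {
                await RunGenerationAsync(document);
            }
        }
        finally
        {
            _isProcessing = false;
        }
    }

    // Stand-in for executing the real generation command.
    protected virtual Task RunGenerationAsync(string document) => Task.Delay(10);
}
```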
This commit introduces an explicit Play/Pause toggle to control the execution flow of the generation queue, allowing users to safely stack multiple jobs without inadvertently overwriting their active UI state.

Key changes:
- Added `IsQueuePaused` state and `ToggleQueueStateCommand` to `InferenceGenerationViewModelBase`.
- Modified `ProcessQueueAsync` to respect the paused state and automatically pause when the queue naturally empties.
- Inserted a dynamic Play/Pause button in `InferenceTextToImageView` that reacts to the queue's execution state.
- Updated `ClearQueueCommand` and `QueueGenerationCommand` to enforce the default paused state behavior.
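
The Play/Pause behavior the commit describes can be sketched like this. Only `IsQueuePaused` and the toggle/queue semantics come from the PR text; the class name, field types, and the `Task.Delay` placeholder are assumptions for illustration.

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;

// Hypothetical sketch: the loop checks IsQueuePaused before dequeuing and
// flips back to paused once the queue naturally empties, so newly queued
// jobs wait by default instead of overwriting the active UI state.
public class PausableQueueSketch
{
    private readonly Queue<string> _generationQueue = new();

    // Default paused state: stacking jobs does not start execution.
    public bool IsQueuePaused { get; private set; } = true;

    public int QueueCount => _generationQueue.Count;

    // Play/Pause toggle bound to the dynamic button in the view.
    public void ToggleQueueState()
    {
        IsQueuePaused = !IsQueuePaused;
        if (!IsQueuePaused)
        {
            _ = ProcessQueueAsync();
        }
    }

    public void QueueGeneration(string documentSnapshot) =>
        _generationQueue.Enqueue(documentSnapshot);

    private async Task ProcessQueueAsync()
    {
        while (!IsQueuePaused && _generationQueue.TryDequeue(out _))
        {
            await Task.Delay(10); // placeholder for the real generation run
        }
        // Automatically pause when the queue naturally empties.
        if (_generationQueue.Count == 0)
        {
            IsQueuePaused = true;
        }
    }
}
```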

github-actions Bot commented May 1, 2026

CLA Assistant bot: All Contributors have signed the CLA.


xnexuzx (Author) commented May 1, 2026

I have read the CLA Document and I hereby sign the CLA


gemini-code-assist Bot left a comment

Code Review

This pull request introduces a generation queue system to the InferenceGenerationViewModelBase, allowing users to queue multiple image generation tasks with controls to pause, resume, and clear the queue. The UI has been updated in InferenceTextToImageView to include these new controls. Feedback suggests improving the resilience of the queue processing loop by handling exceptions when awaiting active tasks and addressing potential thread-safety concerns regarding the shared queue list.

```csharp
var executionTask = GenerateImageCommand.ExecutionTask;
if (executionTask is not null)
{
    await executionTask;
```
Severity: high

Awaiting executionTask directly here can cause the entire queue processing loop to terminate if the task faults (e.g., due to an unhandled exception in a manual generation run). Since ProcessQueueAsync is intended to be a resilient background processor, it should handle potential failures of the tasks it waits on to ensure the rest of the queue can still be processed.

```csharp
try
{
    await executionTask;
}
catch (Exception ex)
{
    Logger.Warn(ex, "Previous generation task failed, continuing queue");
}
```

```csharp
[JsonIgnore]
public IInferenceClientManager ClientManager { get; }

private readonly List<InferenceProjectDocument> _generationQueue = [];
```
Severity: medium

The _generationQueue list is accessed and modified from both the UI thread (via ClearQueue and QueueGeneration commands) and potentially a background thread (via the ProcessQueueAsync loop). While Avalonia's DispatcherSynchronizationContext often keeps continuations on the UI thread, relying on it for thread safety of a standard List<T> across multiple await points is risky. Consider using a thread-safe collection like ConcurrentQueue<InferenceProjectDocument> or adding a lock around all accesses to _generationQueue.
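
A minimal sketch of the reviewer's `ConcurrentQueue` suggestion, assuming the commands and loop keep their described roles. The class and `Document` record are hypothetical; only `_generationQueue` and `InferenceProjectDocument`'s role come from the PR.

```csharp
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Hypothetical thread-safe variant of the queue field: ConcurrentQueue makes
// Enqueue/TryDequeue/Clear safe across threads without an explicit lock, even
// if await continuations resume off the UI thread.
public class ThreadSafeQueueSketch
{
    public record Document(string Name); // stand-in for InferenceProjectDocument

    private readonly ConcurrentQueue<Document> _generationQueue = new();

    public int QueueCount => _generationQueue.Count;

    // UI thread: QueueGenerationCommand
    public void QueueGeneration(Document doc) => _generationQueue.Enqueue(doc);

    // UI thread: ClearQueueCommand (ConcurrentQueue<T>.Clear exists since .NET Core 2.0)
    public void ClearQueue() => _generationQueue.Clear();

    // Background loop: TryDequeue is atomic, so no lock is needed around it.
    public async Task<int> DrainAsync()
    {
        var processed = 0;
        while (_generationQueue.TryDequeue(out _))
        {
            processed++;
            await Task.Yield();
        }
        return processed;
    }
}
```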
