Feature/fix: Add Play/Pause control to inference generation queue#1623
xnexuzx wants to merge 2 commits into LykosAI:main from
Conversation
This commit introduces a queuing system for inference generations, allowing users to queue multiple tasks sequentially without blocking the UI.

Key changes:
- Added `QueueGenerationCommand` to save the current ViewModel state into a queue.
- Implemented a `ProcessQueueAsync` background loop to execute queued generations sequentially while respecting active processes and user cancellations.
- Updated `InferenceTextToImageView` to include a dynamic 'Queue Generation' button that expands with the count.
- Added a 'Clear Queue' command with a trash icon to allow users to empty pending tasks.
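For readers skimming the thread, the queue mechanics described above look roughly like the following sketch. This is illustrative only: helper names such as `SaveStateToDocument` and `LoadStateFromDocument` are assumptions, not taken from the diff.

```csharp
// Illustrative sketch only; the PR's actual member bodies are not shown in this thread.
private readonly List<InferenceProjectDocument> _generationQueue = [];
private bool _isProcessingQueue;

[RelayCommand]
private void QueueGeneration()
{
    // Snapshot the current ViewModel state and enqueue it.
    _generationQueue.Add(SaveStateToDocument()); // hypothetical helper

    if (!_isProcessingQueue)
        _ = ProcessQueueAsync();
}

private async Task ProcessQueueAsync()
{
    _isProcessingQueue = true;
    try
    {
        while (_generationQueue.Count > 0)
        {
            var document = _generationQueue[0];
            _generationQueue.RemoveAt(0);

            LoadStateFromDocument(document); // hypothetical helper
            await GenerateImageCommand.ExecuteAsync(null);
        }
    }
    finally
    {
        _isProcessingQueue = false;
    }
}
```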
This commit introduces an explicit Play/Pause toggle to control the execution flow of the generation queue, allowing users to safely stack multiple jobs without inadvertently overwriting their active UI state.

Key changes:
- Added `IsQueuePaused` state and `ToggleQueueStateCommand` to `InferenceGenerationViewModelBase`.
- Modified `ProcessQueueAsync` to respect the paused state and automatically pause when the queue naturally empties.
- Inserted a dynamic Play/Pause button in `InferenceTextToImageView` that reacts to the queue's execution state.
- Updated `ClearQueueCommand` and `QueueGenerationCommand` to enforce the default paused state behavior.
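As a rough sketch, the Play/Pause state described above could look like the following (assuming CommunityToolkit.Mvvm-style `[ObservableProperty]`/`[RelayCommand]` attributes; the exact bodies are not shown in this thread):

```csharp
// Illustrative sketch; the queue starts paused so stacked jobs never run unexpectedly.
[ObservableProperty]
private bool isQueuePaused = true;

[RelayCommand]
private void ToggleQueueState()
{
    IsQueuePaused = !IsQueuePaused;

    if (!IsQueuePaused)
        _ = ProcessQueueAsync(); // resume draining the queue
}

// Inside ProcessQueueAsync, the loop would check the flag each iteration:
//     if (IsQueuePaused) break;
// and set IsQueuePaused = true once the queue naturally empties.
```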
CLA Assistant bot: All Contributors have signed the CLA.
I have read the CLA Document and I hereby sign the CLA |
Code Review
This pull request introduces a generation queue system to the InferenceGenerationViewModelBase, allowing users to queue multiple image generation tasks with controls to pause, resume, and clear the queue. The UI has been updated in InferenceTextToImageView to include these new controls. Feedback suggests improving the resilience of the queue processing loop by handling exceptions when awaiting active tasks and addressing potential thread-safety concerns regarding the shared queue list.
```csharp
var executionTask = GenerateImageCommand.ExecutionTask;
if (executionTask is not null)
{
    await executionTask;
}
```
Awaiting executionTask directly here can cause the entire queue processing loop to terminate if the task faults (e.g., due to an unhandled exception in a manual generation run). Since ProcessQueueAsync is intended to be a resilient background processor, it should handle potential failures of the tasks it waits on to ensure the rest of the queue can still be processed.
```csharp
try
{
    await executionTask;
}
catch (Exception ex)
{
    Logger.Warn(ex, "Previous generation task failed, continuing queue");
}
```

```csharp
[JsonIgnore]
public IInferenceClientManager ClientManager { get; }

private readonly List<InferenceProjectDocument> _generationQueue = [];
```
The _generationQueue list is accessed and modified from both the UI thread (via ClearQueue and QueueGeneration commands) and potentially a background thread (via the ProcessQueueAsync loop). While Avalonia's DispatcherSynchronizationContext often keeps continuations on the UI thread, relying on it for thread safety of a standard List<T> across multiple await points is risky. Consider using a thread-safe collection like ConcurrentQueue<InferenceProjectDocument> or adding a lock around all accesses to _generationQueue.
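A minimal sketch of the suggested `ConcurrentQueue<T>` swap (illustrative only; a `lock` around all `_generationQueue` accesses would work equally well):

```csharp
// Requires System.Collections.Concurrent.
private readonly ConcurrentQueue<InferenceProjectDocument> _generationQueue = new();

// Producer (UI thread, e.g. the QueueGeneration command):
//     _generationQueue.Enqueue(document);

// Consumer (the ProcessQueueAsync loop):
//     while (!IsQueuePaused && _generationQueue.TryDequeue(out var document))
//     {
//         // restore state from document and run the generation...
//     }

// ClearQueue command:
//     _generationQueue.Clear(); // available since .NET Core 2.0
```

`TryDequeue` atomically removes and returns the next item, so the producer and consumer can never race on the same element, unlike index-based access on a `List<T>`.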