
feat: enforce max queue deliveries in handlers with graceful failure#1344

Merged
pranaygp merged 3 commits into main from pgp/handler-max-deliveries
Mar 23, 2026

Conversation

@pranaygp
Collaborator

@pranaygp pranaygp commented Mar 12, 2026

Summary

Replaces the VQS maxDeliveries: 64 cap with handler-level enforcement. Handlers now gracefully fail runs/steps after excessive queue redeliveries, preventing "phantom stuck" runs.

Stacked on #1342, #1340

Problem

When infrastructure is down (OOMs, network outages), VQS retries messages up to maxDeliveries: 64 times at 5s intervals. After exhausting retries, VQS drops the message — the run stays in running status forever with no error, no failure event.

Solution

  1. Remove maxDeliveries from VQS config — allow infinite retries at queue level
  2. Keep retryAfterSeconds: 5 — VQS owns retry timing (works even after SIGKILL/OOM)
  3. Handlers check metadata.attempt — when > MAX_QUEUE_DELIVERIES (64), fail gracefully with MAX_DELIVERIES_EXCEEDED error code
  4. If even failure event creation fails — log detailed error and consume the message (no point retrying further)
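
Steps 3–4 can be sketched as a small pure check. A minimal sketch: MAX_QUEUE_DELIVERIES and the MAX_DELIVERIES_EXCEEDED error code come from this PR, while the FailureEvent shape and the checkMaxDeliveries helper are hypothetical illustrations, not the actual handler code:

```typescript
// Sketch of the handler-side delivery cap (steps 3–4 above).
// MAX_QUEUE_DELIVERIES and MAX_DELIVERIES_EXCEEDED match the PR; the
// FailureEvent shape and checkMaxDeliveries helper are illustrative only.
const MAX_QUEUE_DELIVERIES = 64;

interface QueueMetadata {
  attempt: number; // delivery count supplied by the queue
}

interface FailureEvent {
  eventType: 'run_failed';
  errorCode: 'MAX_DELIVERIES_EXCEEDED';
  message: string;
}

// Returns null while under the cap (handler proceeds normally), or the
// failure event the handler should record before consuming the message.
function checkMaxDeliveries(metadata: QueueMetadata): FailureEvent | null {
  if (metadata.attempt <= MAX_QUEUE_DELIVERIES) return null;
  return {
    eventType: 'run_failed',
    errorCode: 'MAX_DELIVERIES_EXCEEDED',
    message: `Workflow exceeded maximum queue deliveries (${metadata.attempt}/${MAX_QUEUE_DELIVERIES})`,
  };
}
```

If writing the failure event itself throws, the handler logs and consumes the message anyway (step 4), since further redeliveries cannot succeed.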

Queue error log examples (before → after)

Before (dumped full body, no run context):

[local world] Failed to queue message {
  queueName: '__wkf_step_...',
  text: '"WorkflowAPIError: Injected 5xx"',
  status: 500,
  headers: { ... },
  body: '{"workflowName":"...","workflowRunId":"wrun_01KKF...",
    "stepId":"step_01KKF...","traceCarrier":{...}}'
}

After (structured, includes run/step IDs, separates HTTP status from handler error):

[world-local] Queue message failed (attempt 3, HTTP 500) {
  queueName: '__wkf_step_...',
  messageId: 'msg_01KKF...',
  runId: 'wrun_01KKF...',
  stepId: 'step_01KKF...',
  handlerError: '"WorkflowAPIError: Injected 5xx"'
}

Local world queue

  • Removed hardcoded 3-retry cap → 1000 safety limit (handler enforces the real limit at 64)
  • Matches production VQS behavior
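
The change amounts to replacing a small retry counter with a bounded delivery loop; a rough sketch, where the loop shape and names are illustrative rather than the actual world-local code:

```typescript
// Illustrative local-queue delivery loop: the queue no longer enforces the
// real limit, only a generous safety cap so a buggy handler cannot spin
// forever; the handler itself fails the run once attempts exceed 64.
const MAX_LOCAL_SAFETY_LIMIT = 1000;

// handler returns true when the message was consumed (success or graceful
// failure) and false when the delivery should be retried.
async function deliverWithSafetyLimit(
  handler: (attempt: number) => Promise<boolean>,
): Promise<number> {
  for (let attempt = 1; attempt <= MAX_LOCAL_SAFETY_LIMIT; attempt++) {
    if (await handler(attempt)) return attempt;
  }
  throw new Error('local queue safety limit reached');
}
```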

Test plan

  • 3 new unit tests for step handler max delivery enforcement
  • All core tests pass
  • All world-local tests pass
  • E2E: persistent failure → failed with MAX_DELIVERIES_EXCEEDED
  • E2E: transient failure → normal completion

🤖 Generated with Claude Code

@changeset-bot

changeset-bot bot commented Mar 12, 2026

🦋 Changeset detected

Latest commit: 85d6acd

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 20 packages
Name Type
@workflow/errors Patch
@workflow/core Patch
@workflow/world-local Patch
@workflow/builders Patch
@workflow/sveltekit Patch
@workflow/cli Patch
workflow Patch
@workflow/world-postgres Patch
@workflow/world-vercel Patch
@workflow/next Patch
@workflow/nitro Patch
@workflow/vitest Patch
@workflow/web-shared Patch
@workflow/world-testing Patch
@workflow/astro Patch
@workflow/nest Patch
@workflow/rollup Patch
@workflow/vite Patch
@workflow/ai Patch
@workflow/nuxt Patch


@vercel
Contributor

vercel bot commented Mar 12, 2026

@github-actions
Copy link
Contributor

github-actions bot commented Mar 12, 2026

🧪 E2E Test Results

Some tests failed

Summary

Passed Failed Skipped Total
✅ ▲ Vercel Production 780 0 67 847
✅ 💻 Local Development 782 0 142 924
✅ 📦 Local Production 782 0 142 924
✅ 🐘 Local Postgres 782 0 142 924
✅ 🪟 Windows 72 0 5 77
❌ 🌍 Community Worlds 118 56 21 195
✅ 📋 Other 198 0 33 231
Total 3514 56 552 4122

❌ Failed Tests

🌍 Community Worlds (56 failed)

mongodb (3 failed):

  • hookWorkflow is not resumable via public webhook endpoint | wrun_01KMEJAKA6S63EBPHYY3TRYD7J
  • webhookWorkflow | wrun_01KMEJAWAYV6YW3KS2J1H9T3JT
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously | wrun_01KMEJH0AV83ZNZHV89MSXRAF4

redis (2 failed):

  • hookWorkflow is not resumable via public webhook endpoint | wrun_01KMEJAKA6S63EBPHYY3TRYD7J
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously | wrun_01KMEJH0AV83ZNZHV89MSXRAF4

turso (51 failed):

  • addTenWorkflow | wrun_01KMEJ9CMF773KWHWTD31A6WFR
  • wellKnownAgentWorkflow (.well-known/agent) | wrun_01KMEJAF9W95HXXHYDV167797R
  • should work with react rendering in step
  • promiseAllWorkflow | wrun_01KMEJ9KEWNMPMNXP1BTNAGVJ8
  • promiseRaceWorkflow | wrun_01KMEJ9SPXP2DNS9ETWQQS0T9H
  • promiseAnyWorkflow | wrun_01KMEJ9W2X19VBMJ5051C6G11Y
  • importedStepOnlyWorkflow | wrun_01KMEJAYBG6N9CHNZ48ZBYA8GG
  • hookWorkflow | wrun_01KMEJA94NW7E5R3K1NXNFX3SQ
  • hookWorkflow is not resumable via public webhook endpoint | wrun_01KMEJAKA6S63EBPHYY3TRYD7J
  • webhookWorkflow | wrun_01KMEJAWAYV6YW3KS2J1H9T3JT
  • sleepingWorkflow | wrun_01KMEJB2XX9G8QK8EDQA9ZXP8K
  • parallelSleepWorkflow | wrun_01KMEJBF4VSNGW3Y1FPK2B1PYN
  • nullByteWorkflow | wrun_01KMEJBKBPX6M7XK6QY6Z0F12B
  • workflowAndStepMetadataWorkflow | wrun_01KMEJBNAR4X0GJ1J45A9RFGM0
  • fetchWorkflow | wrun_01KMEJDG6N05WP7JFEDY29QVZ1
  • promiseRaceStressTestWorkflow | wrun_01KMEJDK9K5V2KSKB3HT08GNE7
  • error handling error propagation workflow errors nested function calls preserve message and stack trace
  • error handling error propagation workflow errors cross-file imports preserve message and stack trace
  • error handling error propagation step errors basic step error preserves message and stack trace
  • error handling error propagation step errors cross-file step error preserves message and function names in stack
  • error handling retry behavior regular Error retries until success
  • error handling retry behavior FatalError fails immediately without retries
  • error handling retry behavior RetryableError respects custom retryAfter delay
  • error handling retry behavior maxRetries=0 disables retries
  • error handling catchability FatalError can be caught and detected with FatalError.is()
  • hookCleanupTestWorkflow - hook token reuse after workflow completion | wrun_01KMEJGCWVWDTQ4J5Q11KTQ99Q
  • concurrent hook token conflict - two workflows cannot use the same hook token simultaneously | wrun_01KMEJH0AV83ZNZHV89MSXRAF4
  • hookDisposeTestWorkflow - hook token reuse after explicit disposal while workflow still running | wrun_01KMEJHKENZ3QZ2ZTH7HWAM3FD
  • stepFunctionPassingWorkflow - step function references can be passed as arguments (without closure vars) | wrun_01KMEJJ5EJBBVR1WE9Q3D4CGAT
  • stepFunctionWithClosureWorkflow - step function with closure variables passed as argument | wrun_01KMEJJD2VKCXMHXGA3DSTEXBZ
  • closureVariableWorkflow - nested step functions with closure variables | wrun_01KMEJJHTCX1S38X2TX288T4GY
  • spawnWorkflowFromStepWorkflow - spawning a child workflow using start() inside a step | wrun_01KMEJJKQX2GP1S0108E5NPDEZ
  • health check (queue-based) - workflow and step endpoints respond to health check messages
  • pathsAliasWorkflow - TypeScript path aliases resolve correctly | wrun_01KMEJK18WMPY5JCSJC9WMWSXW
  • Calculator.calculate - static workflow method using static step methods from another class | wrun_01KMEJK6474HDQ2TVFBND30SX9
  • AllInOneService.processNumber - static workflow method using sibling static step methods | wrun_01KMEJKBZX5TSYZK7HJRN0F01A
  • ChainableService.processWithThis - static step methods using this to reference the class | wrun_01KMEJKHXBJ72P63NSKEYFCCN6
  • thisSerializationWorkflow - step function invoked with .call() and .apply() | wrun_01KMEJKQQ6N91CM8Z8DZWFKTMZ
  • customSerializationWorkflow - custom class serialization with WORKFLOW_SERIALIZE/WORKFLOW_DESERIALIZE | wrun_01KMEJKXV5GMH3BD57WNCSHSN6
  • instanceMethodStepWorkflow - instance methods with "use step" directive | wrun_01KMEJM50567XG2MQ9X33FDCJ8
  • crossContextSerdeWorkflow - classes defined in step code are deserializable in workflow context | wrun_01KMEJMET97RPFD5MCVKFNQ3NM
  • stepFunctionAsStartArgWorkflow - step function reference passed as start() argument | wrun_01KMEJMPX44H3TFBSH0EYTJ7JR
  • cancelRun - cancelling a running workflow | wrun_01KMEJMXRXDC4XFPHZN2VHW7HH
  • cancelRun via CLI - cancelling a running workflow | wrun_01KMEJN6BH236N09P04FGN477N
  • pages router addTenWorkflow via pages router
  • pages router promiseAllWorkflow via pages router
  • pages router sleepingWorkflow via pages router
  • hookWithSleepWorkflow - hook payloads delivered correctly with concurrent sleep | wrun_01KMEJNHCBP48GM0JFSFKP1221
  • sleepInLoopWorkflow - sleep inside loop with steps actually delays each iteration | wrun_01KMEJP6TC7MC7XA9153KHCEM8
  • sleepWithSequentialStepsWorkflow - sequential steps work with concurrent sleep (control) | wrun_01KMEJPHJH0V2PP4DVG23DN82S

Details by Category

✅ ▲ Vercel Production
App Passed Failed Skipped
✅ astro 70 0 7
✅ example 70 0 7
✅ express 70 0 7
✅ fastify 70 0 7
✅ hono 70 0 7
✅ nextjs-turbopack 75 0 2
✅ nextjs-webpack 75 0 2
✅ nitro 70 0 7
✅ nuxt 70 0 7
✅ sveltekit 70 0 7
✅ vite 70 0 7
✅ 💻 Local Development
App Passed Failed Skipped
✅ astro-stable 66 0 11
✅ express-stable 66 0 11
✅ fastify-stable 66 0 11
✅ hono-stable 66 0 11
✅ nextjs-turbopack-canary 55 0 22
✅ nextjs-turbopack-stable 72 0 5
✅ nextjs-webpack-canary 55 0 22
✅ nextjs-webpack-stable 72 0 5
✅ nitro-stable 66 0 11
✅ nuxt-stable 66 0 11
✅ sveltekit-stable 66 0 11
✅ vite-stable 66 0 11
✅ 📦 Local Production
App Passed Failed Skipped
✅ astro-stable 66 0 11
✅ express-stable 66 0 11
✅ fastify-stable 66 0 11
✅ hono-stable 66 0 11
✅ nextjs-turbopack-canary 55 0 22
✅ nextjs-turbopack-stable 72 0 5
✅ nextjs-webpack-canary 55 0 22
✅ nextjs-webpack-stable 72 0 5
✅ nitro-stable 66 0 11
✅ nuxt-stable 66 0 11
✅ sveltekit-stable 66 0 11
✅ vite-stable 66 0 11
✅ 🐘 Local Postgres
App Passed Failed Skipped
✅ astro-stable 66 0 11
✅ express-stable 66 0 11
✅ fastify-stable 66 0 11
✅ hono-stable 66 0 11
✅ nextjs-turbopack-canary 55 0 22
✅ nextjs-turbopack-stable 72 0 5
✅ nextjs-webpack-canary 55 0 22
✅ nextjs-webpack-stable 72 0 5
✅ nitro-stable 66 0 11
✅ nuxt-stable 66 0 11
✅ sveltekit-stable 66 0 11
✅ vite-stable 66 0 11
✅ 🪟 Windows
App Passed Failed Skipped
✅ nextjs-turbopack 72 0 5
❌ 🌍 Community Worlds
App Passed Failed Skipped
✅ mongodb-dev 3 0 2
❌ mongodb 52 3 5
✅ redis-dev 3 0 2
❌ redis 53 2 5
✅ turso-dev 3 0 2
❌ turso 4 51 5
✅ 📋 Other
App Passed Failed Skipped
✅ e2e-local-dev-nest-stable 66 0 11
✅ e2e-local-postgres-nest-stable 66 0 11
✅ e2e-local-prod-nest-stable 66 0 11

📋 View full workflow run

Collaborator Author

pranaygp commented Mar 12, 2026

Contributor

@vercel vercel bot left a comment


Additional Suggestion:

SvelteKit package has hardcoded maxDeliveries: 64 on queue triggers, causing VQS to silently drop messages before the handler can gracefully fail runs/steps.

Fix on Vercel

@pranaygp force-pushed the pgp/handler-max-deliveries branch from c2ed3e7 to 085a05a on March 17, 2026 at 22:37
Contributor

Copilot AI left a comment


Pull request overview

This PR moves enforcement of the queue delivery cap from the Vercel Queue trigger configuration into the workflow/step runtime handlers, and updates local-queue behavior/logging to support the new approach.

Changes:

  • Add a shared MAX_QUEUE_DELIVERIES constant and enforce it in both workflow and step handlers with graceful failure (run_failed / step_failed + requeue workflow for step).
  • Remove maxDeliveries from queue trigger definitions in @workflow/builders.
  • Improve world-local queue logging with runId/stepId context and add a local retry safety limit.

Reviewed changes

Copilot reviewed 8 out of 8 changed files in this pull request and generated 4 comments.

Show a summary per file
File Description
packages/world-local/src/queue.ts Adds structured identifiers to logs and replaces the old retry counter with a fixed safety-loop cap.
packages/errors/src/error-codes.ts Introduces MAX_DELIVERIES_EXCEEDED run error code.
packages/core/src/runtime/step-handler.ts Enforces max deliveries for steps and adjusts event creation/logging behavior.
packages/core/src/runtime/step-handler.test.ts Adds test coverage for step max-deliveries behavior.
packages/core/src/runtime/constants.ts Defines MAX_QUEUE_DELIVERIES.
packages/core/src/runtime.ts Enforces max deliveries for workflow handler and records run_failed with a specific error code.
packages/builders/src/constants.ts Removes VQS maxDeliveries from trigger constants.
.changeset/handler-max-deliveries.md Changeset describing the behavior shift from VQS config to handler enforcement.


Comment on lines +172 to +176
const startResult = await world.events.create(workflowRunId, {
  eventType: 'step_started',
  specVersion: SPEC_VERSION_CURRENT,
  correlationId: stepId,
});
Comment on lines +120 to +131
try {
  const world = getWorld();
  await world.events.create(runId, {
    eventType: 'run_failed',
    specVersion: SPEC_VERSION_CURRENT,
    eventData: {
      error: {
        message: `Workflow exceeded maximum queue deliveries (${metadata.attempt}/${MAX_QUEUE_DELIVERIES})`,
      },
      errorCode: RUN_ERROR_CODES.MAX_DELIVERIES_EXCEEDED,
    },
  });
Comment on lines +117 to +121
// Safety limit to prevent infinite loops in the local queue.
// The actual max delivery enforcement happens in the workflow/step handlers.
const MAX_LOCAL_SAFETY_LIMIT = 1000;
try {
  let defaultRetriesLeft = 3;
  for (let attempt = 0; defaultRetriesLeft > 0; attempt++) {
    defaultRetriesLeft--;

  for (let attempt = 0; attempt < MAX_LOCAL_SAFETY_LIMIT; attempt++) {
Collaborator Author


yeah there's no reason this has to be 1000. just needs to be higher than the max queue attempts. let's go with 256

Comment on lines +529 to +568
it('should post step_failed and re-queue workflow when delivery count exceeds max', async () => {
  const result = await capturedHandler(
    createMessage(),
    { ...createMetadata('myStep'), attempt: 65 }
  );

  expect(result).toBeUndefined();
  expect(mockEventsCreate).toHaveBeenCalledWith(
    'wrun_test123',
    expect.objectContaining({
      eventType: 'step_failed',
      correlationId: 'step_abc',
    })
  );
  expect(mockQueueMessage).toHaveBeenCalled();
  expect(mockRuntimeLogger.error).toHaveBeenCalledWith(
    expect.stringContaining('exceeded max deliveries'),
    expect.objectContaining({ workflowRunId: 'wrun_test123' })
  );
});

it('should consume message silently when step_failed fails with EntityConflictError', async () => {
  mockEventsCreate.mockRejectedValue(
    new EntityConflictError('Step already completed')
  );

  const result = await capturedHandler(
    createMessage(),
    { ...createMetadata('myStep'), attempt: 65 }
  );

  expect(result).toBeUndefined();
  expect(mockStepFn).not.toHaveBeenCalled();
});

it('should not trigger max deliveries check when under limit', async () => {
  const result = await capturedHandler(
    createMessage(),
    { ...createMetadata('myStep'), attempt: 64 }
  );
@github-actions
Contributor

github-actions bot commented Mar 20, 2026

📊 Benchmark Results

📈 Comparing against baseline from main branch. Green 🟢 = faster, Red 🔺 = slower.

workflow with no steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Next.js (Turbopack) 0.043s 1.006s 0.963s 10 1.00x
💻 Local Nitro 0.045s (+18.4% 🔺) 1.005s (~) 0.960s 10 1.06x
🌐 Redis Next.js (Turbopack) 0.057s 1.006s 0.950s 10 1.32x
🐘 Postgres Next.js (Turbopack) 0.060s 1.011s 0.951s 10 1.40x
🐘 Postgres Express 0.062s (+11.2% 🔺) 1.011s (~) 0.949s 10 1.45x
🐘 Postgres Nitro 0.065s (-6.9% 🟢) 1.011s (~) 0.946s 10 1.52x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 0.490s (-25.8% 🟢) 2.611s (+9.2% 🔺) 2.121s 10 1.00x
▲ Vercel Nitro 0.515s (-11.8% 🟢) 3.160s (+48.4% 🔺) 2.645s 10 1.05x
▲ Vercel Express 0.578s (-16.4% 🟢) 2.609s (-11.3% 🟢) 2.032s 10 1.18x

🔍 Observability: Next.js (Turbopack) | Nitro | Express

workflow with 1 step

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Next.js (Turbopack) 1.103s 2.006s 0.903s 10 1.00x
💻 Local Nitro 1.126s (+3.0%) 2.006s (~) 0.880s 10 1.02x
🌐 Redis Next.js (Turbopack) 1.129s 2.006s 0.878s 10 1.02x
🐘 Postgres Next.js (Turbopack) 1.141s 2.009s 0.868s 10 1.03x
🐘 Postgres Nitro 1.141s (-1.8%) 2.013s (~) 0.872s 10 1.03x
🐘 Postgres Express 1.141s (-1.0%) 2.010s (~) 0.869s 10 1.04x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.170s (+4.0%) 4.093s (+21.5% 🔺) 1.923s 10 1.00x
▲ Vercel Express 2.184s (+3.6%) 4.053s (+8.6% 🔺) 1.868s 10 1.01x
▲ Vercel Next.js (Turbopack) 2.238s (+8.5% 🔺) 3.761s (+2.6%) 1.523s 10 1.03x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 10 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Next.js (Turbopack) 10.652s 11.024s 0.373s 3 1.00x
🌐 Redis Next.js (Turbopack) 10.777s 11.023s 0.246s 3 1.01x
🐘 Postgres Next.js (Turbopack) 10.898s 11.024s 0.126s 3 1.02x
🐘 Postgres Express 10.923s (~) 11.026s (~) 0.104s 3 1.03x
💻 Local Nitro 10.928s (+2.9%) 11.024s (~) 0.097s 3 1.03x
🐘 Postgres Nitro 10.955s (-1.7%) 11.024s (-8.5% 🟢) 0.069s 3 1.03x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 17.667s (-1.3%) 19.080s (-2.8%) 1.413s 2 1.00x
▲ Vercel Express 17.831s (-8.0% 🟢) 19.299s (-9.0% 🟢) 1.468s 2 1.01x
▲ Vercel Next.js (Turbopack) 18.829s (+9.4% 🔺) 20.417s (+7.2% 🔺) 1.588s 2 1.07x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 25 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Next.js (Turbopack) 14.173s 15.030s 0.856s 4 1.00x
🌐 Redis Next.js (Turbopack) 14.331s 15.029s 0.698s 4 1.01x
🐘 Postgres Next.js (Turbopack) 14.461s 15.028s 0.567s 4 1.02x
🐘 Postgres Express 14.546s (-1.8%) 15.025s (~) 0.479s 4 1.03x
🐘 Postgres Nitro 14.566s (-2.2%) 15.026s (~) 0.460s 4 1.03x
💻 Local Nitro 15.012s (+5.6% 🔺) 15.280s (+1.7%) 0.267s 4 1.06x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 32.630s (~) 34.645s (+1.7%) 2.015s 2 1.00x
▲ Vercel Express 33.584s (~) 35.097s (~) 1.513s 2 1.03x
▲ Vercel Next.js (Turbopack) 33.813s (+0.5%) 34.935s (~) 1.121s 2 1.04x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 50 sequential steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 13.417s 14.027s 0.610s 7 1.00x
🐘 Postgres Next.js (Turbopack) 13.658s 14.022s 0.364s 7 1.02x
🐘 Postgres Nitro 14.008s (-5.7% 🟢) 14.451s (-5.0%) 0.444s 7 1.04x
🐘 Postgres Express 14.393s (~) 15.192s (+1.0%) 0.799s 6 1.07x
💻 Local Next.js (Turbopack) 14.867s 15.027s 0.160s 6 1.11x
💻 Local Nitro 16.750s (+12.5% 🔺) 17.031s (+13.3% 🔺) 0.281s 6 1.25x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 59.616s (-3.4%) 61.536s (-2.1%) 1.920s 2 1.00x
▲ Vercel Next.js (Turbopack) 59.831s (-2.3%) 61.661s (-1.4%) 1.830s 2 1.00x
▲ Vercel Express 63.489s (+4.7%) 65.238s (+3.7%) 1.748s 2 1.06x

🔍 Observability: Nitro | Next.js (Turbopack) | Express

Promise.all with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 1.245s (-4.3%) 2.010s (~) 0.766s 15 1.00x
🐘 Postgres Next.js (Turbopack) 1.245s 2.010s 0.765s 15 1.00x
🐘 Postgres Express 1.293s (+0.9%) 2.011s (~) 0.718s 15 1.04x
🌐 Redis Next.js (Turbopack) 1.329s 2.006s 0.677s 15 1.07x
💻 Local Nitro 1.556s (+6.7% 🔺) 2.006s (~) 0.450s 15 1.25x
💻 Local Next.js (Turbopack) 1.561s 2.073s 0.511s 15 1.25x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.401s (-4.9%) 4.693s (+22.2% 🔺) 2.292s 7 1.00x
▲ Vercel Express 2.985s (+5.0% 🔺) 4.633s (+4.7%) 1.648s 7 1.24x
▲ Vercel Next.js (Turbopack) 3.065s (+18.3% 🔺) 4.765s (+16.2% 🔺) 1.700s 7 1.28x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

Promise.all with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 2.339s (-4.2%) 3.010s (~) 0.671s 10 1.00x
🐘 Postgres Nitro 2.341s (-6.1% 🟢) 3.010s (~) 0.670s 10 1.00x
🐘 Postgres Next.js (Turbopack) 2.428s 3.012s 0.585s 10 1.04x
🌐 Redis Next.js (Turbopack) 2.569s 3.008s 0.439s 10 1.10x
💻 Local Next.js (Turbopack) 2.730s 3.008s 0.278s 10 1.17x
💻 Local Nitro 2.965s (+17.3% 🔺) 3.453s (+14.8% 🔺) 0.488s 9 1.27x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 2.649s (-18.9% 🟢) 4.187s (-19.0% 🟢) 1.538s 8 1.00x
▲ Vercel Next.js (Turbopack) 3.241s (+10.4% 🔺) 4.723s (+10.1% 🔺) 1.482s 7 1.22x
▲ Vercel Nitro 3.622s (+39.0% 🔺) 5.235s (+42.7% 🔺) 1.613s 6 1.37x

🔍 Observability: Express | Next.js (Turbopack) | Nitro

Promise.all with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 3.472s (-5.3% 🟢) 4.013s (~) 0.541s 8 1.00x
🐘 Postgres Express 3.491s (-2.2%) 4.012s (~) 0.522s 8 1.01x
🐘 Postgres Next.js (Turbopack) 3.651s 4.013s 0.362s 8 1.05x
🌐 Redis Next.js (Turbopack) 4.125s 4.725s 0.600s 7 1.19x
💻 Local Next.js (Turbopack) 7.038s 7.764s 0.726s 4 2.03x
💻 Local Nitro 8.159s (+22.1% 🔺) 9.021s (+28.6% 🔺) 0.861s 4 2.35x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 3.190s (+1.2%) 4.790s (-1.7%) 1.601s 7 1.00x
▲ Vercel Nitro 3.801s (+12.2% 🔺) 5.785s (+20.7% 🔺) 1.984s 6 1.19x
▲ Vercel Next.js (Turbopack) 4.440s (+6.6% 🔺) 6.146s (+10.7% 🔺) 1.706s 6 1.39x

🔍 Observability: Express | Nitro | Next.js (Turbopack)

Promise.race with 10 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 1.237s 2.010s 0.772s 15 1.00x
🐘 Postgres Express 1.261s (-2.3%) 2.010s (~) 0.749s 15 1.02x
🐘 Postgres Nitro 1.265s (-2.0%) 2.011s (~) 0.746s 15 1.02x
🌐 Redis Next.js (Turbopack) 1.299s 2.006s 0.707s 15 1.05x
💻 Local Next.js (Turbopack) 1.493s 2.005s 0.513s 15 1.21x
💻 Local Nitro 1.566s (+7.0% 🔺) 2.073s (+3.4%) 0.507s 15 1.27x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.146s (-6.9% 🟢) 4.254s (+13.2% 🔺) 2.107s 8 1.00x
▲ Vercel Express 2.627s (+16.0% 🔺) 4.465s (+14.3% 🔺) 1.839s 7 1.22x
▲ Vercel Next.js (Turbopack) 3.812s (+47.4% 🔺) 5.535s (+46.4% 🔺) 1.723s 6 1.78x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

Promise.race with 25 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Express 2.325s (-5.2% 🟢) 3.010s (~) 0.685s 10 1.00x
🐘 Postgres Nitro 2.374s (-4.8%) 3.012s (~) 0.637s 10 1.02x
🐘 Postgres Next.js (Turbopack) 2.392s 3.011s 0.619s 10 1.03x
🌐 Redis Next.js (Turbopack) 2.548s 3.007s 0.460s 10 1.10x
💻 Local Next.js (Turbopack) 2.648s 3.007s 0.359s 10 1.14x
💻 Local Nitro 3.077s (+7.3% 🔺) 3.759s (+20.9% 🔺) 0.682s 8 1.32x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 2.885s (~) 4.398s (+2.2%) 1.513s 7 1.00x
▲ Vercel Express 2.923s (+22.3% 🔺) 4.612s (+16.8% 🔺) 1.689s 7 1.01x
▲ Vercel Next.js (Turbopack) 2.951s (-1.5%) 4.556s (+6.2% 🔺) 1.605s 7 1.02x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

Promise.race with 50 concurrent steps

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Nitro 3.475s (-4.7%) 4.014s (~) 0.539s 8 1.00x
🐘 Postgres Express 3.481s (-3.2%) 4.018s (~) 0.537s 8 1.00x
🐘 Postgres Next.js (Turbopack) 3.635s 4.014s 0.379s 8 1.05x
🌐 Redis Next.js (Turbopack) 4.183s 4.868s 0.684s 7 1.20x
💻 Local Next.js (Turbopack) 7.324s 8.016s 0.692s 4 2.11x
💻 Local Nitro 9.073s (+22.7% 🔺) 9.523s (+18.8% 🔺) 0.450s 4 2.61x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 3.257s (+5.0% 🔺) 5.102s (+8.3% 🔺) 1.845s 7 1.00x
▲ Vercel Express 3.389s (+8.4% 🔺) 5.053s (+7.9% 🔺) 1.664s 6 1.04x
▲ Vercel Next.js (Turbopack) 4.038s (+12.6% 🔺) 5.487s (+11.4% 🔺) 1.450s 6 1.24x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 10 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Next.js (Turbopack) 0.678s 1.004s 0.326s 60 1.00x
🌐 Redis Next.js (Turbopack) 0.709s 1.005s 0.295s 60 1.05x
🐘 Postgres Next.js (Turbopack) 0.775s 1.024s 0.249s 59 1.14x
🐘 Postgres Nitro 0.824s (-14.9% 🟢) 1.008s (-18.4% 🟢) 0.183s 60 1.22x
🐘 Postgres Express 0.836s (-8.6% 🟢) 1.025s (-5.1% 🟢) 0.188s 59 1.23x
💻 Local Nitro 0.971s (+44.1% 🔺) 1.057s (+5.3% 🔺) 0.086s 57 1.43x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 10.214s (+2.7%) 12.110s (-2.0%) 1.896s 5 1.00x
▲ Vercel Express 10.898s (+1.4%) 13.051s (+5.9% 🔺) 2.153s 5 1.07x
▲ Vercel Nitro 11.299s (+12.2% 🔺) 13.400s (+17.3% 🔺) 2.101s 5 1.11x

🔍 Observability: Next.js (Turbopack) | Express | Nitro

workflow with 25 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 1.674s 2.006s 0.332s 45 1.00x
🐘 Postgres Next.js (Turbopack) 1.889s 2.077s 0.189s 44 1.13x
🐘 Postgres Nitro 1.947s (-15.8% 🟢) 2.205s (-26.8% 🟢) 0.257s 41 1.16x
🐘 Postgres Express 1.992s (-10.5% 🟢) 2.378s (-21.1% 🟢) 0.386s 38 1.19x
💻 Local Next.js (Turbopack) 2.188s 3.007s 0.819s 30 1.31x
💻 Local Nitro 2.977s (+35.7% 🔺) 3.257s (+8.3% 🔺) 0.281s 28 1.78x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Next.js (Turbopack) 33.772s (+4.1%) 35.926s (+6.8% 🔺) 2.153s 3 1.00x
▲ Vercel Nitro 36.133s (+20.4% 🔺) 37.778s (+18.5% 🔺) 1.645s 3 1.07x
▲ Vercel Express 36.364s (+20.5% 🔺) 38.136s (+19.2% 🔺) 1.772s 3 1.08x

🔍 Observability: Next.js (Turbopack) | Nitro | Express

workflow with 50 sequential data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🌐 Redis 🥇 Next.js (Turbopack) 3.445s 4.076s 0.631s 30 1.00x
🐘 Postgres Next.js (Turbopack) 3.778s 4.011s 0.234s 30 1.10x
🐘 Postgres Express 4.044s (-10.0% 🟢) 4.631s (-7.7% 🟢) 0.586s 26 1.17x
🐘 Postgres Nitro 4.086s (-14.7% 🟢) 4.705s (-7.7% 🟢) 0.618s 26 1.19x
💻 Local Next.js (Turbopack) 7.194s 7.764s 0.570s 16 2.09x
💻 Local Nitro 9.162s (+26.5% 🔺) 9.864s (+23.1% 🔺) 0.701s 13 2.66x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 90.243s (+4.7%) 91.975s (+5.4% 🔺) 1.732s 2 1.00x
▲ Vercel Express 92.162s (+2.5%) 93.461s (+1.9%) 1.298s 2 1.02x
▲ Vercel Next.js (Turbopack) 96.144s (+11.6% 🔺) 97.815s (+10.5% 🔺) 1.670s 2 1.07x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 10 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 0.255s 1.008s 0.753s 60 1.00x
🐘 Postgres Express 0.270s (-8.9% 🟢) 1.008s (~) 0.738s 60 1.06x
🐘 Postgres Nitro 0.273s (-14.1% 🟢) 1.008s (~) 0.735s 60 1.07x
🌐 Redis Next.js (Turbopack) 0.394s 1.004s 0.611s 60 1.54x
💻 Local Next.js (Turbopack) 0.558s 1.004s 0.447s 60 2.19x
💻 Local Nitro 0.584s (+4.0%) 1.004s (~) 0.421s 60 2.29x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 1.786s (-9.1% 🟢) 3.897s (+8.5% 🔺) 2.112s 16 1.00x
▲ Vercel Nitro 1.894s (+12.7% 🔺) 3.877s (+17.4% 🔺) 1.983s 16 1.06x
▲ Vercel Next.js (Turbopack) 2.243s (+18.6% 🔺) 4.198s (+11.9% 🔺) 1.955s 15 1.26x

🔍 Observability: Express | Nitro | Next.js (Turbopack)

workflow with 25 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 0.482s 1.008s 0.525s 90 1.00x
🐘 Postgres Express 0.499s (-8.8% 🟢) 1.007s (~) 0.508s 90 1.03x
🐘 Postgres Nitro 0.503s (-13.7% 🟢) 1.008s (~) 0.505s 90 1.04x
🌐 Redis Next.js (Turbopack) 1.194s 2.006s 0.812s 45 2.48x
💻 Local Nitro 2.478s (+5.7% 🔺) 3.008s (~) 0.530s 30 5.14x
💻 Local Next.js (Turbopack) 2.541s 3.008s 0.468s 30 5.27x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 3.060s (+1.9%) 5.025s (+13.0% 🔺) 1.965s 18 1.00x
▲ Vercel Express 3.316s (+18.3% 🔺) 5.177s (+10.5% 🔺) 1.861s 18 1.08x
▲ Vercel Next.js (Turbopack) 4.446s (+37.1% 🔺) 6.196s (+31.2% 🔺) 1.750s 15 1.45x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

workflow with 50 concurrent data payload steps (10KB)

💻 Local Development

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
🐘 Postgres 🥇 Next.js (Turbopack) 0.763s 1.008s 0.245s 120 1.00x
🐘 Postgres Express 0.781s (-15.0% 🟢) 1.008s (-18.7% 🟢) 0.227s 120 1.02x
🐘 Postgres Nitro 0.794s (-18.0% 🟢) 1.009s (-27.9% 🟢) 0.215s 119 1.04x
🌐 Redis Next.js (Turbopack) 2.764s 3.033s 0.269s 40 3.62x
💻 Local Next.js (Turbopack) 9.718s 10.271s 0.553s 12 12.73x
💻 Local Nitro 11.258s (+13.4% 🔺) 11.848s (+14.4% 🔺) 0.589s 11 14.75x
💻 Local Express ⚠️ missing - - - -

▲ Production (Vercel)

World Framework Workflow Time Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Express 7.169s (+8.0% 🔺) 9.096s (+3.9%) 1.928s 14 1.00x
▲ Vercel Next.js (Turbopack) 7.859s (-75.5% 🟢) 9.783s (-71.0% 🟢) 1.924s 13 1.10x
▲ Vercel Nitro 8.346s (+15.4% 🔺) 10.090s (+16.3% 🔺) 1.744s 12 1.16x

🔍 Observability: Express | Next.js (Turbopack) | Nitro

Stream Benchmarks (includes TTFB metrics)
workflow with stream

💻 Local Development

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
💻 Local 🥇 Next.js (Turbopack) 0.141s 1.000s 0.010s 1.015s 0.874s 10 1.00x
🌐 Redis Next.js (Turbopack) 0.175s 1.000s 0.002s 1.008s 0.833s 10 1.24x
🐘 Postgres Next.js (Turbopack) 0.194s 1.001s 0.001s 1.013s 0.818s 10 1.37x
💻 Local Nitro 0.202s (+45.4% 🔺) 1.003s (~) 0.012s (+22.7% 🔺) 1.018s (~) 0.816s 10 1.43x
🐘 Postgres Nitro 0.207s (-13.8% 🟢) 0.996s (~) 0.001s (-27.8% 🟢) 1.011s (~) 0.804s 10 1.46x
🐘 Postgres Express 0.225s (+1.1%) 0.993s (~) 0.001s (+7.7% 🔺) 1.011s (~) 0.786s 10 1.59x
💻 Local Express ⚠️ missing - - - - -

▲ Production (Vercel)

World Framework Workflow Time TTFB Slurp Wall Time Overhead Samples vs Fastest
▲ Vercel 🥇 Nitro 1.620s (+4.8%) 2.946s (+12.7% 🔺) 0.251s (-45.1% 🟢) 3.994s (+11.1% 🔺) 2.374s 10 1.00x
▲ Vercel Express 1.656s (-22.6% 🟢) 2.916s (-12.9% 🟢) 0.367s (+224.8% 🔺) 3.908s (-6.4% 🟢) 2.252s 10 1.02x
▲ Vercel Next.js (Turbopack) 1.664s (+1.7%) 2.778s (-4.8%) 0.502s (+25.7% 🔺) 3.891s (~) 2.227s 10 1.03x

🔍 Observability: Nitro | Express | Next.js (Turbopack)

stream pipeline with 5 transform steps (1MB)

💻 Local Development

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 🌐 Redis | 🥇 Next.js (Turbopack) | 0.492s | 1.001s | 0.005s | 1.013s | 0.521s | 60 | 1.00x |
| 💻 Local | Next.js (Turbopack) | 0.573s | 1.007s | 0.009s | 1.023s | 0.449s | 59 | 1.17x |
| 🐘 Postgres | Express | 0.604s (-14.1% 🟢) | 1.005s (~) | 0.005s (-18.3% 🟢) | 1.025s (-0.8%) | 0.421s | 59 | 1.23x |
| 🐘 Postgres | Next.js (Turbopack) | 0.607s | 1.008s | 0.005s | 1.024s | 0.418s | 59 | 1.23x |
| 🐘 Postgres | Nitro | 0.638s (-12.0% 🟢) | 1.020s (+1.5%) | 0.004s (-26.6% 🟢) | 1.042s (+1.1%) | 0.404s | 58 | 1.30x |
| 💻 Local | Nitro | 0.729s (+28.4% 🔺) | 1.009s (~) | 0.009s (-5.1% 🟢) | 1.022s (~) | 0.293s | 59 | 1.48x |
| 💻 Local | Express | ⚠️ missing | - | - | - | - | - | - |

▲ Production (Vercel)

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ▲ Vercel | 🥇 Nitro | 4.497s (-15.8% 🟢) | 5.862s (-4.6%) | 0.363s (-42.3% 🟢) | 7.038s (-6.7% 🟢) | 2.541s | 9 | 1.00x |
| ▲ Vercel | Express | 4.532s (~) | 6.077s (+8.6% 🔺) | 0.196s (-17.2% 🟢) | 6.920s (+2.7%) | 2.388s | 9 | 1.01x |
| ▲ Vercel | Next.js (Turbopack) | 4.867s (+6.6% 🔺) | 5.982s (+7.0% 🔺) | 0.389s (+72.9% 🔺) | 7.007s (+8.4% 🔺) | 2.140s | 9 | 1.08x |

🔍 Observability: Nitro | Express | Next.js (Turbopack)

10 parallel streams (1MB each)

💻 Local Development

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 🌐 Redis | 🥇 Next.js (Turbopack) | 0.894s | 0.999s | 0.000s | 1.004s | 0.110s | 60 | 1.00x |
| 🐘 Postgres | Next.js (Turbopack) | 0.961s | 1.254s | 0.000s | 1.278s | 0.317s | 47 | 1.07x |
| 🐘 Postgres | Nitro | 0.975s (-16.6% 🟢) | 1.196s (-40.1% 🟢) | 0.000s (-69.4% 🟢) | 1.243s (-38.6% 🟢) | 0.267s | 49 | 1.09x |
| 🐘 Postgres | Express | 0.980s (-12.7% 🟢) | 1.328s (-25.4% 🟢) | 0.000s (-24.4% 🟢) | 1.343s (-25.8% 🟢) | 0.363s | 45 | 1.10x |
| 💻 Local | Next.js (Turbopack) | 1.226s | 2.016s | 0.000s | 2.021s | 0.795s | 30 | 1.37x |
| 💻 Local | Nitro | 1.236s (+9.0% 🔺) | 2.021s (~) | 0.000s (~) | 2.024s (~) | 0.787s | 30 | 1.38x |
| 💻 Local | Express | ⚠️ missing | - | - | - | - | - | - |

▲ Production (Vercel)

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ▲ Vercel | 🥇 Express | 2.890s (-9.8% 🟢) | 4.245s (-1.5%) | 0.000s (-100.0% 🟢) | 5.053s (-1.7%) | 2.163s | 12 | 1.00x |
| ▲ Vercel | Nitro | 3.312s (+9.5% 🔺) | 4.315s (+9.9% 🔺) | 0.000s (-46.2% 🟢) | 5.039s (+13.9% 🔺) | 1.727s | 13 | 1.15x |
| ▲ Vercel | Next.js (Turbopack) | 3.659s (+3.9%) | 4.873s (+4.7%) | 0.000s (+Infinity% 🔺) | 5.707s (+9.6% 🔺) | 2.048s | 11 | 1.27x |

🔍 Observability: Express | Nitro | Next.js (Turbopack)

fan-out fan-in 10 streams (1MB each)

💻 Local Development

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 🌐 Redis | 🥇 Next.js (Turbopack) | 1.567s | 2.035s | 0.000s | 2.040s | 0.473s | 30 | 1.00x |
| 🐘 Postgres | Express | 1.745s (-15.2% 🟢) | 2.098s (-15.3% 🟢) | 0.000s (+Infinity% 🔺) | 2.115s (-15.4% 🟢) | 0.370s | 29 | 1.11x |
| 🐘 Postgres | Nitro | 1.750s (-22.7% 🟢) | 2.138s (-27.6% 🟢) | 0.000s (-62.5% 🟢) | 2.155s (-27.4% 🟢) | 0.405s | 28 | 1.12x |
| 🐘 Postgres | Next.js (Turbopack) | 1.863s | 2.144s | 0.000s | 2.153s | 0.290s | 28 | 1.19x |
| 💻 Local | Nitro | 3.515s (+1.6%) | 4.032s (~) | 0.001s (-15.4% 🟢) | 4.036s (~) | 0.521s | 15 | 2.24x |
| 💻 Local | Next.js (Turbopack) | 3.696s | 4.228s | 0.001s | 4.234s | 0.538s | 15 | 2.36x |
| 💻 Local | Express | ⚠️ missing | - | - | - | - | - | - |

▲ Production (Vercel)

| World | Framework | Workflow Time | TTFB | Slurp | Wall Time | Overhead | Samples | vs Fastest |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| ▲ Vercel | 🥇 Nitro | 4.053s (+17.1% 🔺) | 5.087s (+21.4% 🔺) | 0.000s (+Infinity% 🔺) | 5.740s (+20.9% 🔺) | 1.686s | 11 | 1.00x |
| ▲ Vercel | Express | 4.384s (+11.1% 🔺) | 5.628s (+12.9% 🔺) | 0.000s (NaN%) | 6.496s (+10.9% 🔺) | 2.113s | 10 | 1.08x |
| ▲ Vercel | Next.js (Turbopack) | 5.113s (+14.2% 🔺) | 5.836s (+7.8% 🔺) | 0.000s (+10.0% 🔺) | 6.596s (+10.9% 🔺) | 1.483s | 10 | 1.26x |

🔍 Observability: Nitro | Express | Next.js (Turbopack)

Summary

Fastest Framework by World

Winner determined by most benchmark wins

| World | 🥇 Fastest Framework | Wins |
| --- | --- | --- |
| 💻 Local | Next.js (Turbopack) | 18/21 |
| 🐘 Postgres | Next.js (Turbopack) | 14/21 |
| ▲ Vercel | Nitro | 13/21 |

Fastest World by Framework

Winner determined by most benchmark wins

| Framework | 🥇 Fastest World | Wins |
| --- | --- | --- |
| Express | 🐘 Postgres | 19/21 |
| Next.js (Turbopack) | 🐘 Postgres | 9/21 |
| Nitro | 🐘 Postgres | 16/21 |

Column Definitions
  • Workflow Time: Runtime reported by workflow (completedAt - createdAt) - primary metric
  • TTFB: Time to First Byte - time from workflow start until first stream byte received (stream benchmarks only)
  • Slurp: Time from first byte to complete stream consumption (stream benchmarks only)
  • Wall Time: Total testbench time (trigger workflow + poll for result)
  • Overhead: Testbench overhead (Wall Time - Workflow Time)
  • Samples: Number of benchmark iterations run
  • vs Fastest: How much slower compared to the fastest configuration for this benchmark
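
The derived columns relate arithmetically. As a quick illustration (a sketch with made-up numbers, not values taken from the tables above):

```typescript
// Illustrative only: how Overhead and "vs Fastest" are derived
// from Workflow Time and Wall Time. Numbers are made up.
interface Sample {
  workflowTime: number; // seconds, completedAt - createdAt
  wallTime: number;     // seconds, trigger workflow + poll for result
}

// Overhead = Wall Time - Workflow Time (testbench cost around the run)
function overhead(s: Sample): number {
  return s.wallTime - s.workflowTime;
}

// vs Fastest = ratio of workflow times against the fastest configuration
function vsFastest(s: Sample, fastest: Sample): number {
  return s.workflowTime / fastest.workflowTime;
}

const fastest: Sample = { workflowTime: 0.5, wallTime: 1.0 };
const slower: Sample = { workflowTime: 1.0, wallTime: 1.6 };
```

So `slower` would be reported with a "vs Fastest" of 2.00x and an Overhead of 0.6s.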

Worlds:

  • 💻 Local: In-memory filesystem world (local development)
  • 🐘 Postgres: PostgreSQL database world (local development)
  • ▲ Vercel: Vercel production/preview deployment
  • 🌐 Turso: Community world (local development)
  • 🌐 MongoDB: Community world (local development)
  • 🌐 Redis: Community world (local development)
  • 🌐 Jazz: Community world (local development)

📋 View full workflow run


// --- Max delivery check ---
// Enforce max delivery limit before any infrastructure calls.
// This prevents runaway steps from consuming infinite queue deliveries.
pranaygp (Collaborator, Author) suggested a change:

// This prevents runaway steps from consuming infinite queue deliveries.
// At this point, we want to do the minimal amount of work (no fetching
// of the step details, etc.). We simply attempt to mark the step as failed
// and enqueue the workflow once; if either of those fails, the message
// is still consumed, but with adequate logging that an error occurred.
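
The minimal-work flow described above could be sketched like this; the handler shape and names such as `failStep` are hypothetical, not the actual codebase API:

```typescript
// Sketch only: handler-level enforcement of the max delivery limit.
// `failStep` and the handler signature are invented for illustration.
const MAX_QUEUE_DELIVERIES = 64;

interface QueueMetadata {
  attempt: number; // delivery count supplied by the queue
}

type FailStep = (stepId: string, code: string) => Promise<void>;

async function handleStepMessage(
  meta: QueueMetadata,
  stepId: string,
  failStep: FailStep,
  log: (msg: string) => void,
): Promise<"consumed" | "processed"> {
  if (meta.attempt > MAX_QUEUE_DELIVERIES) {
    // Minimal work: no fetching of step details. Try once to mark the
    // step failed; if even that fails, log and consume the message anyway.
    try {
      await failStep(stepId, "MAX_DELIVERIES_EXCEEDED");
    } catch (err) {
      log(
        `Failed to post step_failed for max deliveries exceeded, ` +
          `consuming message anyway: ${err}`,
      );
    }
    return "consumed";
  }
  // ...normal step execution would go here...
  return "processed";
}
```

Either way the message is consumed once the limit is exceeded, so a persistent outage cannot keep the queue redelivering forever.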

Comment on lines +145 to +147
'Failed to post run_failed for max deliveries exceeded, consuming message anyway',
{
workflowRunId: runId,
pranaygp (Collaborator, Author) commented:

Like the step error message, this should also be more verbose and explain that a persistent outage is preventing us from failing the run normally, and so on.

EntityConflictError.is(err) ||
RunExpiredError.is(err)
) {
// Run already finished, consume the message
pranaygp (Collaborator, Author) suggested a change:

// Run already finished, consume the message silently

return;
}
runtimeLogger.error(
'Failed to post run_failed for max deliveries exceeded, consuming message anyway',
pranaygp (Collaborator, Author) commented:

Like the step error message, this should also be more verbose and explain that a persistent outage is preventing us from failing the run normally, and so on.


// --- Max delivery check ---
// Enforce max delivery limit before any infrastructure calls.
// This prevents runaway workflows from consuming infinite queue deliveries.
pranaygp (Collaborator, Author) suggested a change:

// This prevents runaway workflows from consuming infinite queue deliveries.
// At this point, we want to do the minimal amount of work (no fetching
// of the workflow events, etc.). We simply attempt to mark the run as failed,
// and if that fails, the message is still consumed but with adequate logging
// that an error occurred preventing us from failing the run.

);

// Small backoff to avoid tight retry loops on persistent failures
await setTimeout(Math.min(1000, 100 * (attempt + 1)));
A reviewer (Member) commented:

Let's make this 5s to mimic VQS behavior
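
What the requested change amounts to, as a sketch (the constant name below is hypothetical):

```typescript
// Hypothetical local-world retry delay. The original scaled with the
// attempt number and capped at 1s; the review asks for a flat 5s delay
// to mimic VQS's retryAfterSeconds: 5.
const RETRY_AFTER_MS = 5_000;

function retryDelayMs(attempt: number): number {
  // Before: Math.min(1000, 100 * (attempt + 1))
  // After: flat 5s regardless of attempt number
  void attempt;
  return RETRY_AFTER_MS;
}
```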

@@ -0,0 +1 @@
export const MAX_QUEUE_DELIVERIES = 64;
A reviewer (Member) commented:

Should be 48, with a comment referencing VQS behavior.

@VaguelySerious (Member) left a review:

LGTM. Weirdly, unit tests are failing with Serde stuff; seems unrelated.

pranaygp and others added 3 commits March 23, 2026 16:58
Replace VQS maxDeliveries cap with handler-level enforcement. Handlers
now gracefully fail runs/steps after excessive queue redeliveries,
preventing "phantom stuck" runs.

- Add MAX_QUEUE_DELIVERIES constant (64) and enforce in both workflow
  and step handlers with run_failed/step_failed events
- Remove maxDeliveries from VQS trigger configs (builders + sveltekit)
- Improve world-local queue: safety limit loop, structured logging
  with runId/stepId, backoff delay on failures
- Add MAX_DELIVERIES_EXCEEDED error code

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- Expand max-delivery comments explaining minimal-work approach
- Make workflow handler error message verbose (matching step handler)
- Fix comment: "consume the message silently"
- Reduce local queue safety limit from 1000 to 256

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
VQS uses linear 5s backoff for attempts 1-32, then exponential capped
at 2h. At 48 attempts total elapsed time is ~20h, safely under the
24h message visibility limit. Local world now uses 5s linear backoff
to approximate VQS timing.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
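
A rough sanity check of the elapsed-time claim above. The exact VQS schedule isn't given here, so the model below (5s linear for attempts 1-32, then doubling, capped at 2h) is an assumption; under it, 48 attempts total roughly 15h, while the commit quotes ~20h from the real schedule — either figure stays under the 24h visibility limit:

```typescript
// Estimate total elapsed time for 48 deliveries under an ASSUMED schedule:
// attempts 1-32 wait 5s each, later attempts double from 5s, capped at 2h.
const LINEAR_ATTEMPTS = 32;
const LINEAR_DELAY_S = 5;
const MAX_DELAY_S = 2 * 60 * 60; // 2h cap

function delaySeconds(attempt: number): number {
  if (attempt <= LINEAR_ATTEMPTS) return LINEAR_DELAY_S;
  const doublings = attempt - LINEAR_ATTEMPTS; // assumed doubling schedule
  return Math.min(LINEAR_DELAY_S * 2 ** doublings, MAX_DELAY_S);
}

function totalElapsedSeconds(attempts: number): number {
  let total = 0;
  for (let a = 1; a <= attempts; a++) total += delaySeconds(a);
  return total;
}

const hours = totalElapsedSeconds(48) / 3600;
```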