
Conversation

Collaborator

@pranaygp pranaygp commented Nov 29, 2025

Adds simple Vitest-based performance benchmarking for the local world, Postgres world, and Vercel world, along with a GitHub Action to track results.

Every PR will now run benchmarks across a few frameworks and all worlds. Once complete, a summary is posted with the results: https://github.com/vercel/workflow/actions/runs/19782846073/attempts/1#summary-56685707991



changeset-bot bot commented Nov 29, 2025

⚠️ No Changeset found

Latest commit: a9858e9

Merging this PR will not cause a version bump for any packages. If these changes should not result in a new version, you're good to go. If these changes should result in a version bump, you need to add a changeset.

This PR includes no changesets

When changesets are added to this PR, you'll see the packages that this PR includes changesets for and the associated semver types


Contributor

vercel bot commented Nov 29, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

| Project | Deployment | Preview | Comments | Updated (UTC) |
| --- | --- | --- | --- | --- |
| example-nextjs-workflow-turbopack | Ready | Preview | Comment | Nov 29, 2025 11:15am |
| example-nextjs-workflow-webpack | Ready | Preview | Comment | Nov 29, 2025 11:15am |
| example-workflow | Ready | Preview | Comment | Nov 29, 2025 11:15am |
| workbench-express-workflow | Ready | Preview | Comment | Nov 29, 2025 11:15am |
| workbench-hono-workflow | Ready | Preview | Comment | Nov 29, 2025 11:15am |
| workbench-nitro-workflow | Ready | Preview | Comment | Nov 29, 2025 11:15am |
| workbench-nuxt-workflow | Ready | Preview | Comment | Nov 29, 2025 11:15am |
| workbench-sveltekit-workflow | Ready | Preview | Comment | Nov 29, 2025 11:15am |
| workbench-vite-workflow | Ready | Preview | Comment | Nov 29, 2025 11:15am |
| workflow-docs | Ready | Preview | Comment | Nov 29, 2025 11:15am |

Collaborator Author

This stack of pull requests is managed by Graphite. Learn more about stacking.

@pranaygp pranaygp marked this pull request as draft November 29, 2025 08:13
@pranaygp pranaygp force-pushed the pranaygp/11-28-add_benchmarking branch from c84c8bc to 13b0aa9 Compare November 29, 2025 08:13
@pranaygp pranaygp force-pushed the pranaygp/11-28-add_benchmarking branch from 13b0aa9 to 441623c Compare November 29, 2025 08:29
Contributor

@vercel vercel bot left a comment


Additional Suggestions:

  1. The GET handler is missing workflow metadata headers required by the benchmarking system to calculate execution times.
View Details
📝 Patch Details
diff --git a/workbench/sveltekit/src/routes/api/trigger/+server.ts b/workbench/sveltekit/src/routes/api/trigger/+server.ts
index f1017f6..77fab65 100644
--- a/workbench/sveltekit/src/routes/api/trigger/+server.ts
+++ b/workbench/sveltekit/src/routes/api/trigger/+server.ts
@@ -99,13 +99,25 @@ export const GET: RequestHandler = async ({ request }) => {
     const run = getRun(runId);
     const returnValue = await run.returnValue;
     console.log('Return value:', returnValue);
+
+    // Include run metadata in headers
+    const [createdAt, startedAt, completedAt] = await Promise.all([
+      run.createdAt,
+      run.startedAt,
+      run.completedAt,
+    ]);
+    const headers: HeadersInit =
+      returnValue instanceof ReadableStream
+        ? { 'Content-Type': 'application/octet-stream' }
+        : {};
+
+    headers['X-Workflow-Run-Created-At'] = createdAt?.toISOString() || '';
+    headers['X-Workflow-Run-Started-At'] = startedAt?.toISOString() || '';
+    headers['X-Workflow-Run-Completed-At'] = completedAt?.toISOString() || '';
+
     return returnValue instanceof ReadableStream
-      ? new Response(returnValue, {
-          headers: {
-            'Content-Type': 'application/octet-stream',
-          },
-        })
-      : Response.json(returnValue);
+      ? new Response(returnValue, { headers })
+      : Response.json(returnValue, { headers });
   } catch (error) {
     if (error instanceof Error) {
       if (WorkflowRunNotCompletedError.is(error)) {

Analysis

Missing workflow metadata headers in SvelteKit GET handler breaks benchmark timing calculations

What fails: SvelteKit's GET handler in /workbench/sveltekit/src/routes/api/trigger/+server.ts does not extract or set workflow metadata headers (X-Workflow-Run-Created-At, X-Workflow-Run-Started-At, X-Workflow-Run-Completed-At), preventing the benchmark system from calculating execution times.

How to reproduce:

  1. Run the benchmark suite against the SvelteKit workbench: pnpm bench
  2. Inspect the resulting bench-timings-*.json file: the executionTimeMs field will be missing/undefined for SvelteKit workflows

Result: Benchmark records have null createdAt/completedAt timestamps and no executionTimeMs calculated. The benchmark code in packages/core/e2e/bench.bench.ts (lines 79-85) extracts these headers and uses them to calculate execution time (lines 163-167), but without the headers, executionTimeMs remains undefined.

Expected: All framework implementations (Express, Hono, Next.js-turbopack, Nitro-v3) set these headers in their GET handler responses, allowing the benchmark to properly calculate executionTimeMs = completedAt - createdAt.

Fix implemented: Added the same header extraction and response code to SvelteKit's GET handler, extracting run.createdAt, run.startedAt, and run.completedAt timestamps and setting them as ISO string headers in the response, matching the pattern used in all other framework implementations.
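On the consuming side, the extraction-and-subtraction described above can be sketched roughly as follows (illustrative names only; the actual logic lives in packages/core/e2e/bench.bench.ts):

```typescript
// Hypothetical sketch of the benchmark-side timing extraction.
// `extractTiming` is an illustrative name, not the real function.
function extractTiming(headers: Headers): {
  createdAt: Date | null;
  completedAt: Date | null;
  executionTimeMs?: number;
} {
  // Parse an ISO timestamp header, or null if the header is absent/empty
  const parse = (name: string): Date | null => {
    const value = headers.get(name);
    return value ? new Date(value) : null;
  };
  const createdAt = parse('X-Workflow-Run-Created-At');
  const completedAt = parse('X-Workflow-Run-Completed-At');
  // Without both headers, executionTimeMs stays undefined --
  // exactly the failure mode described above.
  const executionTimeMs =
    createdAt && completedAt
      ? completedAt.getTime() - createdAt.getTime()
      : undefined;
  return { createdAt, completedAt, executionTimeMs };
}
```

A handler that omits the headers therefore yields `executionTimeMs: undefined` rather than an error, which is why the gap shows up only in the benchmark output.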

  2. The GET handler is missing workflow metadata headers required by the benchmarking system to calculate execution times.
View Details
📝 Patch Details
diff --git a/workbench/vite/routes/api/trigger.get.ts b/workbench/vite/routes/api/trigger.get.ts
index a7ef468..4487105 100644
--- a/workbench/vite/routes/api/trigger.get.ts
+++ b/workbench/vite/routes/api/trigger.get.ts
@@ -38,13 +38,25 @@ export default async ({ url }: { req: Request; url: URL }) => {
     const run = getRun(runId);
     const returnValue = await run.returnValue;
     console.log('Return value:', returnValue);
+
+    // Include run metadata in headers
+    const [createdAt, startedAt, completedAt] = await Promise.all([
+      run.createdAt,
+      run.startedAt,
+      run.completedAt,
+    ]);
+    const headers: HeadersInit =
+      returnValue instanceof ReadableStream
+        ? { 'Content-Type': 'application/octet-stream' }
+        : {};
+
+    headers['X-Workflow-Run-Created-At'] = createdAt?.toISOString() || '';
+    headers['X-Workflow-Run-Started-At'] = startedAt?.toISOString() || '';
+    headers['X-Workflow-Run-Completed-At'] = completedAt?.toISOString() || '';
+
     return returnValue instanceof ReadableStream
-      ? new Response(returnValue, {
-          headers: {
-            'Content-Type': 'application/octet-stream',
-          },
-        })
-      : Response.json(returnValue);
+      ? new Response(returnValue, { headers })
+      : Response.json(returnValue, { headers });
   } catch (error) {
     if (error instanceof Error) {
       if (WorkflowRunNotCompletedError.is(error)) {

Analysis

Missing workflow timing metadata headers in Vite trigger endpoint

What fails: GET handler in workbench/vite/routes/api/trigger.get.ts returns responses without the X-Workflow-Run-Created-At, X-Workflow-Run-Started-At, and X-Workflow-Run-Completed-At headers required by the benchmarking system.

How to reproduce:

# Build and run the Vite workbench, then inspect the response headers:
curl -s -D - -o /dev/null -H "Accept: application/json" \
  "http://localhost:5173/api/trigger?runId=<completed-run-id>" | grep -i "x-workflow"
# Prints nothing: the timing headers are missing

Result: Response headers are missing timing metadata. The benchmark code in packages/core/e2e/bench.bench.ts (lines 79-85) attempts to extract these headers to calculate execution times, but gets null values instead, leaving executionTimeMs undefined.

Expected: All frameworks (Express, Hono, Next.js Turbopack, Nitro v3) include these headers in their GET response. The Vite implementation should match this pattern to enable benchmark timing calculations.

Reference implementations:

  • workbench/express/src/index.ts - Uses res.setHeader() to set timing headers
  • workbench/hono/src/index.ts - Includes timing headers in response
  • workbench/nextjs-turbopack/app/api/trigger/route.ts - Sets timing headers in response
  • workbench/nitro-v3/routes/api/trigger.get.ts - Sets timing headers in response
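The pattern shared by those reference implementations can be sketched as a small helper (hypothetical names; `RunLike` and `HeaderSettable` are illustrative stand-ins for the run handle and Express's `res`, not actual workbench code):

```typescript
// Illustrative sketch of the Express-style timing-header pattern.
interface RunLike {
  createdAt: Promise<Date | undefined>;
  startedAt: Promise<Date | undefined>;
  completedAt: Promise<Date | undefined>;
}

interface HeaderSettable {
  // Matches the shape of Express's res.setHeader
  setHeader(name: string, value: string): void;
}

async function setTimingHeaders(res: HeaderSettable, run: RunLike): Promise<void> {
  // Resolve all three timestamps in parallel, then emit them as ISO strings.
  // An unresolved timestamp becomes an empty header value.
  const [createdAt, startedAt, completedAt] = await Promise.all([
    run.createdAt,
    run.startedAt,
    run.completedAt,
  ]);
  res.setHeader('X-Workflow-Run-Created-At', createdAt?.toISOString() ?? '');
  res.setHeader('X-Workflow-Run-Started-At', startedAt?.toISOString() ?? '');
  res.setHeader('X-Workflow-Run-Completed-At', completedAt?.toISOString() ?? '');
}
```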
  3. The GET handler is missing workflow metadata headers required by the benchmarking system to calculate execution times.
View Details
📝 Patch Details
diff --git a/workbench/nextjs-webpack/app/api/trigger/route.ts b/workbench/nextjs-webpack/app/api/trigger/route.ts
index d456735..e1e0e0a 100644
--- a/workbench/nextjs-webpack/app/api/trigger/route.ts
+++ b/workbench/nextjs-webpack/app/api/trigger/route.ts
@@ -97,13 +97,25 @@ export async function GET(req: Request) {
     const run = getRun(runId);
     const returnValue = await run.returnValue;
     console.log('Return value:', returnValue);
+
+    // Include run metadata in headers
+    const [createdAt, startedAt, completedAt] = await Promise.all([
+      run.createdAt,
+      run.startedAt,
+      run.completedAt,
+    ]);
+    const headers: HeadersInit =
+      returnValue instanceof ReadableStream
+        ? { 'Content-Type': 'application/octet-stream' }
+        : {};
+
+    headers['X-Workflow-Run-Created-At'] = createdAt?.toISOString() || '';
+    headers['X-Workflow-Run-Started-At'] = startedAt?.toISOString() || '';
+    headers['X-Workflow-Run-Completed-At'] = completedAt?.toISOString() || '';
+
     return returnValue instanceof ReadableStream
-      ? new Response(returnValue, {
-          headers: {
-            'Content-Type': 'application/octet-stream',
-          },
-        })
-      : Response.json(returnValue);
+      ? new Response(returnValue, { headers })
+      : Response.json(returnValue, { headers });
   } catch (error) {
     if (error instanceof Error) {
       if (WorkflowRunNotCompletedError.is(error)) {

Analysis

Missing workflow timing metadata headers in nextjs-webpack trigger endpoint

What fails: The GET handler in workbench/nextjs-webpack/app/api/trigger/route.ts returns workflow results without the X-Workflow-Run-Created-At, X-Workflow-Run-Started-At, and X-Workflow-Run-Completed-At headers required by the benchmarking system, causing executionTimeMs calculations to fail with null/undefined timing values.

How to reproduce:

  1. Run benchmarks using the nextjs-webpack workbench: DEPLOYMENT_URL=http://localhost:3000 npm run bench
  2. The benchmark code extracts timing headers from the GET response at packages/core/e2e/bench.bench.ts lines 79-85
  3. These headers are missing from nextjs-webpack but present in nextjs-turbopack

Result: The benchmark records timing data with all header values as null:

{
  "createdAt": null,
  "startedAt": null,
  "completedAt": null,
  "executionTimeMs": undefined
}

Expected: Response should include timing metadata headers like the nextjs-turbopack implementation does, allowing the benchmark code (lines 163-167 of bench.bench.ts) to calculate executionTimeMs = completedAt.getTime() - createdAt.getTime()

Fix implemented: Added the missing header-extraction and response-building code to match the nextjs-turbopack implementation, extracting run.createdAt, run.startedAt, and run.completedAt and including them in the response headers with ISO string formatting.


@pranaygp pranaygp marked this pull request as ready for review November 29, 2025 11:06
@pranaygp pranaygp force-pushed the pranaygp/11-28-add_benchmarking branch from 6224574 to 1af51b7 Compare November 29, 2025 11:08
Contributor

github-actions bot commented Nov 29, 2025

📊 Benchmark Comparison

Cross-matrix comparison of workflow performance across frameworks and backends.

workflow with no steps

| Backend | Framework | Workflow Time | Wall Time | Overhead | vs Fastest |
| --- | --- | --- | --- | --- | --- |
| 💻 Local | 🥇 Next.js (Turbopack) | 0.070s | 1.012s | 0.942s | 1.00x |
| 💻 Local | Express | 0.098s | 1.009s | 0.912s | 1.40x |
| 💻 Local | nitro-v3 | 0.119s | 1.013s | 0.894s | 1.70x |
| ▲ Vercel | Next.js (Turbopack) | 0.518s | 1.348s | 0.830s | 7.40x |
| ▲ Vercel | nitro-v3 | 0.560s | 1.601s | 1.041s | 8.00x |
| ▲ Vercel | Express | 0.569s | 1.520s | 0.951s | 8.12x |
| 🐘 Postgres | Express | 1.519s | 1.923s | 0.404s | 21.71x |
| 🐘 Postgres | Next.js (Turbopack) | 1.584s | 2.018s | 0.434s | 22.63x |
| 🐘 Postgres | nitro-v3 | 1.786s | 2.012s | 0.226s | 25.51x |

workflow with 1 step

| Backend | Framework | Workflow Time | Wall Time | Overhead | vs Fastest |
| --- | --- | --- | --- | --- | --- |
| 💻 Local | 🥇 Next.js (Turbopack) | 0.160s | 1.011s | 0.851s | 1.00x |
| 💻 Local | Express | 0.236s | 1.009s | 0.773s | 1.48x |
| 💻 Local | nitro-v3 | 0.321s | 1.009s | 0.688s | 2.01x |
| ▲ Vercel | nitro-v3 | 1.477s | 2.429s | 0.953s | 9.24x |
| ▲ Vercel | Express | 1.489s | 2.551s | 1.062s | 9.32x |
| ▲ Vercel | Next.js (Turbopack) | 1.563s | 2.514s | 0.950s | 9.78x |
| 🐘 Postgres | Next.js (Turbopack) | 3.661s | 4.020s | 0.359s | 22.91x |
| 🐘 Postgres | Express | 5.610s | 6.021s | 0.411s | 35.11x |
| 🐘 Postgres | nitro-v3 | 5.702s | 6.016s | 0.314s | 35.69x |

workflow with 10 sequential steps

| Backend | Framework | Workflow Time | Wall Time | Overhead | vs Fastest |
| --- | --- | --- | --- | --- | --- |
| 💻 Local | 🥇 Next.js (Turbopack) | 1.102s | 2.011s | 0.909s | 1.00x |
| 💻 Local | Express | 1.682s | 2.014s | 0.332s | 1.53x |
| 💻 Local | nitro-v3 | 2.247s | 3.013s | 0.766s | 2.04x |
| ▲ Vercel | Next.js (Turbopack) | 10.092s | 10.944s | 0.852s | 9.16x |
| ▲ Vercel | Express | 10.410s | 11.141s | 0.731s | 9.45x |
| ▲ Vercel | nitro-v3 | 10.747s | 11.930s | 1.184s | 9.76x |
| 🐘 Postgres | Next.js (Turbopack) | 21.718s | 22.047s | 0.329s | 19.72x |
| 🐘 Postgres | nitro-v3 | 41.440s | 41.954s | 0.515s | 37.62x |
| 🐘 Postgres | Express | 41.498s | 41.964s | 0.466s | 37.67x |

workflow with 10 parallel steps

| Backend | Framework | Workflow Time | Wall Time | Overhead | vs Fastest |
| --- | --- | --- | --- | --- | --- |
| 💻 Local | 🥇 Next.js (Turbopack) | 0.553s | 1.011s | 0.458s | 1.00x |
| 💻 Local | Express | 0.761s | 1.113s | 0.352s | 1.38x |
| 💻 Local | nitro-v3 | 0.970s | 1.117s | 0.147s | 1.75x |
| ▲ Vercel | nitro-v3 | 2.653s | 3.694s | 1.041s | 4.80x |
| ▲ Vercel | Express | 2.665s | 3.514s | 0.848s | 4.82x |
| ▲ Vercel | Next.js (Turbopack) | 2.680s | 3.477s | 0.797s | 4.85x |
| 🐘 Postgres | Next.js (Turbopack) | 4.732s | 5.018s | 0.286s | 8.56x |
| 🐘 Postgres | Express | 5.233s | 5.216s | -0.017s | 9.47x |
| 🐘 Postgres | nitro-v3 | 5.683s | 6.013s | 0.330s | 10.28x |

Summary: Fastest Framework by Backend

| Backend | Fastest Framework | Workflow Time |
| --- | --- | --- |
| 💻 Local | Next.js (Turbopack) | 0.471s (avg) |
| 🐘 Postgres | Next.js (Turbopack) | 7.923s (avg) |
| ▲ Vercel | Next.js (Turbopack) | 3.713s (avg) |

Summary: Fastest Backend by Framework

| Framework | Fastest Backend | Workflow Time |
| --- | --- | --- |
| Express | 💻 Local | 0.694s (avg) |
| Next.js (Turbopack) | 💻 Local | 0.471s (avg) |
| nitro-v3 | 💻 Local | 0.914s (avg) |
Column Definitions
  • Workflow Time: Runtime reported by workflow (completedAt - createdAt) - primary metric
  • Wall Time: Total testbench time (trigger workflow + poll for result)
  • Overhead: Testbench overhead (Wall Time - Workflow Time)
  • vs Fastest: How much slower compared to the fastest configuration for this benchmark
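The derived columns follow directly from the two measured times; a hypothetical recomputation (illustrative names, not the actual report generator):

```typescript
// Sketch of how the comparison columns are derived from raw timings.
// `Row` and `deriveColumns` are illustrative names for this example.
interface Row {
  workflowTimeS: number; // completedAt - createdAt, in seconds
  wallTimeS: number; // trigger + poll, in seconds
}

function deriveColumns(row: Row, fastestWorkflowTimeS: number) {
  return {
    // Overhead = Wall Time - Workflow Time (testbench cost)
    overheadS: +(row.wallTimeS - row.workflowTimeS).toFixed(3),
    // vs Fastest = this config's workflow time / fastest config's workflow time
    vsFastest: +(row.workflowTimeS / fastestWorkflowTimeS).toFixed(2),
  };
}
```

For example, the Local/Express row of the first table (0.098s workflow, 1.009s wall) against the fastest time of 0.070s yields roughly 0.911s overhead and a 1.40x ratio, matching the table up to rounding.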

Backends:

  • 💻 Local: In-memory filesystem backend
  • 🐘 Postgres: PostgreSQL database backend
  • ▲ Vercel: Vercel production backend
