diff --git a/packages/utils/docs/profiler.md b/packages/utils/docs/profiler.md
index e5249632c..daa8452c6 100644
--- a/packages/utils/docs/profiler.md
+++ b/packages/utils/docs/profiler.md
@@ -47,7 +47,6 @@ The `Profiler` class provides a clean, type-safe API for performance monitoring
     utils: { track: 'Utils', color: 'primary' },
     core: { track: 'Core', color: 'primary-light' },
   },
-  enabled: true,
 });
 ```

@@ -207,7 +206,6 @@ const profiler = new Profiler({
     utils: { track: 'Utils', color: 'primary' },
     core: { track: 'Core', color: 'primary-light' },
   },
-  enabled: true,
 });

 // Simple measurement
@@ -283,6 +281,31 @@ The profiler automatically subscribes to process events (`exit`, `SIGINT`, `SIGT

 The `close()` method is idempotent and safe to call from exit handlers. It unsubscribes from exit handlers, closes the WAL sink, and unsubscribes from the performance observer, ensuring all buffered performance data is written before process termination.

+### Profiler Lifecycle States
+
+The `NodejsProfiler` follows a state machine with three distinct states:
+
+**State Machine Flow**
+
+```
+idle ⇄ running
+  ↓      ↓
+  └──→ closed
+```
+
+- **idle**: Profiler is initialized but not actively collecting measurements. WAL sink is closed and performance observer is unsubscribed.
+- **running**: Profiler is actively collecting performance measurements. WAL sink is open and performance observer is subscribed.
+- **closed**: Profiler has been closed and all buffered data has been flushed to disk. Resources have been fully released. This state is irreversible.
+
+**State Transitions:**
+
+- `idle` → `running`: Occurs when `setEnabled(true)` is called. Enables profiling, opens the WAL sink, and subscribes to the performance observer.
+- `running` → `idle`: Occurs when `setEnabled(false)` is called. Disables profiling, unsubscribes from the performance observer, and closes the WAL sink (the sink is reopened on re-enable).
+- `running` → `closed`: Occurs when `close()` is called. Disables profiling, unsubscribes, closes the sink, finalizes shards, and unsubscribes exit handlers (irreversible).
+- `idle` → `closed`: Occurs when `close()` is called. Closes the sink if it was opened, finalizes shards, and unsubscribes exit handlers (irreversible).
+
+Once a transition to `closed` occurs, there is no way back to a previous state. This ensures data integrity and prevents resource leaks.
+
 ## Configuration

 ```ts
@@ -295,22 +318,86 @@ new NodejsProfiler(options: NodejsProfilerOptions<T>)
 new NodejsProfiler(options: NodejsProfilerOptions<T>)
 ```

-| Property                 | Type                         | Default    | Description                                                                      |
-| ------------------------ | ---------------------------- | ---------- | -------------------------------------------------------------------------------- |
-| `encodePerfEntry`        | `PerformanceEntryEncoder<T>` | _required_ | Function that encodes raw PerformanceEntry objects into domain-specific types    |
-| `captureBufferedEntries` | `boolean`                    | `true`     | Whether to capture performance entries that occurred before observation started  |
-| `flushThreshold`         | `number`                     | `20`       | Threshold for triggering queue flushes based on queue length                     |
-| `maxQueueSize`           | `number`                     | `10_000`   | Maximum number of items allowed in the queue before new entries are dropped      |
+| Property                 | Type                         | Default          | Description                                                                            |
+| ------------------------ | ---------------------------- | ---------------- | -------------------------------------------------------------------------------------- |
+| `format`                 | `ProfilerFormat`             | _required_       | WAL format configuration for sharded write-ahead logging, including `encodePerfEntry`  |
+| `measureName`            | `string`                     | _auto-generated_ | Optional folder name for sharding. If not provided, a new group ID will be generated   |
+| `outDir`                 | `string`                     | `'tmp/profiles'` | Output directory for WAL shards and final files                                        |
+| `format.encodePerfEntry` | `PerformanceEntryEncoder<T>` | _required_       | Function that encodes raw PerformanceEntry objects into domain-specific types          |
+| `captureBufferedEntries` | `boolean`                    | `true`           | Whether to capture performance entries that occurred before observation started        |
+| `flushThreshold`         | `number`                     | `20`             | Threshold for triggering queue flushes based on queue length                           |
+| `maxQueueSize`           | `number`                     | `10_000`         | Maximum number of items allowed in the queue before new entries are dropped            |
+
+### Environment Variables
+
+The `NodejsProfiler` can also be configured via environment variables, which supply values for the corresponding options when they are not explicitly provided:
+
+| Environment Variable       | Type     | Default          | Description                                                                                                                         |
+| -------------------------- | -------- | ---------------- | ----------------------------------------------------------------------------------------------------------------------------------- |
+| `CP_PROFILING`             | `string` | _unset_          | Enables or disables profiling globally. Set to `'true'` to enable, `'false'` or unset to disable.                                   |
+| `DEBUG`                    | `string` | _unset_          | Enables debug mode for profiler state transitions. When set to `'true'`, state transitions create performance marks for debugging.  |
+| `CP_PROFILER_OUT_DIR`      | `string` | `'tmp/profiles'` | Output directory for WAL shards and final files. Overrides the `outDir` option.                                                     |
+| `CP_PROFILER_MEASURE_NAME` | `string` | _auto-generated_ | Measure name used for sharding. Overrides the `measureName` option. If not provided, a new group ID will be generated.              |
+
+```bash
+# Enable profiling with custom output directory
+CP_PROFILING=true CP_PROFILER_OUT_DIR=/path/to/profiles npm run dev
+
+# Enable profiling with debug mode and custom measure name
+CP_PROFILING=true DEBUG=true CP_PROFILER_MEASURE_NAME=my-measure npm run dev
+```

 ## API Methods

 The NodeJSProfiler inherits all API methods from the base Profiler class and adds additional methods for queue management and WAL lifecycle control.

-| Method                               | Description                                                                      |
-| ------------------------------------ | -------------------------------------------------------------------------------- |
-| `getStats()`                         | Returns comprehensive queue statistics for monitoring and debugging.             |
-| `flush()`                            | Forces immediate writing of all queued performance entries to the WAL.           |
-| `setEnabled(enabled: boolean): void` | Controls profiling at runtime with automatic WAL/observer lifecycle management.  |
+| Method                               | Description                                                                               |
+| ------------------------------------ | ------------------------------------------------------------------------------------------ |
+| `stats`                              | Returns comprehensive queue statistics and profiling state for monitoring and debugging.  |
+| `state`                              | Returns the current profiler state (`'idle' \| 'running' \| 'closed'`).                   |
+| `close()`                            | Closes the profiler and releases resources. Idempotent, safe for exit handlers.           |
+| `flush()`                            | Forces immediate writing of all queued performance entries to the WAL.                    |
+| `setEnabled(enabled: boolean): void` | Controls profiling at runtime with automatic WAL/observer lifecycle management.           |
+
+### Profiler state
+
+```ts
+profiler.state: 'idle' | 'running' | 'closed'
+```
+
+Returns the current profiler state. Use this to check the profiler's lifecycle state without accessing the full stats object.
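+The lifecycle transitions described above can be driven at runtime. A minimal sketch (assuming a configured `profiler` instance with default options):
+
+```ts
+profiler.setEnabled(true); // idle → running: WAL sink opens, observer subscribes
+console.log(profiler.state); // 'running'
+
+profiler.setEnabled(false); // running → idle: observer unsubscribes, WAL sink closes
+console.log(profiler.state); // 'idle'
+
+profiler.close(); // idle → closed: irreversible, resources released
+console.log(profiler.state); // 'closed'
+```
+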
+ +```ts +// Check current state +if (profiler.state === 'running') { + console.log('Profiler is actively collecting measurements'); +} else if (profiler.state === 'idle') { + console.log('Profiler is initialized but not collecting'); +} else { + console.log('Profiler has been closed'); +} +``` + +### Closing the profiler + +```ts +profiler.close(): void +``` + +Closes profiler and releases resources. This method is idempotent and safe to call from exit handlers. When called, it transitions the profiler to the `closed` state, which is irreversible. All buffered data is flushed, shards are finalized, and exit handlers are unsubscribed. + +```ts +// Close profiler when done +profiler.close(); + +// Safe to call multiple times (idempotent) +profiler.close(); // No-op if already closed + +// Check if closed +if (profiler.state === 'closed') { + console.log('Profiler resources have been released'); +} +``` ### Runtime control with Write Ahead Log lifecycle management @@ -327,13 +414,23 @@ await performHeavyOperation(); profiler.setEnabled(true); // WAL reopens and observer resubscribes ``` -### Queue statistics +### Profiler statistics ```ts -profiler.getStats(): { - enabled: boolean; - observing: boolean; - walOpen: boolean; +profiler.stats: { + profilerState: 'idle' | 'running' | 'closed'; + debug: boolean; + sharderState: 'active' | 'finalized' | 'cleaned'; + shardCount: number; + groupId: string; + isCoordinator: boolean; + isFinalized: boolean; + isCleaned: boolean; + finalFilePath: string; + shardFileCount: number; + shardFiles: string[]; + shardOpen: boolean; + shardPath: string; isSubscribed: boolean; queued: number; dropped: number; @@ -345,16 +442,6 @@ profiler.getStats(): { } ``` -Returns comprehensive queue statistics for monitoring and debugging. Provides insight into the current state of the performance entry queue, useful for monitoring memory usage and processing throughput. - -```ts -const stats = profiler.getStats(); -console.log(`Enabled: ${stats.enabled}, WAL Open: ${stats.walOpen}, Observing: ${stats.observing}, Subscribed: ${stats.isSubscribed}, Queued: ${stats.queued}`); -if (stats.enabled && stats.walOpen && stats.observing && stats.isSubscribed && stats.queued > stats.flushThreshold) { - console.log('Queue nearing capacity, consider manual flush'); -} -``` - ### Manual flushing ```ts diff --git a/packages/utils/eslint.config.js b/packages/utils/eslint.config.js index ecb88a924..468f67b1c 100644 --- a/packages/utils/eslint.config.js +++ b/packages/utils/eslint.config.js @@ -13,7 +13,11 @@ export default tseslint.config( }, }, { - files: ['packages/utils/src/lib/**/wal*.ts'], + files: [ + 'packages/utils/src/lib/**/wal*.ts', + 'packages/utils/src/lib/**/wal*.test.ts', + 'packages/utils/src/lib/profiler/*.test.ts', + ], rules: { 'n/no-sync': 'off', }, diff --git a/packages/utils/mocks/README.md b/packages/utils/mocks/README.md new file mode 100644 index 000000000..da91fde91 --- /dev/null +++ b/packages/utils/mocks/README.md @@ -0,0 +1,29 @@ +# Mocks + +## multiprocess-profiling + +The `profiler-worker.mjs` script demonstrates multiprocess profiling by spawning N child processes that perform work and generate performance traces. + +### Expected Output + +**Console:** + +- JSON object containing profiler statistics (profiler state, shard info, queue stats, etc.) 
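+
+For illustration, the logged stats object might look like the following sketch (values are hypothetical; see `profiler.stats` in `packages/utils/docs/profiler.md` for the full shape):
+
+```json
+{
+  "profilerState": "closed",
+  "sharderState": "finalized",
+  "shardCount": 3,
+  "queued": 0,
+  "dropped": 0
+}
+```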
+ +**Files:** + +- A timestamped directory in `CP_PROFILER_OUT_DIR` (e.g., `20260131-210017-052/`) + - `trace....jsonl` - WAL format trace files (one per process) + - `trace..json` - Consolidated trace file in Chrome DevTools format + +### Usage + +```bash +CP_PROFILING=true DEBUG=true CP_PROFILER_OUT_DIR=/path/to/output npx tsx packages/utils/mocks/multiprocess-profiling/profiler-worker.mjs +``` + +**Example:** + +```bash + CP_PROFILING=true DEBUG=true CP_PROFILER_OUT_DIR=./tmp/int/utils npx tsx --tsconfig tsconfig.base.json packages/utils/mocks/multiprocess-profiling/profiler-worker.mjs 3 +``` diff --git a/packages/utils/mocks/fixtures/minimal-trace-async-events-user-timing-devtools-colors.json b/packages/utils/mocks/fixtures/minimal-trace-async-events-user-timing-devtools-colors.json new file mode 100644 index 000000000..6efadb1a8 --- /dev/null +++ b/packages/utils/mocks/fixtures/minimal-trace-async-events-user-timing-devtools-colors.json @@ -0,0 +1,104 @@ +{ + "traceEvents": [ + { + "cat": "disabled-by-default-devtools.timeline", + "name": "TracingStartedInBrowser", + "ph": "I", + "pid": 1, + "tid": 0, + "ts": 1, + "s": "t", + "args": { + "data": { + "frames": [ + { + "processId": 1, + "url": "file://has-to-be-a-valid-URL-pattern" + } + ] + } + } + }, + { + "args": { + "description": "Artificial RunTask event to mark end of the trace" + }, + "cat": "devtools.timeline", + "dur": 10, + "name": "RunTask", + "ph": "X", + "pid": 1, + "tid": 0, + "ts": 1 + }, + { + "args": { + "data": { + "detail": "{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"CustomT\",\"trackGroup\":\"CustomG\"}}" + } + }, + "cat": "blink.user_timing", + "name": "measure-1:start", + "id2": { "local": "0x2" }, + "s": "t", + "ph": "I", + "pid": 1, + "tid": 1, + "ts": 44 + }, + + { + "cat": "blink.user_timing", + "s": "t", + "ph": "b", + "name": "measure-1", + "pid": 1, + "tid": 1, + "ts": 45, + "id2": { "local": "0x3" }, + "args": { + "detail": "{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"CustomT\",\"trackGroup\":\"CustomG\"}}" + } + }, + { + "cat": "blink.user_timing", + "s": "t", + "ph": "e", + "name": "measure-1", + "pid": 1, + "tid": 1, + "ts": 65, + "id2": { "local": "0x3" }, + "args": { + "detail": "{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"CustomT\",\"trackGroup\":\"CustomG\"}}" + } + }, + { + "args": { + "data": { + "detail": "{\"devtools\":{\"dataType\":\"marker\",\"track\":\"CustomT\",\"trackGroup\":\"CustomG\"}}" + } + }, + "cat": "blink.user_timing", + "name": "measure-1:end", + "id2": { "local": "0x5" }, + "s": "t", + "ph": "I", + "pid": 1, + "tid": 1, + "ts": 66 + }, + { + "args": { + "description": "Artificial RunTask event to mark end of the trace" + }, + "cat": "devtools.timeline", + "dur": 10, + "name": "RunTask", + "ph": "X", + "pid": 1, + "tid": 0, + "ts": 165 + } + ] +} diff --git a/packages/utils/mocks/multiprocess-profiling/profiler-worker-child.mjs b/packages/utils/mocks/multiprocess-profiling/profiler-worker-child.mjs new file mode 100644 index 000000000..0125c132e --- /dev/null +++ b/packages/utils/mocks/multiprocess-profiling/profiler-worker-child.mjs @@ -0,0 +1,14 @@ +import { NodejsProfiler } from '../../src/lib/profiler/profiler-node.js'; +import { + createBufferedEvents, + getProfilerConfig, + performDummyWork, +} from './utils.js'; + +await createBufferedEvents(); + +const profiler = new NodejsProfiler(getProfilerConfig()); + +await performDummyWork(profiler); + +profiler.close(); diff --git a/packages/utils/mocks/multiprocess-profiling/profiler-worker.mjs 
b/packages/utils/mocks/multiprocess-profiling/profiler-worker.mjs new file mode 100644 index 000000000..2ab9f96ef --- /dev/null +++ b/packages/utils/mocks/multiprocess-profiling/profiler-worker.mjs @@ -0,0 +1,77 @@ +import { spawn } from 'node:child_process'; +import path from 'node:path'; +import { fileURLToPath } from 'node:url'; +import { NodejsProfiler } from '../../src/lib/profiler/profiler-node.js'; +import { createBufferedEvents, getProfilerConfig } from './utils.js'; + +const [numProcesses] = process.argv.slice(2); + +if (!numProcesses) { + console.error('Usage: node profiler-worker.mjs '); + // eslint-disable-next-line unicorn/no-process-exit,n/no-process-exit + process.exit(1); +} + +const numProcs = Number.parseInt(numProcesses, 10); +if (Number.isNaN(numProcs) || numProcs < 1) { + console.error('numProcesses must be a positive integer'); + // eslint-disable-next-line unicorn/no-process-exit,n/no-process-exit + process.exit(1); +} + +const workerScriptPath = path.join( + fileURLToPath(path.dirname(import.meta.url)), + './profiler-worker-child.mjs', +); + +let profiler; +try { + await createBufferedEvents(); + + profiler = new NodejsProfiler(getProfilerConfig()); + + await profiler.measureAsync('profiler-worker', async () => { + const processes = Array.from( + { length: numProcs }, + (_, i) => + new Promise((resolve, reject) => { + const child = spawn('npx', ['tsx', workerScriptPath], { + stdio: 'pipe', + shell: process.platform === 'win32', + }); + + child.on('close', code => { + if (code === 0) { + resolve(code); + } else { + reject(new Error(`Process ${i + 1} exited with code ${code}`)); + } + }); + + child.on('error', reject); + }), + ); + await Promise.all(processes); + }); + + profiler.close(); + // eslint-disable-next-line no-console + console.log(JSON.stringify(profiler.stats, null, 2)); +} catch (error) { + // Ensure profiler is closed and stats are output even on error + if (profiler && profiler.stats.profilerState !== 'closed') { + profiler.close(); + } + // Output stats if profiler was initialized, otherwise exit with error + if (profiler) { + // eslint-disable-next-line no-console + console.log(JSON.stringify(profiler.stats, null, 2)); + // Exit successfully since we've output the stats that the test needs + // eslint-disable-next-line unicorn/no-process-exit,n/no-process-exit + process.exit(0); + } else { + console.error('Failed to initialize profiler:', error); + // eslint-disable-next-line unicorn/no-process-exit,n/no-process-exit + process.exit(1); + } +} diff --git a/packages/utils/mocks/multiprocess-profiling/utils.ts b/packages/utils/mocks/multiprocess-profiling/utils.ts new file mode 100644 index 000000000..e4d5a593a --- /dev/null +++ b/packages/utils/mocks/multiprocess-profiling/utils.ts @@ -0,0 +1,109 @@ +import { + NodejsProfiler, + type NodejsProfilerOptions, +} from '../../src/lib/profiler/profiler-node.js'; +import { entryToTraceEvents } from '../../src/lib/profiler/trace-file-utils.js'; +import type { TraceEvent } from '../../src/lib/profiler/trace-file.type.js'; +import { traceEventWalFormat } from '../../src/lib/profiler/wal-json-trace.js'; +import { + asOptions, + trackEntryPayload, +} from '../../src/lib/user-timing-extensibility-api-utils.js'; +import type { + ActionTrackEntryPayload, + TrackMeta, +} from '../../src/lib/user-timing-extensibility-api.type.js'; + +export function getTrackConfig(): TrackMeta { + return { + track: `Track: ${process.pid}`, + trackGroup: 'Multiprocess', + }; +} + +/** + * Default profiler configuration for multiprocess 
profiling mocks
+ */
+export function getProfilerConfig(
+  options?: Partial<
+    NodejsProfilerOptions<TraceEvent>
+  >,
+): NodejsProfilerOptions<TraceEvent> {
+  return {
+    format: {
+      ...traceEventWalFormat(),
+      encodePerfEntry: entryToTraceEvents,
+    },
+    ...getTrackConfig(),
+    ...options,
+  };
+}
+
+/**
+ * Creates buffered performance marks and measures before profiler initialization
+ */
+export async function createBufferedEvents(): Promise<void> {
+  const bM1 = `buffered-mark-${process.pid}`;
+  performance.mark(
+    bM1,
+    asOptions(
+      trackEntryPayload({
+        ...getTrackConfig(),
+        color: 'tertiary',
+      }),
+    ),
+  );
+  const intervalDelay = Math.floor(Math.random() * 50) + 25;
+  await new Promise(resolve => setTimeout(resolve, intervalDelay));
+  performance.measure(`buffered-${process.pid}`, {
+    start: bM1,
+    ...asOptions(
+      trackEntryPayload({
+        ...getTrackConfig(),
+        color: 'tertiary',
+      }),
+    ),
+  });
+}
+
+/**
+ * Performs dummy work with random intervals and work packages
+ */
+export async function performDummyWork(
+  profiler: NodejsProfiler<TraceEvent>,
+): Promise<void> {
+  profiler.marker(`process-${process.pid}:process-start`, {
+    tooltipText: `Process ${process.pid} started`,
+  });
+
+  // Random number of intervals (1-3) - reduced from 2-5
+  const numIntervals = Math.floor(Math.random() * 3) + 1;
+
+  // eslint-disable-next-line functional/no-loop-statements
+  for (let interval = 0; interval < numIntervals; interval++) {
+    // Random interval delay (25-100ms)
+    const intervalDelay = Math.floor(Math.random() * 75) + 25;
+    await new Promise(resolve => setTimeout(resolve, intervalDelay));
+
+    // Random number of work packages per interval (1-3)
+    const numWorkPackages = Math.floor(Math.random() * 3) + 1;
+
+    // eslint-disable-next-line functional/no-loop-statements
+    for (let pkg = 0; pkg < numWorkPackages; pkg++) {
+      // Random work size (0-2,500,000 elements)
+      const workSize = Math.floor(Math.random() * 2_500_000);
+
+      profiler.measure(
+        `process-${process.pid}:interval-${interval}:work-${pkg}`,
+        () => {
+          const arr = Array.from({ length: workSize }, (_, i) => i);
+          return arr.reduce((sum, x) => sum + x * Math.random(), 0);
+        },
+      );
+    }
+  }
+
+  profiler.marker(`process-${process.pid}:process-end`, {
+    tooltipText: `Process ${process.pid} completed ${numIntervals} intervals`,
+  });
+}
diff --git a/packages/utils/mocks/omit-trace-json.ts b/packages/utils/mocks/omit-trace-json.ts
new file mode 100644
index 000000000..b6c236a71
--- /dev/null
+++ b/packages/utils/mocks/omit-trace-json.ts
@@ -0,0 +1,226 @@
+import * as fs from 'node:fs/promises';
+import path from 'node:path';
+import {
+  createTraceFile,
+  decodeEvent,
+  encodeEvent,
+  frameName,
+  frameTreeNodeId,
+} from '../src/lib/profiler/trace-file-utils.js';
+import type {
+  TraceEvent,
+  TraceEventContainer,
+  TraceMetadata,
+} from '../src/lib/profiler/trace-file.type';
+
+const BASE_TS = 1_700_000_005_000_000;
+const FIXED_TIME = '2026-01-28T14:29:27.995Z';
+
+/* ───────────── IO ───────────── */
+const read = (p: string) => fs.readFile(p, 'utf8').then(s => s.trim());
+const parseJsonl = (s: string) =>
+  s
+    .split('\n')
+    .filter(Boolean)
+    .map(l => JSON.parse(l));
+const parseDecodeJsonl = (s: string) => parseJsonl(s).map(decodeEvent);
+
+/* ───────────── Metadata ───────────── */
+const normMeta = (
+  m?: TraceMetadata | Record<string, unknown>,
+  keepGen = true,
+): TraceMetadata | undefined =>
+  m
+    ? ({
+        ...(keepGen
+          ? m
+          : Object.fromEntries(
+              Object.entries(m).filter(([k]) => k !== 'generatedAt'),
+            )),
+        startTime: FIXED_TIME,
+        ...(keepGen && { generatedAt: FIXED_TIME }),
+      } as TraceMetadata)
+    : undefined;
+
+/* ───────────── Detail ───────────── */
+const normalizeDetail = (d: unknown): unknown => {
+  const o =
+    typeof d === 'string'
+      ? JSON.parse(d)
+      : typeof d === 'object' && d
+        ? d
+        : null;
+  const props = o?.devtools?.properties;
+  if (!Array.isArray(props)) return d;
+
+  const isTransition = props.some(
+    e => Array.isArray(e) && e[0] === 'Transition',
+  );
+
+  return {
+    ...o,
+    devtools: {
+      ...o.devtools,
+      properties: props.map(e => {
+        if (!Array.isArray(e) || typeof e[0] !== 'string') return e;
+        const [k, v] = e;
+        if (isTransition) {
+          if (k.toLowerCase() === 'groupid') return [k, 'group-id'];
+          if (k.toLowerCase().includes('path'))
+            return [k, `path/to/${path.basename(String(v))}`];
+        }
+        if (k.includes('Path') || k.includes('Files'))
+          return [
+            k,
+            Array.isArray(v)
+              ? v.map(x => path.basename(String(x)))
+              : path.basename(String(v)),
+          ];
+        return e;
+      }),
+    },
+  };
+};
+
+/* ───────────── Context ───────────── */
+const uniq = <T>(v: (T | undefined)[]) => [
+  ...new Set(v.filter(Boolean) as T[]),
+];
+const ctx = (e: TraceEvent[], base = BASE_TS) => ({
+  pid: new Map<number, number>(
+    [...uniq(e.map(x => x.pid))]
+      .sort()
+      .map((v, i) => [v, 10_001 + i]),
+  ),
+  tid: new Map<number, number>(
+    [...uniq(e.map(x => x.tid))]
+      .sort()
+      .map((v, i) => [v, i + 1]),
+  ),
+  ts: new Map<number, number>(
+    [...uniq(e.map(x => x.ts))]
+      .sort()
+      .map((v, i) => [v, base + i * 100]),
+  ),
+  id: new Map<string, string>(
+    [...uniq(e.map(x => x.id2?.local))]
+      .sort()
+      .map((v, i) => [v, `0x${(i + 1).toString(16)}`]),
+  ),
+});
+
+/* ───────────── Event normalization ───────────── */
+const mapIf = <T, U>(v: T | undefined, m: Map<T, U>, k: string) =>
+  v != null && m.has(v) ? { [k]: m.get(v)! } : {};
+
+const normalizeEvent = (
+  e: TraceEvent,
+  c: ReturnType<typeof ctx>,
+): TraceEvent => {
+  const pid = c.pid.get(e.pid) ?? e.pid;
+  const tid = c.tid.get(e.tid) ?? e.tid;
+
+  const args = e.args && {
+    ...e.args,
+    ...(e.args.detail !== undefined && {
+      detail: normalizeDetail(e.args.detail),
+    }),
+    ...(e.args.data &&
+      typeof e.args.data === 'object' && {
+        data: {
+          ...(e.args.data as any),
+          ...(pid &&
+            tid &&
+            'frameTreeNodeId' in e.args.data && {
+              frameTreeNodeId: frameTreeNodeId(pid, tid),
+            }),
+          ...(Array.isArray((e.args.data as any).frames) &&
+            pid &&
+            tid && {
+              frames: (e.args.data as any).frames.map((f: any) => ({
+                ...f,
+                processId: pid,
+                frame: frameName(pid, tid),
+              })),
+            }),
+        },
+      }),
+  };
+
+  return {
+    ...e,
+    ...mapIf(e.pid, c.pid, 'pid'),
+    ...mapIf(e.tid, c.tid, 'tid'),
+    ...mapIf(e.ts, c.ts, 'ts'),
+    ...(e.id2?.local &&
+      c.id.has(e.id2.local) && {
+        id2: { ...e.id2, local: c.id.get(e.id2.local)! },
+      }),
+    ...(args && { args }),
+  };
+};
+
+/* ───────────── Public normalization ───────────── */
+export const normalizeTraceEvents = (
+  events: TraceEvent[],
+  { baseTimestampUs = BASE_TS } = {},
+) => {
+  if (events.length === 0) return [];
+  const decoded = events.map(decodeEvent);
+  const c = ctx(decoded, baseTimestampUs);
+  return decoded.map(e => normalizeEvent(e, c));
+};
+
+export const normalizeAndFormatEvents = (
+  input: TraceEvent[] | string,
+  opts?: { baseTimestampUs: number },
+) =>
+  typeof input === 'string'
+    ? input.trim()
+      ? normalizeTraceEvents(parseJsonl(input).map(decodeEvent), opts)
+          .map(encodeEvent)
+          .map(o => JSON.stringify(o))
+          .join('\n') + (input.endsWith('\n') ? '\n' : '')
+      : input
+    : normalizeTraceEvents(input, opts);
+
+/* ───────────── Loaders ───────────── */
+export const loadAndOmitTraceJsonl = (p: `${string}.jsonl`, o?: any) =>
+  read(p).then(s => normalizeAndFormatEvents(parseDecodeJsonl(s), o));
+
+export const loadTraceJsonlForSnapshot = loadAndOmitTraceJsonl;
+
+export const loadAndOmitTraceJson = async (
+  p: string,
+  o?: { baseTimestampUs: number },
+): Promise<TraceEventContainer> => {
+  const j = JSON.parse(await read(p));
+  if (!j?.traceEvents) return { traceEvents: [] };
+  const r = {
+    traceEvents: normalizeAndFormatEvents(j.traceEvents.map(decodeEvent), o),
+    ...(j.displayTimeUnit && { displayTimeUnit: j.displayTimeUnit }),
+    ...(j.metadata && { metadata: normMeta(j.metadata) }),
+  };
+  // Ensure the result is JSON-serializable (throws on circular structures)
+  JSON.stringify(r);
+  return r;
+};
+
+export const loadNormalizedTraceJson = async (
+  p: `${string}.json`,
+): Promise<TraceEventContainer> => {
+  const j = JSON.parse(await read(p));
+  const r = createTraceFile({
+    traceEvents: normalizeTraceEvents(j.traceEvents?.map(decodeEvent) ?? []),
+    metadata: normMeta(j.metadata, false),
+    startTime: j.metadata?.startTime,
+  });
+  const { displayTimeUnit, ...rest } = r;
+  return rest;
+};
+
+export const loadNormalizedTraceJsonl = async (
+  p: `${string}.jsonl`,
+): Promise<TraceEventContainer> =>
+  createTraceFile({
+    traceEvents: normalizeTraceEvents(parseDecodeJsonl(await read(p))),
+  });
diff --git a/packages/utils/mocks/omit-trace-json.unit.test.ts b/packages/utils/mocks/omit-trace-json.unit.test.ts
new file mode 100644
index 000000000..6b40b3c32
--- /dev/null
+++ b/packages/utils/mocks/omit-trace-json.unit.test.ts
@@ -0,0 +1,405 @@
+import { vol } from 'memfs';
+import { expect } from 'vitest';
+import { MEMFS_VOLUME } from '@code-pushup/test-utils';
+import type { TraceEvent } from '../src/lib/profiler/trace-file.type';
+import {
+  loadAndOmitTraceJson,
+  loadAndOmitTraceJsonl,
+  normalizeAndFormatEvents,
+} from './omit-trace-json.js';
+
+describe('normalizeAndFormatEvents', () => {
+  it('should return empty string unchanged', () => {
+    expect(normalizeAndFormatEvents('')).toBe('');
+  });
+
+  it('should return whitespace-only string unchanged', () => {
+    expect(normalizeAndFormatEvents(' \n\t ')).toBe(' \n\t ');
+  });
+
+  it('should return empty JSONL unchanged', () => {
+    expect(normalizeAndFormatEvents('\n\n')).toBe('\n\n');
+  });
+
+  it('should normalize single event with all fields', () => {
+    expect(
+      normalizeAndFormatEvents(
+        '{"pid":12345,"tid":999,"ts":1234567890,"id2":{"local":"0xabc123"},"name":"test"}\n',
+      ),
+    ).toBe(
+      '{"pid":10001,"tid":1,"ts":1700000005000000,"id2":{"local":"0x1"},"name":"test"}\n',
+    );
+  });
+
+  it('should normalize ts field with custom baseTimestampUs', () => {
+    const customBase = 2_000_000_000_000_000;
+    expect(
+      normalizeAndFormatEvents('{"ts":1234567890}\n', {
+        baseTimestampUs: customBase,
+      }),
+    ).toBe('{"ts":2000000000000000}\n');
+  });
+
+  it('should preserve event order when timestamps are out of order', () => {
+    const input =
+      '{"ts":300,"name":"third"}\n{"ts":100,"name":"first"}\n{"ts":200,"name":"second"}\n';
+    expect(normalizeAndFormatEvents(input)).toBe(
+      '{"ts":1700000005000200,"name":"third"}\n{"ts":1700000005000000,"name":"first"}\n{"ts":1700000005000100,"name":"second"}\n',
+    );
+  });
+
+  it('should preserve event order when PIDs are out of order', () => {
+    const input =
+      '{"pid":300,"name":"third"}\n{"pid":100,"name":"first"}\n{"pid":200,"name":"second"}\n';
+    expect(normalizeAndFormatEvents(input)).toBe(
'{"pid":10003,"name":"third"}\n{"pid":10001,"name":"first"}\n{"pid":10002,"name":"second"}\n', + ); + }); + + it('should handle decoding of instantEvents with args.data.detail', () => { + const rawInstantEvent: TraceEvent = { + cat: 'blink.user_timing', + ph: 'i', + name: 'plugin-eslint:run-eslint:start', + pid: 8057, + tid: 0, + ts: 1_769_814_970_883_535, + args: { + data: { + detail: + '{"devtools":{"dataType":"track-entry","track":"External","trackGroup":"<✓> Code PushUp","color":"secondary"}}', + }, + }, + }; + + expect(normalizeAndFormatEvents([rawInstantEvent])).toStrictEqual([ + { + cat: 'blink.user_timing', + ph: 'i', + name: 'plugin-eslint:run-eslint:start', + pid: 10_001, + tid: 0, + ts: 1_700_000_005_000_000, + args: { + data: { + detail: { + devtools: { + dataType: 'track-entry', + track: 'External', + trackGroup: '<✓> Code PushUp', + color: 'secondary', + }, + }, + }, + }, + }, + ]); + }); + + it('should handle decoding of spanEvents with args.detail', () => { + const rawSpanEvent = { + cat: 'blink.user_timing', + s: 't', + ph: 'b' as const, + name: 'plugin-eslint:run-eslint', + pid: 8057, + tid: 0, + ts: 1_769_814_970_883_536, + id2: { local: '0x3' }, + args: { + detail: + '{"devtools":{"dataType":"track-entry","track":"External","trackGroup":"<✓> Code PushUp","color":"secondary"}}', + }, + } as TraceEvent; + + expect(normalizeAndFormatEvents([rawSpanEvent])).toStrictEqual([ + { + cat: 'blink.user_timing', + s: 't', + ph: 'b', + name: 'plugin-eslint:run-eslint', + pid: 10_001, + tid: 0, + ts: 1_700_000_005_000_000, + id2: { local: '0x1' }, + args: { + detail: { + devtools: { + dataType: 'track-entry', + track: 'External', + trackGroup: '<✓> Code PushUp', + color: 'secondary', + }, + }, + }, + }, + ]); + }); + + it('should handle events with frame normalization', () => { + const rawEvent = { + cat: 'devtools.timeline', + s: 't', + ph: 'i' as const, + name: 'TracingStartedInBrowser', + pid: 8057, + tid: 0, + ts: 1_769_814_970_882_268, + args: { + data: { + frameTreeNodeId: 805_700, + frames: [ + { + frame: 'FRAME0P8057T0', + isInPrimaryMainFrame: true, + processId: 8057, + url: 'trace.json', + }, + ], + }, + }, + } as TraceEvent; + + expect(normalizeAndFormatEvents([rawEvent])).toStrictEqual([ + { + cat: 'devtools.timeline', + s: 't', + ph: 'i', + name: 'TracingStartedInBrowser', + pid: 10_001, + tid: 0, + ts: 1_700_000_005_000_000, + args: { + data: { + frameTreeNodeId: 805_700, + frames: [ + { + frame: 'FRAME0P8057T0', + isInPrimaryMainFrame: true, + processId: 8057, + url: 'trace.json', + }, + ], + }, + }, + }, + ]); + }); + + it('should handle multiple events with different pid/tid/ts/id2', () => { + const events = [ + { + cat: 'test', + ph: 'i' as const, + pid: 100, + tid: 5, + ts: 100, + name: 'first', + }, + { + cat: 'test', + ph: 'b' as const, + pid: 200, + tid: 3, + ts: 300, + name: 'second', + id2: { local: '0xabc' }, + }, + { + cat: 'test', + ph: 'b' as const, + pid: 150, + tid: 7, + ts: 200, + name: 'third', + id2: { local: '0xdef' }, + }, + ] as TraceEvent[]; + + expect(normalizeAndFormatEvents(events)).toStrictEqual([ + { + cat: 'test', + ph: 'i', + pid: 10_001, + tid: 2, + ts: 1_700_000_005_000_000, + name: 'first', + }, // pid 100->10001, tid 5->2 (sorted: 3->1, 5->2, 7->3) + { + cat: 'test', + ph: 'b', + pid: 10_003, + tid: 1, + ts: 1_700_000_005_000_200, + name: 'second', + id2: { local: '0x1' }, + }, // pid 200->10003, tid 3->1 + { + cat: 'test', + ph: 'b', + pid: 10_002, + tid: 3, + ts: 1_700_000_005_000_100, + name: 'third', + id2: { local: '0x2' }, + }, // 
pid 150->10002, tid 7->3 + ]); + }); + + it('should handle empty array', () => { + expect(normalizeAndFormatEvents([])).toStrictEqual([]); + }); + + it('should handle events with both args.detail and args.data.detail', () => { + const rawEvent: TraceEvent = { + cat: 'blink.user_timing', + ph: 'i', + name: 'test', + pid: 8057, + tid: 0, + ts: 1_769_814_970_883_535, + args: { + detail: '{"type":"mark"}', + data: { detail: '{"type":"span"}' }, + }, + }; + + expect(normalizeAndFormatEvents([rawEvent])).toStrictEqual([ + { + cat: 'blink.user_timing', + ph: 'i', + name: 'test', + pid: 10_001, + tid: 0, + ts: 1_700_000_005_000_000, + args: { + detail: { type: 'mark' }, + data: { detail: { type: 'span' } }, + }, + }, + ]); + }); +}); + +describe('loadAndOmitTraceJsonl', () => { + it('should load and normalize JSONL file', async () => { + vol.fromJSON( + { + 'trace.jsonl': + '{"pid":12345,"tid":999,"ts":1234567890,"name":"test"}\n{"pid":54321,"tid":888,"ts":9876543210,"name":"test2"}\n', + }, + MEMFS_VOLUME, + ); + + await expect(loadAndOmitTraceJsonl('trace.jsonl')).resolves.toStrictEqual([ + { pid: 10_001, tid: 2, ts: 1_700_000_005_000_000, name: 'test' }, // tid 999 maps to 2 (sorted: 888->1, 999->2) + { pid: 10_002, tid: 1, ts: 1_700_000_005_000_100, name: 'test2' }, // tid 888 maps to 1 + ]); + }); + + it('should decode args.detail and args.data.detail from JSONL', async () => { + vol.fromJSON( + { + 'trace.jsonl': + '{"pid":8057,"tid":0,"ts":1769814970883535,"args":{"data":{"detail":"{\\"devtools\\":{\\"dataType\\":\\"track-entry\\"}}"}}}\n{"pid":8057,"tid":0,"ts":1769814970883536,"args":{"detail":"{\\"devtools\\":{\\"dataType\\":\\"track-entry\\"}}"}}\n', + }, + MEMFS_VOLUME, + ); + + await expect(loadAndOmitTraceJsonl('trace.jsonl')).resolves.toStrictEqual([ + { + pid: 10_001, + tid: 0, + ts: 1_700_000_005_000_000, + args: { data: { detail: { devtools: { dataType: 'track-entry' } } } }, + }, + { + pid: 10_001, + tid: 0, + ts: 1_700_000_005_000_100, + args: { detail: { devtools: { dataType: 'track-entry' } } }, + }, + ]); + }); + + it('should use custom baseTimestampUs', async () => { + vol.fromJSON( + { + 'trace.jsonl': '{"ts":1234567890}\n', + }, + MEMFS_VOLUME, + ); + + await expect( + loadAndOmitTraceJsonl('trace.jsonl', { + baseTimestampUs: 2_000_000_000_000_000, + }), + ).resolves.toStrictEqual([{ ts: 2_000_000_000_000_000 }]); + }); +}); + +describe('loadAndOmitTraceJson', () => { + it('should load and normalize single trace container', async () => { + vol.fromJSON( + { + 'trace.json': JSON.stringify({ + traceEvents: [ + { pid: 8057, tid: 0, ts: 1_769_814_970_882_268, name: 'test' }, + ], + }), + }, + MEMFS_VOLUME, + ); + + await expect(loadAndOmitTraceJson('trace.json')).resolves.toStrictEqual({ + traceEvents: [ + { pid: 10_001, tid: 0, ts: 1_700_000_005_000_000, name: 'test' }, + ], + }); + }); + + it('should normalize metadata timestamps', async () => { + vol.fromJSON( + { + 'trace.json': JSON.stringify({ + metadata: { + generatedAt: '2025-01-01T00:00:00.000Z', + startTime: '2025-01-01T00:00:00.000Z', + other: 'value', + }, + traceEvents: [], + }), + }, + MEMFS_VOLUME, + ); + + const result = await loadAndOmitTraceJson('trace.json'); + expect(result).toStrictEqual({ + traceEvents: [], + metadata: { + generatedAt: '2026-01-28T14:29:27.995Z', + startTime: '2026-01-28T14:29:27.995Z', + other: 'value', + }, + }); + }); + + it('should use custom baseTimestampUs', async () => { + vol.fromJSON( + { + 'trace.json': JSON.stringify({ + traceEvents: [{ ts: 1_234_567_890 }], + }), + }, + 
MEMFS_VOLUME,
    );

    await expect(
      loadAndOmitTraceJson('trace.json', {
        baseTimestampUs: 2_000_000_000_000_000,
      }),
    ).resolves.toStrictEqual({
      traceEvents: [{ ts: 2_000_000_000_000_000 }],
    });
  });
});
diff --git a/packages/utils/src/lib/create-runner-files.ts b/packages/utils/src/lib/create-runner-files.ts
index 5cb402580..8a8495555 100644
--- a/packages/utils/src/lib/create-runner-files.ts
+++ b/packages/utils/src/lib/create-runner-files.ts
@@ -1,8 +1,8 @@
 import { writeFile } from 'node:fs/promises';
 import path from 'node:path';
-import { threadId } from 'node:worker_threads';
 import type { RunnerFilesPaths } from '@code-pushup/models';
 import { ensureDirectoryExists, pluginWorkDir } from './file-system.js';
+import { getUniqueProcessThreadId } from './process-id.js';

 /**
  * Function to create timestamp nested plugin runner files for config and output.
@@ -14,9 +14,7 @@ export async function createRunnerFiles(
   pluginSlug: string,
   configJSON: string,
 ): Promise<RunnerFilesPaths> {
-  // Use timestamp + process ID + threadId
-  // This prevents race conditions when running the same plugin for multiple projects in parallel
-  const uniqueId = `${(performance.timeOrigin + performance.now()) * 10}-${process.pid}-${threadId}`;
+  const uniqueId = getUniqueProcessThreadId();
   const runnerWorkDir = path.join(pluginWorkDir(pluginSlug), uniqueId);
   const runnerConfigPath = path.join(runnerWorkDir, 'plugin-config.json');
   const runnerOutputPath = path.join(runnerWorkDir, 'runner-output.json');
diff --git a/packages/utils/src/lib/errors.ts b/packages/utils/src/lib/errors.ts
index 3ce467bfd..c30a05a54 100644
--- a/packages/utils/src/lib/errors.ts
+++ b/packages/utils/src/lib/errors.ts
@@ -30,3 +30,20 @@ export function stringifyError(
   }
   return JSON.stringify(error);
 }
+
+/**
+ * Extends an error with a new message and keeps the original error as its cause.
+ * @param error - The error to extend
+ * @param message - The new message to add to the error
+ * @returns A new error with the extended message and the original as cause
+ */
+export function extendError(
+  error: unknown,
+  message: string,
+  { appendMessage = false } = {},
+) {
+  const errorMessage = appendMessage
+    ? 
`${message}\n${stringifyError(error)}` + : message; + return new Error(errorMessage, { cause: error }); +} diff --git a/packages/utils/src/lib/errors.unit.test.ts b/packages/utils/src/lib/errors.unit.test.ts index 6424819ae..ccb84d2c9 100644 --- a/packages/utils/src/lib/errors.unit.test.ts +++ b/packages/utils/src/lib/errors.unit.test.ts @@ -1,7 +1,7 @@ import ansis from 'ansis'; import { z } from 'zod'; import { SchemaValidationError } from '@code-pushup/models'; -import { stringifyError } from './errors.js'; +import { extendError, stringifyError } from './errors.js'; describe('stringifyError', () => { it('should use only message from plain Error instance', () => { @@ -113,3 +113,25 @@ describe('stringifyError', () => { ).toBe(`SchemaValidationError: Invalid ${ansis.bold('User')} […]`); }); }); + +describe('extendError', () => { + it('adds message, appends original error, and keeps cause', () => { + const original = new Error('boom'); + + const extended = extendError(original, 'wrap failed', { + appendMessage: true, + }); + + expect(extended.message).toBe('wrap failed\nboom'); + expect(extended.cause).toBe(original); + }); + + it('uses only the provided message by default', () => { + const original = new Error('boom'); + + const extended = extendError(original, 'wrap failed'); + + expect(extended.message).toBe('wrap failed'); + expect(extended.cause).toBe(original); + }); +}); diff --git a/packages/utils/src/lib/execute-process.int.test.ts b/packages/utils/src/lib/execute-process.int.test.ts index 9440116ce..a7242beaa 100644 --- a/packages/utils/src/lib/execute-process.int.test.ts +++ b/packages/utils/src/lib/execute-process.int.test.ts @@ -129,7 +129,7 @@ process:complete throwError: true, }), ), - ).rejects.toThrow('Process failed with exit code 1'); + ).rejects.toThrowError('Process failed with exit code 1'); expect(logger.debug).toHaveBeenCalledWith( expect.stringMatching(/process:start.*Error: dummy-error/s), { force: true }, diff --git a/packages/utils/src/lib/exit-process.int.test.ts b/packages/utils/src/lib/exit-process.int.test.ts index d915f6317..2f5975dbd 100644 --- a/packages/utils/src/lib/exit-process.int.test.ts +++ b/packages/utils/src/lib/exit-process.int.test.ts @@ -25,7 +25,7 @@ describe('subscribeProcessExit', () => { }); it('should install event listeners for all expected events', () => { - expect(() => subscribeProcessExit({ onError, onExit })).not.toThrow(); + expect(() => subscribeProcessExit({ onError, onExit })).not.toThrowError(); expect(processOnSpy).toHaveBeenCalledWith( 'uncaughtException', @@ -42,36 +42,33 @@ describe('subscribeProcessExit', () => { }); it('should call onError with error and kind for uncaughtException', () => { - expect(() => subscribeProcessExit({ onError })).not.toThrow(); + expect(() => subscribeProcessExit({ onError })).not.toThrowError(); const testError = new Error('Test uncaught exception'); (process as any).emit('uncaughtException', testError); - expect(onError).toHaveBeenCalledWith(testError, 'uncaughtException'); - expect(onError).toHaveBeenCalledOnce(); + expect(onError).toHaveBeenCalledExactlyOnceWith(testError, 'uncaughtException'); expect(onExit).not.toHaveBeenCalled(); }); it('should call onError with reason and kind for unhandledRejection', () => { - expect(() => subscribeProcessExit({ onError })).not.toThrow(); + expect(() => subscribeProcessExit({ onError })).not.toThrowError(); const testReason = 'Test unhandled rejection'; (process as any).emit('unhandledRejection', testReason); - 
expect(onError).toHaveBeenCalledWith(testReason, 'unhandledRejection'); - expect(onError).toHaveBeenCalledOnce(); + expect(onError).toHaveBeenCalledExactlyOnceWith(testReason, 'unhandledRejection'); expect(onExit).not.toHaveBeenCalled(); }); it('should call onExit and exit with code 0 for SIGINT', () => { - expect(() => subscribeProcessExit({ onExit })).not.toThrow(); + expect(() => subscribeProcessExit({ onExit })).not.toThrowError(); (process as any).emit('SIGINT'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGINT, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGINT, { kind: 'signal', signal: 'SIGINT', }); @@ -79,12 +76,11 @@ describe('subscribeProcessExit', () => { }); it('should call onExit and exit with code 0 for SIGTERM', () => { - expect(() => subscribeProcessExit({ onExit })).not.toThrow(); + expect(() => subscribeProcessExit({ onExit })).not.toThrowError(); (process as any).emit('SIGTERM'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGTERM, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGTERM, { kind: 'signal', signal: 'SIGTERM', }); @@ -92,12 +88,11 @@ describe('subscribeProcessExit', () => { }); it('should call onExit and exit with code 0 for SIGQUIT', () => { - expect(() => subscribeProcessExit({ onExit })).not.toThrow(); + expect(() => subscribeProcessExit({ onExit })).not.toThrowError(); (process as any).emit('SIGQUIT'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGQUIT, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGQUIT, { kind: 'signal', signal: 'SIGQUIT', }); @@ -105,23 +100,21 @@ describe('subscribeProcessExit', () => { }); it('should call onExit for successful process termination with exit code 0', () => { - expect(() => subscribeProcessExit({ onExit })).not.toThrow(); + expect(() => subscribeProcessExit({ onExit })).not.toThrowError(); (process as any).emit('exit', 0); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(0, { kind: 'exit' }); + expect(onExit).toHaveBeenCalledExactlyOnceWith(0, { kind: 'exit' }); expect(onError).not.toHaveBeenCalled(); expect(processExitSpy).not.toHaveBeenCalled(); }); it('should call onExit for failed process termination with exit code 1', () => { - expect(() => subscribeProcessExit({ onExit })).not.toThrow(); + expect(() => subscribeProcessExit({ onExit })).not.toThrowError(); (process as any).emit('exit', 1); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(1, { kind: 'exit' }); + expect(onExit).toHaveBeenCalledExactlyOnceWith(1, { kind: 'exit' }); expect(onError).not.toHaveBeenCalled(); expect(processExitSpy).not.toHaveBeenCalled(); }); diff --git a/packages/utils/src/lib/exit-process.unit.test.ts b/packages/utils/src/lib/exit-process.unit.test.ts index 3226e650c..087736677 100644 --- a/packages/utils/src/lib/exit-process.unit.test.ts +++ b/packages/utils/src/lib/exit-process.unit.test.ts @@ -26,7 +26,7 @@ describe('subscribeProcessExit', () => { }); it('should install event listeners for all expected events', () => { - expect(() => subscribeProcessExit({ onError, onExit })).not.toThrow(); + expect(() => subscribeProcessExit({ onError, onExit })).not.toThrowError(); expect(processOnSpy).toHaveBeenCalledWith( 'uncaughtException', @@ -43,38 +43,35 @@ describe('subscribeProcessExit', () => { }); it('should call onError with error and 
kind for uncaughtException', () => { - expect(() => subscribeProcessExit({ onError })).not.toThrow(); + expect(() => subscribeProcessExit({ onError })).not.toThrowError(); const testError = new Error('Test uncaught exception'); (process as any).emit('uncaughtException', testError); - expect(onError).toHaveBeenCalledWith(testError, 'uncaughtException'); - expect(onError).toHaveBeenCalledOnce(); + expect(onError).toHaveBeenCalledExactlyOnceWith(testError, 'uncaughtException'); expect(onExit).not.toHaveBeenCalled(); }); it('should call onError with reason and kind for unhandledRejection', () => { - expect(() => subscribeProcessExit({ onError })).not.toThrow(); + expect(() => subscribeProcessExit({ onError })).not.toThrowError(); const testReason = 'Test unhandled rejection'; (process as any).emit('unhandledRejection', testReason); - expect(onError).toHaveBeenCalledWith(testReason, 'unhandledRejection'); - expect(onError).toHaveBeenCalledOnce(); + expect(onError).toHaveBeenCalledExactlyOnceWith(testReason, 'unhandledRejection'); expect(onExit).not.toHaveBeenCalled(); }); it('should call onExit with correct code and reason for SIGINT', () => { expect(() => subscribeProcessExit({ onExit, exitOnSignal: true }), - ).not.toThrow(); + ).not.toThrowError(); (process as any).emit('SIGINT'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGINT, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGINT, { kind: 'signal', signal: 'SIGINT', }); @@ -85,12 +82,11 @@ describe('subscribeProcessExit', () => { it('should call onExit with correct code and reason for SIGTERM', () => { expect(() => subscribeProcessExit({ onExit, exitOnSignal: true }), - ).not.toThrow(); + ).not.toThrowError(); (process as any).emit('SIGTERM'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGTERM, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGTERM, { kind: 'signal', signal: 'SIGTERM', }); @@ -101,12 +97,11 @@ describe('subscribeProcessExit', () => { it('should call onExit with correct code and reason for SIGQUIT', () => { expect(() => subscribeProcessExit({ onExit, exitOnSignal: true }), - ).not.toThrow(); + ).not.toThrowError(); (process as any).emit('SIGQUIT'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGQUIT, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGQUIT, { kind: 'signal', signal: 'SIGQUIT', }); @@ -117,12 +112,11 @@ describe('subscribeProcessExit', () => { it('should not exit process when exitOnSignal is false', () => { expect(() => subscribeProcessExit({ onExit, exitOnSignal: false }), - ).not.toThrow(); + ).not.toThrowError(); (process as any).emit('SIGINT'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGINT, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGINT, { kind: 'signal', signal: 'SIGINT', }); @@ -131,12 +125,11 @@ describe('subscribeProcessExit', () => { }); it('should not exit process when exitOnSignal is not set', () => { - expect(() => subscribeProcessExit({ onExit })).not.toThrow(); + expect(() => subscribeProcessExit({ onExit })).not.toThrowError(); (process as any).emit('SIGTERM'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGTERM, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGTERM, { kind: 'signal', 
signal: 'SIGTERM', }); @@ -145,13 +138,12 @@ describe('subscribeProcessExit', () => { }); it('should call onExit with exit code and reason for normal exit', () => { - expect(() => subscribeProcessExit({ onExit })).not.toThrow(); + expect(() => subscribeProcessExit({ onExit })).not.toThrowError(); const exitCode = 42; (process as any).emit('exit', exitCode); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(exitCode, { kind: 'exit' }); + expect(onExit).toHaveBeenCalledExactlyOnceWith(exitCode, { kind: 'exit' }); expect(onError).not.toHaveBeenCalled(); expect(processExitSpy).not.toHaveBeenCalled(); }); @@ -159,19 +151,17 @@ describe('subscribeProcessExit', () => { it('should call onExit with fatal reason when exitOnFatal is true', () => { expect(() => subscribeProcessExit({ onError, onExit, exitOnFatal: true }), - ).not.toThrow(); + ).not.toThrowError(); const testError = new Error('Test uncaught exception'); (process as any).emit('uncaughtException', testError); - expect(onError).toHaveBeenCalledWith(testError, 'uncaughtException'); - expect(onError).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(1, { + expect(onError).toHaveBeenCalledExactlyOnceWith(testError, 'uncaughtException'); + expect(onExit).toHaveBeenCalledExactlyOnceWith(1, { kind: 'fatal', fatal: 'uncaughtException', }); - expect(onExit).toHaveBeenCalledOnce(); }); it('should use custom fatalExitCode when exitOnFatal is true', () => { @@ -182,37 +172,33 @@ describe('subscribeProcessExit', () => { exitOnFatal: true, fatalExitCode: 42, }), - ).not.toThrow(); + ).not.toThrowError(); const testError = new Error('Test uncaught exception'); (process as any).emit('uncaughtException', testError); - expect(onError).toHaveBeenCalledWith(testError, 'uncaughtException'); - expect(onError).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(42, { + expect(onError).toHaveBeenCalledExactlyOnceWith(testError, 'uncaughtException'); + expect(onExit).toHaveBeenCalledExactlyOnceWith(42, { kind: 'fatal', fatal: 'uncaughtException', }); - expect(onExit).toHaveBeenCalledOnce(); }); it('should call onExit with fatal reason for unhandledRejection when exitOnFatal is true', () => { expect(() => subscribeProcessExit({ onError, onExit, exitOnFatal: true }), - ).not.toThrow(); + ).not.toThrowError(); const testReason = 'Test unhandled rejection'; (process as any).emit('unhandledRejection', testReason); - expect(onError).toHaveBeenCalledWith(testReason, 'unhandledRejection'); - expect(onError).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(1, { + expect(onError).toHaveBeenCalledExactlyOnceWith(testReason, 'unhandledRejection'); + expect(onExit).toHaveBeenCalledExactlyOnceWith(1, { kind: 'fatal', fatal: 'unhandledRejection', }); - expect(onExit).toHaveBeenCalledOnce(); }); it('should have correct SIGINT exit code on Windows', () => { @@ -244,11 +230,10 @@ describe('subscribeProcessExit', () => { it('should call onExit only once even when close is called multiple times', () => { expect(() => subscribeProcessExit({ onExit, exitOnSignal: true }), - ).not.toThrow(); + ).not.toThrowError(); (process as any).emit('SIGINT'); - expect(onExit).toHaveBeenCalledOnce(); - expect(onExit).toHaveBeenCalledWith(SIGNAL_EXIT_CODES().SIGINT, { + expect(onExit).toHaveBeenCalledExactlyOnceWith(SIGNAL_EXIT_CODES().SIGINT, { kind: 'signal', signal: 'SIGINT', }); diff --git a/packages/utils/src/lib/file-system.int.test.ts b/packages/utils/src/lib/file-system.int.test.ts index 77d16eeff..0f50942b5 100644 --- 
a/packages/utils/src/lib/file-system.int.test.ts +++ b/packages/utils/src/lib/file-system.int.test.ts @@ -47,11 +47,11 @@ describe('importModule', () => { it('should throw if the file does not exist', async () => { await expect( importModule({ filepath: 'path/to/non-existent-export.mjs' }), - ).rejects.toThrow("File 'path/to/non-existent-export.mjs' does not exist"); + ).rejects.toThrowError("File 'path/to/non-existent-export.mjs' does not exist"); }); it('should throw if path is a directory', async () => { - await expect(importModule({ filepath: mockDir })).rejects.toThrow( + await expect(importModule({ filepath: mockDir })).rejects.toThrowError( `Expected '${mockDir}' to be a file`, ); }); @@ -59,7 +59,7 @@ describe('importModule', () => { it('should throw if file is not valid JS', async () => { await expect( importModule({ filepath: path.join(mockDir, 'invalid-js-file.json') }), - ).rejects.toThrow( + ).rejects.toThrowError( `${path.join(mockDir, 'invalid-js-file.json')} is not a valid JS file`, ); }); diff --git a/packages/utils/src/lib/git/git.commits-and-tags.int.test.ts b/packages/utils/src/lib/git/git.commits-and-tags.int.test.ts index d37b97533..9d64edcac 100644 --- a/packages/utils/src/lib/git/git.commits-and-tags.int.test.ts +++ b/packages/utils/src/lib/git/git.commits-and-tags.int.test.ts @@ -32,7 +32,7 @@ describe('getCurrentBranchOrTag', () => { it('getCurrentBranchOrTag should throw if no branch or tag is given', async () => { await expect( getCurrentBranchOrTag(currentBranchOrTagGitMock), - ).rejects.toThrow('No names found, cannot describe anything'); + ).rejects.toThrowError('No names found, cannot describe anything'); }); }); @@ -104,7 +104,7 @@ describe('getHashes', () => { describe('without a branch and commits', () => { it('should throw', async () => { - await expect(getHashes({}, gitMock)).rejects.toThrow( + await expect(getHashes({}, gitMock)).rejects.toThrowError( "your current branch 'main' does not have any commits yet", ); }); @@ -165,7 +165,7 @@ describe('getHashes', () => { it('should throw if "from" is undefined but "to" is defined', async () => { await expect( getHashes({ from: undefined, to: 'a' }, gitMock), - ).rejects.toThrow( + ).rejects.toThrowError( 'filter needs the "from" option defined to accept the "to" option.', ); }); diff --git a/packages/utils/src/lib/git/git.commits-and-tags.unit.test.ts b/packages/utils/src/lib/git/git.commits-and-tags.unit.test.ts index cf2cb89e4..797184b08 100644 --- a/packages/utils/src/lib/git/git.commits-and-tags.unit.test.ts +++ b/packages/utils/src/lib/git/git.commits-and-tags.unit.test.ts @@ -53,7 +53,7 @@ describe('filterLogs', () => { }); it('should throw for "to" without "from" filter', () => { - expect(() => filterLogs([], { to: 'e' })).toThrow( + expect(() => filterLogs([], { to: 'e' })).toThrowError( 'filter needs the "from" option defined to accept the "to" option.', ); }); @@ -163,7 +163,7 @@ describe('getSemverTags', () => { }); it('should throw if "from" is undefined but "to" is defined', async () => { - await expect(getSemverTags({ from: undefined, to: 'a' })).rejects.toThrow( + await expect(getSemverTags({ from: undefined, to: 'a' })).rejects.toThrowError( 'filter needs the "from" option defined to accept the "to" option', ); }); diff --git a/packages/utils/src/lib/git/git.int.test.ts b/packages/utils/src/lib/git/git.int.test.ts index 151cffdd0..8999b297f 100644 --- a/packages/utils/src/lib/git/git.int.test.ts +++ b/packages/utils/src/lib/git/git.int.test.ts @@ -82,7 +82,7 @@ describe('git utils in a git 
repo', () => { it('safeCheckout should throw if a given branch does not exist', async () => { await expect( safeCheckout('non-existing-branch', undefined, emptyGit), - ).rejects.toThrow( + ).rejects.toThrowError( "pathspec 'non-existing-branch' did not match any file(s) known to git", ); }); @@ -133,7 +133,7 @@ describe('git utils in a git repo', () => { }); it('safeCheckout should throw if history is dirty', async () => { - await expect(safeCheckout('master', undefined, emptyGit)).rejects.toThrow( + await expect(safeCheckout('master', undefined, emptyGit)).rejects.toThrowError( `Working directory needs to be clean before we you can proceed. Commit your local changes or stash them: \n ${JSON.stringify( { not_added: ['new-file.md'], diff --git a/packages/utils/src/lib/git/git.unit.test.ts b/packages/utils/src/lib/git/git.unit.test.ts index 240f7695f..8783ec0c8 100644 --- a/packages/utils/src/lib/git/git.unit.test.ts +++ b/packages/utils/src/lib/git/git.unit.test.ts @@ -11,7 +11,7 @@ describe('guardAgainstLocalChanges', () => { guardAgainstLocalChanges({ status: () => Promise.resolve({ files: [''] }), } as unknown as SimpleGit), - ).rejects.toThrow( + ).rejects.toThrowError( new GitStatusError({ files: [''] } as unknown as StatusResult), ); }); diff --git a/packages/utils/src/lib/logger.int.test.ts b/packages/utils/src/lib/logger.int.test.ts index d0dd327bb..44bf8444f 100644 --- a/packages/utils/src/lib/logger.int.test.ts +++ b/packages/utils/src/lib/logger.int.test.ts @@ -159,7 +159,7 @@ ${ansis.red('Failed to load config')} "ENOENT: no such file or directory, open '.code-pushup/eslint/results.json'", ); }), - ).rejects.toThrow( + ).rejects.toThrowError( "ENOENT: no such file or directory, open '.code-pushup/eslint/results.json'", ); expect(stdout).toBe( @@ -349,7 +349,7 @@ ${ansis.magenta('└')} ${ansis.green(`Total line coverage is ${ansis.bold('82%' expect(stdout).toBe(`${ansis.cyan('⠋')} Uploading report to portal`); - await expect(task).rejects.toThrow('GraphQL error: Invalid API key'); + await expect(task).rejects.toThrowError('GraphQL error: Invalid API key'); expect(stdout).toBe( `${ansis.red('✖')} Uploading report to portal → ${ansis.red('GraphQL error: Invalid API key')}\n`, @@ -502,7 +502,7 @@ ${ansis.green('✔')} Uploaded report to portal ${ansis.gray('(42 ms)')} expect(stdout).toBe(`${ansis.cyan('⠋')} Uploading report to portal`); vi.advanceTimersByTime(42); - await expect(task).rejects.toThrow('GraphQL error: Invalid API key'); + await expect(task).rejects.toThrowError('GraphQL error: Invalid API key'); expect(stdout).toBe( ` @@ -575,7 +575,7 @@ ${ansis.red('✖')} Uploading report to portal → ${ansis.red('GraphQL error: I `${ansis.cyan('⠋')} ${ansis.blue('$')} npx eslint . --format=json`, ); - await expect(command).rejects.toThrow('Process failed with exit code 1'); + await expect(command).rejects.toThrowError('Process failed with exit code 1'); expect(stdout).toBe( `${ansis.red('✖')} ${ansis.red('$')} npx eslint . --format=json\n`, @@ -831,7 +831,7 @@ ${ansis.cyan('-')} ${ansis.blue('$')} npx eslint . 
--format=json`, ); vi.advanceTimersToNextTimer(); - await expect(group).rejects.toThrow('Process failed with exit code 1'); + await expect(group).rejects.toThrowError('Process failed with exit code 1'); expect(stdout).toBe( ` @@ -920,7 +920,7 @@ ${ansis.red.bold('Cancelled by SIGINT')} return 'ESLint reported 0 problems'; }); - await expect(group).rejects.toThrow('Process failed with exit code 2'); + await expect(group).rejects.toThrowError('Process failed with exit code 2'); expect(ansis.strip(stdout)).toBe( ` @@ -950,7 +950,7 @@ ${ansis.red.bold('Cancelled by SIGINT')} await logger.group('Inner group', async () => 'Inner group complete'); return 'Outer group complete'; }), - ).rejects.toThrow( + ).rejects.toThrowError( 'Internal Logger error - nested groups are not supported', ); }); @@ -963,7 +963,7 @@ ${ansis.red.bold('Cancelled by SIGINT')} await logger.group('Some group', async () => 'Group completed'); return 'Async process completed'; }), - ).rejects.toThrow( + ).rejects.toThrowError( 'Internal Logger error - creating group in active spinner is not supported', ); }); @@ -976,7 +976,7 @@ ${ansis.red.bold('Cancelled by SIGINT')} logger.task('Task 1', async () => 'DONE'), logger.task('Task 2', async () => 'DONE'), ]), - ).rejects.toThrow( + ).rejects.toThrowError( 'Internal Logger error - concurrent spinners are not supported', ); }); @@ -990,7 +990,7 @@ ${ansis.red.bold('Cancelled by SIGINT')} await logger.task('Task 2', async () => 'DONE'); return 'DONE'; }), - ).rejects.toThrow( + ).rejects.toThrowError( 'Internal Logger error - concurrent spinners are not supported', ); }); diff --git a/packages/utils/src/lib/performance-observer.int.test.ts b/packages/utils/src/lib/performance-observer.int.test.ts index 33d93967f..f72118db6 100644 --- a/packages/utils/src/lib/performance-observer.int.test.ts +++ b/packages/utils/src/lib/performance-observer.int.test.ts @@ -31,7 +31,7 @@ describe('PerformanceObserverSink', () => { }); it('creates instance with required options', () => { - expect(() => new PerformanceObserverSink(options)).not.toThrow(); + expect(() => new PerformanceObserverSink(options)).not.toThrowError(); }); it('unsubscribe stops observing performance entries', async () => { diff --git a/packages/utils/src/lib/performance-observer.ts b/packages/utils/src/lib/performance-observer.ts index 79446e974..fced4fc56 100644 --- a/packages/utils/src/lib/performance-observer.ts +++ b/packages/utils/src/lib/performance-observer.ts @@ -3,8 +3,6 @@ import { PerformanceObserver, performance, } from 'node:perf_hooks'; -import { isEnvVarEnabled } from './env.js'; -import { PROFILER_DEBUG_ENV_VAR } from './profiler/constants.js'; import type { AppendableSink } from './wal.js'; /** @@ -122,14 +120,12 @@ export type PerformanceObserverOptions = { * @default DEFAULT_MAX_QUEUE_SIZE (10000) */ maxQueueSize?: number; /** - * Name of the environment variable to check for debug mode. - * When the env var is set to 'true', encode failures create performance marks for debugging. + * Whether debug mode is enabled for encode failures. + * When true, encode failures create performance marks for debugging. * - * @default 'CP_PROFILER_DEBUG' */ - debugEnvVar?: string; + debug: boolean; }; /** @@ -151,7 +147,7 @@ export type PerformanceObserverOptions = { * - Queue cleared after successful batch writes * * - Item Disposition Scenarios 💥 * - **Encode Failure**: ❌ Items lost when `encode()` throws. Creates perf mark if debug env var (specified by `debugEnvVar`) is set to 'true'.
+ * - **Encode Failure**: ❌ Items lost when `encode()` throws. Creates perf mark if the `debug` option is enabled. * - **Sink Write Failure**: 💾 Items stay in queue when sink write fails during flush * - **Sink Closed**: 💾 Items stay in queue when sink is closed during flush * - **Proactive Flush Throws**: 💾 Items stay in queue when `flush()` throws during threshold check @@ -197,6 +193,46 @@ export class PerformanceObserverSink { /** Whether debug mode is enabled for encode failures */ #debug: boolean; + private processPerformanceEntries(entries: PerformanceEntry[]) { + entries.forEach(entry => { + if (OBSERVED_TYPE_SET.has(entry.entryType as ObservedEntryType)) { + try { + const items = this.encode(entry); + items.forEach(item => { + // ❌ MAX QUEUE OVERFLOW + if (this.#queue.length >= this.#maxQueueSize) { + this.#dropped++; // Items are lost forever + return; + } + + if ( + this.#queue.length >= + this.#maxQueueSize - this.#flushThreshold + ) { + this.flush(); + } + this.#queue.push(item); + this.#addedSinceLastFlush++; + }); + } catch (error) { + // ❌ Encode failure: item lost forever as user has to fix encode function. + this.#dropped++; + if (this.#debug) { + try { + performance.mark(errorToPerfMark(error, entry)); + } catch { + // Ignore mark failures to prevent double errors + } + } + } + } + }); + + if (this.#addedSinceLastFlush >= this.#flushThreshold) { + this.flush(); + } + } + /** * Creates a new PerformanceObserverSink with the specified configuration. * @@ -210,7 +246,7 @@ export class PerformanceObserverSink { captureBufferedEntries, flushThreshold = DEFAULT_FLUSH_THRESHOLD, maxQueueSize = DEFAULT_MAX_QUEUE_SIZE, - debugEnvVar = PROFILER_DEBUG_ENV_VAR, + debug, } = options; this.#encodePerfEntry = encodePerfEntry; this.#sink = sink; @@ -218,14 +254,13 @@ export class PerformanceObserverSink { this.#maxQueueSize = maxQueueSize; validateFlushThreshold(flushThreshold, this.#maxQueueSize); this.#flushThreshold = flushThreshold; - this.#debug = isEnvVarEnabled(debugEnvVar); + this.#debug = debug; } /** * Returns whether debug mode is enabled for encode failures. * - * Debug mode is determined by the environment variable specified by `debugEnvVar` - * (defaults to 'CP_PROFILER_DEBUG'). When enabled, encode failures create + * Debug mode is set via the `debug` constructor option. When enabled, encode failures create * performance marks for debugging. * * @returns true if debug mode is enabled, false otherwise @@ -283,53 +318,26 @@ export class PerformanceObserverSink { } this.#observer = new PerformanceObserver(list => { - list.getEntries().forEach(entry => { - if (OBSERVED_TYPE_SET.has(entry.entryType as ObservedEntryType)) { - try { - const items = this.encode(entry); - items.forEach(item => { - // ❌ MAX QUEUE OVERFLOW - if (this.#queue.length >= this.#maxQueueSize) { - this.#dropped++; // Items are lost forever - return; - } - - if ( - this.#queue.length >= - this.#maxQueueSize - this.#flushThreshold - ) { - this.flush(); - } - this.#queue.push(item); - this.#addedSinceLastFlush++; - }); - } catch (error) { - // ❌ Encode failure: item lost forever as user has to fix encode function.
- this.#dropped++; - if (this.#debug) { - try { - performance.mark(errorToPerfMark(error, entry)); - } catch { - // Ignore mark failures to prevent double errors - } - } - } - } - }); - - if (this.#addedSinceLastFlush >= this.#flushThreshold) { - this.flush(); - } + this.processPerformanceEntries(list.getEntries()); }); + // When buffered mode is enabled, Node.js PerformanceObserver invokes + // the callback synchronously with all buffered entries before observe() returns. + // However, entries created before any observer existed may not be buffered by Node.js. + // We manually retrieve entries from the performance buffer using getEntriesByType() + // to capture entries recorded before this observer was created. + if (this.#buffered) { + const existingMarks = performance.getEntriesByType('mark'); + const existingMeasures = performance.getEntriesByType('measure'); + const allEntries = [...existingMarks, ...existingMeasures]; + this.processPerformanceEntries(allEntries); + } + this.#observer.observe({ entryTypes: OBSERVED_TYPES, - buffered: this.#buffered, + // @NOTE: For unknown reasons the buffered option does not replay these entries reliably, so we drain them manually above. + // buffered: this.#buffered, }); - - if (this.#buffered) { - this.flush(); - } } /** diff --git a/packages/utils/src/lib/performance-observer.unit.test.ts b/packages/utils/src/lib/performance-observer.unit.test.ts index 56c48b333..b37d63400 100644 --- a/packages/utils/src/lib/performance-observer.unit.test.ts +++ b/packages/utils/src/lib/performance-observer.unit.test.ts @@ -27,7 +27,7 @@ describe('validateFlushThreshold', () => { ({ flushThreshold }) => { expect(() => validateFlushThreshold(flushThreshold, DEFAULT_MAX_QUEUE_SIZE), - ).not.toThrow(); + ).not.toThrowError(); }, ); @@ -48,7 +48,7 @@ describe('validateFlushThreshold', () => { ({ flushThreshold, expectedError }) => { expect(() => validateFlushThreshold(flushThreshold, DEFAULT_MAX_QUEUE_SIZE), - ).toThrow(expectedError); + ).toThrowError(expectedError); }, ); }); @@ -69,6 +69,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry, flushThreshold: 1, + debug: false, }; performance.clearMarks(); @@ -76,7 +77,7 @@ }); it('creates instance with required options without starting to observe', () => { - expect(() => new PerformanceObserverSink(options)).not.toThrow(); + expect(() => new PerformanceObserverSink(options)).not.toThrowError(); expect(MockPerformanceObserver.instances).toHaveLength(0); }); @@ -86,8 +87,9 @@ new PerformanceObserverSink({ sink, encodePerfEntry, + debug: false, }), - ).not.toThrow(); + ).not.toThrowError(); expect(MockPerformanceObserver.instances).toHaveLength(0); }); @@ -98,8 +100,9 @@ ...options, captureBufferedEntries: true, flushThreshold: 10, + debug: false, }), - ).not.toThrow(); + ).not.toThrowError(); expect(MockPerformanceObserver.instances).toHaveLength(0); }); @@ -119,7 +122,7 @@ ...options, flushThreshold, }), - ).toThrow(expectedError); + ).toThrowError(expectedError); }, ); @@ -146,35 +149,6 @@ ); }); - it('internal PerformanceObserver should observe buffered by default', () => { - const observer = new PerformanceObserverSink(options); - - observer.subscribe(); - expect( - MockPerformanceObserver.lastInstance()?.observe, - ).toHaveBeenCalledWith( - expect.objectContaining({ - buffered: true, - }), - ); - }); - -
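// Aside: a minimal standalone sketch of the manual buffered-drain pattern that the
// rewritten subscribe() above relies on, since `buffered: true` did not reliably replay
// pre-existing entries. Only documented node:perf_hooks APIs are used; the
// `handleEntries` callback is a hypothetical stand-in for processPerformanceEntries().
import { PerformanceObserver, performance } from 'node:perf_hooks';
import type { PerformanceEntry } from 'node:perf_hooks';

function observeWithManualDrain(
  handleEntries: (entries: PerformanceEntry[]) => void,
): PerformanceObserver {
  // 1. Drain marks/measures that were recorded before this observer existed.
  handleEntries([
    ...performance.getEntriesByType('mark'),
    ...performance.getEntriesByType('measure'),
  ]);
  // 2. New entries arrive through the observer callback from here on.
  const observer = new PerformanceObserver(list => handleEntries(list.getEntries()));
  observer.observe({ entryTypes: ['mark', 'measure'] });
  return observer;
}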
it('internal PerformanceObserver should observe buffered if buffered option is provided', () => { - const observer = new PerformanceObserverSink({ - ...options, - captureBufferedEntries: true, - }); - - observer.subscribe(); - expect( - MockPerformanceObserver.lastInstance()?.observe, - ).toHaveBeenCalledWith( - expect.objectContaining({ - buffered: true, - }), - ); - }); - it('internal PerformanceObserver should process observed entries', () => { const observer = new PerformanceObserverSink({ ...options, @@ -301,10 +275,11 @@ describe('PerformanceObserverSink', () => { const observer = new PerformanceObserverSink({ sink, encodePerfEntry, + debug: false, }); - expect(() => observer.flush()).not.toThrow(); - expect(() => observer.flush()).not.toThrow(); + expect(() => observer.flush()).not.toThrowError(); + expect(() => observer.flush()).not.toThrowError(); expect(sink.getWrittenItems()).toStrictEqual([]); }); @@ -313,6 +288,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry, flushThreshold: 10, + debug: false, }); sink.open(); observer.subscribe(); @@ -326,8 +302,8 @@ describe('PerformanceObserverSink', () => { }); sink.close(); - expect(() => observer.flush()).not.toThrow(); - expect(() => observer.flush()).not.toThrow(); + expect(() => observer.flush()).not.toThrowError(); + expect(() => observer.flush()).not.toThrowError(); expect(observer.getStats()).toHaveProperty('queued', 1); observer.unsubscribe(); @@ -353,48 +329,22 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry: failingEncode, flushThreshold: 10, + debug: false, }); observer.subscribe(); const mockObserver = MockPerformanceObserver.lastInstance(); performance.mark('test-mark'); - expect(() => mockObserver?.triggerObserverCallback()).not.toThrow(); + expect(() => mockObserver?.triggerObserverCallback()).not.toThrowError(); const stats = observer.getStats(); expect(stats.dropped).toBe(1); expect(stats.queued).toBe(0); }); - describe('debug mode with env var', () => { - const originalEnv = process.env.CP_PROFILER_DEBUG; - - beforeEach(() => { - // Restore original env before each test - if (originalEnv === undefined) { - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; - } else { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = originalEnv; - } - }); - - afterEach(() => { - // Restore original env after each test - if (originalEnv === undefined) { - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; - } else { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = originalEnv; - } - }); - - it('creates performance mark when encode fails and debug mode is enabled via env var', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - + describe('debug mode', () => { + it('creates performance mark when encode fails and debug mode is enabled', () => { const failingEncode = vi.fn(() => { throw new Error('EncodeError'); }); @@ -403,6 +353,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry: failingEncode, flushThreshold: 10, + debug: true, }); observer.subscribe(); @@ -423,9 +374,6 @@ describe('PerformanceObserverSink', () => { }); it('does not create performance mark when encode fails and debug mode is disabled', () => { - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; - const failingEncode = vi.fn(() => { throw new 
Error('EncodeError'); }); @@ -434,6 +382,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry: failingEncode, flushThreshold: 10, + debug: false, }); performance.clearMarks(); @@ -454,9 +403,6 @@ describe('PerformanceObserverSink', () => { }); it('handles encode errors for unnamed entries correctly', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const failingEncode = vi.fn(() => { throw new Error('EncodeError'); }); @@ -465,6 +411,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry: failingEncode, flushThreshold: 10, + debug: true, }); observer.subscribe(); @@ -482,9 +429,6 @@ describe('PerformanceObserverSink', () => { }); it('handles non-Error objects thrown from encode function', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const failingEncode = vi.fn(() => { throw 'String error'; }); @@ -493,6 +437,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry: failingEncode, flushThreshold: 10, + debug: true, }); observer.subscribe(); @@ -527,6 +472,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry: failingEncode, flushThreshold: 10, + debug: false, }); observer.subscribe(); @@ -556,6 +502,7 @@ describe('PerformanceObserverSink', () => { sink: failingSink as any, encodePerfEntry, flushThreshold: 10, + debug: false, }); observer.subscribe(); @@ -569,7 +516,7 @@ describe('PerformanceObserverSink', () => { expect(statsBefore.queued).toBe(1); // flush should not throw, but failed items stay in queue for retry - expect(() => observer.flush()).not.toThrow(); + expect(() => observer.flush()).not.toThrowError(); const statsAfter = observer.getStats(); expect(statsAfter.dropped).toBe(0); // Items not dropped, kept for retry @@ -582,6 +529,7 @@ describe('PerformanceObserverSink', () => { encodePerfEntry, maxQueueSize: 20, flushThreshold: 10, + debug: false, }); expect(observer.getStats()).toStrictEqual( @@ -597,6 +545,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry, flushThreshold: 10, + debug: false, }); observer.subscribe(); @@ -618,6 +567,7 @@ describe('PerformanceObserverSink', () => { encodePerfEntry, maxQueueSize: smallQueueSize, flushThreshold: smallQueueSize, + debug: false, }); const flushSpy = vi.spyOn(observer, 'flush').mockImplementation(() => {}); @@ -646,6 +596,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry, flushThreshold: 2, + debug: false, }); observer.subscribe(); @@ -682,6 +633,7 @@ describe('PerformanceObserverSink', () => { encodePerfEntry: (entry: PerformanceEntry) => [ `${entry.name}:${entry.duration}`, ], + debug: false, }); observer.subscribe(); @@ -700,6 +652,7 @@ describe('PerformanceObserverSink', () => { sink, encodePerfEntry, flushThreshold: 10, + debug: false, }); expect(observer.getStats().addedSinceLastFlush).toBe(0); @@ -735,75 +688,28 @@ describe('PerformanceObserverSink', () => { }); describe('debug getter', () => { - const originalEnv = process.env.CP_PROFILER_DEBUG; - - beforeEach(() => { - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; - }); - - afterEach(() => { - if (originalEnv === undefined) { - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; - } else { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = originalEnv; - } - }); - - it('returns false when debug env var is not set', () => { - 
const observer = new PerformanceObserverSink(options); - - expect(observer.debug).toBeFalse(); - }); - - it('returns true when debug env var is set to "true"', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - - const observer = new PerformanceObserverSink(options); - - expect(observer.debug).toBeTrue(); - }); - - it('returns false when debug env var is set to a value other than "true"', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'false'; - - const observer = new PerformanceObserverSink(options); - - expect(observer.debug).toBeFalse(); - }); - - it('returns false when debug env var is set to empty string', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = ''; - - const observer = new PerformanceObserverSink(options); + it('returns false when debug is disabled', () => { + const observer = new PerformanceObserverSink({ + ...options, + debug: false, + }); expect(observer.debug).toBeFalse(); }); - it('respects custom debugEnvVar option', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CUSTOM_DEBUG_VAR = 'true'; - + it('returns true when debug is enabled', () => { const observer = new PerformanceObserverSink({ ...options, - debugEnvVar: 'CUSTOM_DEBUG_VAR', + debug: true, }); expect(observer.debug).toBeTrue(); - - // eslint-disable-next-line functional/immutable-data - delete process.env.CUSTOM_DEBUG_VAR; }); - it('returns false when custom debugEnvVar is not set', () => { + it('returns false when debug is disabled via options', () => { const observer = new PerformanceObserverSink({ ...options, - debugEnvVar: 'CUSTOM_DEBUG_VAR', + debug: false, }); expect(observer.debug).toBeFalse(); diff --git a/packages/utils/src/lib/plugin-url-config.unit.test.ts b/packages/utils/src/lib/plugin-url-config.unit.test.ts index 475e1f3e4..dd6916123 100644 --- a/packages/utils/src/lib/plugin-url-config.unit.test.ts +++ b/packages/utils/src/lib/plugin-url-config.unit.test.ts @@ -144,12 +144,12 @@ describe('pluginUrlContextSchema', () => { [{ urlCount: 2 }, /expected record/i], [{ urlCount: 2, weights: { 1: 1 } }, /weights count must match/i], ])('should throw error for invalid context: %j', (pattern, expectedError) => { - expect(() => pluginUrlContextSchema.parse(pattern)).toThrow(expectedError); + expect(() => pluginUrlContextSchema.parse(pattern)).toThrowError(expectedError); }); it('should accept valid context', () => { expect(() => pluginUrlContextSchema.parse({ urlCount: 2, weights: { 1: 1, 2: 1 } }), ).not.toThrowError(); }); }); diff --git a/packages/utils/src/lib/process-id.ts b/packages/utils/src/lib/process-id.ts new file mode 100644 index 000000000..7199729a0 --- /dev/null +++ b/packages/utils/src/lib/process-id.ts @@ -0,0 +1,141 @@ +import process from 'node:process'; +import { threadId } from 'node:worker_threads'; + +/** + * Counter interface for generating sequential instance IDs. + * Encapsulates increment logic within the counter implementation. + */ +export type Counter = { + /** + * Returns the next counter value and increments the internal state. + * @returns The next counter value + */ + next: () => number; +}; + +/** + * Base regex pattern for time ID format: yyyymmdd-hhmmss-ms + */ +export const TIME_ID_BASE = /\d{8}-\d{6}-\d{3}/; + +/** + * Regex patterns for validating process and instance ID formats. + * All patterns use strict anchors (^ and $) to ensure complete matches.
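+ * + * @example + * // Checks against the example values documented on each pattern below: + * ID_PATTERNS.TIME_ID.test('20240101-120000-000'); // true + * ID_PATTERNS.INSTANCE_ID.test('20240101-120000-000.12345.1.1'); // true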
+ */ +export const ID_PATTERNS = Object.freeze({ + /** + * Time ID / Run ID format: yyyymmdd-hhmmss-ms + * Example: "20240101-120000-000" + * Used by: getUniqueTimeId() + */ + TIME_ID: new RegExp(`^${TIME_ID_BASE.source}$`), + /** + * Group ID format: alias of TIME_ID by convention; semantically identifies a group of related instances + * Example: "20240101-120000-000" + * Used by: grouping related instances by time + */ + GROUP_ID: new RegExp(`^${TIME_ID_BASE.source}$`), + /** + * Process/Thread ID format: timeId-pid-threadId + * Example: "20240101-120000-000-12345-1" + * Used by: getUniqueProcessThreadId() + */ + PROCESS_THREAD_ID: new RegExp(`^${TIME_ID_BASE.source}-\\d+-\\d+$`), + /** + * Instance ID format: timeId.pid.threadId.counter + * Example: "20240101-120000-000.12345.1.1" + * Used by: getUniqueInstanceId() + */ + INSTANCE_ID: new RegExp(`^${TIME_ID_BASE.source}\\.\\d+\\.\\d+\\.\\d+$`), + /** @deprecated Use INSTANCE_ID instead */ + SHARD_ID: new RegExp(`^${TIME_ID_BASE.source}\\.\\d+\\.\\d+\\.\\d+$`), + /** @deprecated Use TIME_ID instead */ + READABLE_DATE: new RegExp(`^${TIME_ID_BASE.source}$`), +} as const); + +/** + * Generates a unique run ID. + * This ID identifies a run/execution with a sortable, human-readable date string. + * Resolution is one millisecond, so calls within the same millisecond can produce the same ID. + * Format: yyyymmdd-hhmmss-ms + * Example: "20240101-120000-000" + * + * @returns A unique run ID string in readable date format + */ +export function getUniqueTimeId(): string { + return sortableReadableDateString( + Math.floor(performance.timeOrigin + performance.now()), + ); +} + +/** + * Generates a unique process/thread ID. + * This ID uniquely identifies a process/thread execution and prevents race conditions when running + * the same plugin for multiple projects in parallel. + * Format: timeId-pid-threadId + * Example: "20240101-120000-000-12345-1" + * + * @returns A unique ID string combining timestamp, process ID, and thread ID + */ +export function getUniqueProcessThreadId(): string { + return `${getUniqueTimeId()}-${process.pid}-${threadId}`; +} + +/** + * Generates a unique instance ID based on performance time origin, process ID, thread ID, and instance count. + * This ID uniquely identifies an instance across processes and threads. + * Format: timeId.pid.threadId.counter + * Example: "20240101-120000-000.12345.1.1" + * + * @param counter - Counter that provides the next instance count value + * @returns A unique ID string combining timestamp, process ID, thread ID, and counter + */ +export function getUniqueInstanceId(counter: Counter): string { + return `${getUniqueTimeId()}.${process.pid}.${threadId}.${counter.next()}`; +} + +/** + * Generates a unique instance ID and updates a static class property. + * Encapsulates the read → increment → write pattern safely within a single execution context. + * + * @param getCount - Function that returns the current instance count + * @param setCount - Function that sets the new instance count + * @returns A unique ID string combining timestamp, process ID, thread ID, and counter + */ +export function getUniqueInstanceIdAndUpdate( + getCount: () => number, + setCount: (value: number) => void, +): string { + // eslint-disable-next-line functional/no-let + let value = getCount(); + const counter: Counter = { + next() { + return ++value; + }, + }; + const id = getUniqueInstanceId(counter); + setCount(value); + return id; +} + +/** + * Converts a timestamp in milliseconds to a sortable, human-readable date string.
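+ * Note: formatting relies on local Date getters, so the same timestamp renders differently depending on the process timezone.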
+ * Format: yyyymmdd-hhmmss-ms + * Example: "20240101-120000-000" + * + * @param timestampMs - Timestamp in milliseconds + * @returns A sortable date string in yyyymmdd-hhmmss-ms format + */ +export function sortableReadableDateString(timestampMs: number): string { + const date = new Date(timestampMs); + const MILLISECONDS_PER_SECOND = 1000; + const yyyy = date.getFullYear(); + const mm = String(date.getMonth() + 1).padStart(2, '0'); + const dd = String(date.getDate()).padStart(2, '0'); + const hh = String(date.getHours()).padStart(2, '0'); + const min = String(date.getMinutes()).padStart(2, '0'); + const ss = String(date.getSeconds()).padStart(2, '0'); + // eslint-disable-next-line @typescript-eslint/no-magic-numbers + const ms = String(timestampMs % MILLISECONDS_PER_SECOND).padStart(3, '0'); + + return `${yyyy}${mm}${dd}-${hh}${min}${ss}-${ms}`; +} diff --git a/packages/utils/src/lib/process-id.unit.test.ts b/packages/utils/src/lib/process-id.unit.test.ts new file mode 100644 index 000000000..91b0c2a7a --- /dev/null +++ b/packages/utils/src/lib/process-id.unit.test.ts @@ -0,0 +1,71 @@ +import { ID_PATTERNS, getUniqueTimeId } from './process-id.js'; +import { getShardId } from './wal-sharded.js'; + +describe('getShardId (formerly getUniqueReadableInstanceId)', () => { + it('should generate shard ID with readable timestamp', () => { + const result = getShardId(); + + expect(result).toMatch(ID_PATTERNS.INSTANCE_ID); + expect(result).toStartWith('20231114-221320-000.'); + }); + + it('should generate different shard IDs for different calls', () => { + const result1 = getShardId(); + const result2 = getShardId(); + + expect(result1).not.toBe(result2); + expect(result1).toStartWith('20231114-221320-000.'); + expect(result2).toStartWith('20231114-221320-000.'); + }); + + it('should handle zero values', () => { + const result = getShardId(); + expect(result).toStartWith('20231114-221320-000.'); + }); + + it('should handle negative timestamps', () => { + const result = getShardId(); + + expect(result).toStartWith('20231114-221320-000.'); + }); + + it('should handle large timestamps', () => { + const result = getShardId(); + + expect(result).toStartWith('20231114-221320-000.'); + }); + + it('should generate incrementing counter', () => { + const result1 = getShardId(); + const result2 = getShardId(); + + const parts1 = result1.split('.'); + const parts2 = result2.split('.'); + const counter1 = parts1.at(-1) as string; + const counter2 = parts2.at(-1) as string; + + expect(Number.parseInt(counter1, 10)).toBe( + Number.parseInt(counter2, 10) - 1, + ); + }); +}); + +describe('getUniqueTimeId (formerly getUniqueRunId)', () => { + it('should work with mocked timeOrigin', () => { + const result = getUniqueTimeId(); + + expect(result).toBe('20231114-221320-000'); + expect(result).toMatch(ID_PATTERNS.GROUP_ID); + }); + + it('should generate new ID on each call (not idempotent)', () => { + const result1 = getUniqueTimeId(); + const result2 = getUniqueTimeId(); + + // Note: getUniqueTimeId is not idempotent - it generates a new ID each call + // based on current time, so results will be different + expect(result1).toMatch(ID_PATTERNS.GROUP_ID); + expect(result2).toMatch(ID_PATTERNS.GROUP_ID); + // They may be the same if called within the same millisecond, but generally different + }); +}); diff --git a/packages/utils/src/lib/profiler/__snapshots__/buffered-test.json b/packages/utils/src/lib/profiler/__snapshots__/buffered-test.json new file mode 100644 index 000000000..09b949711 --- /dev/null +++ 
b/packages/utils/src/lib/profiler/__snapshots__/buffered-test.json @@ -0,0 +1 @@ +{"traceEvents":[{"cat":"devtools.timeline","ph":"i","name":"TracingStartedInBrowser","pid":10001,"tid":1,"ts":1700000005000000,"args":{"data":{"frameTreeNodeId":1000101,"frames":[{"frame":"FRAME0P10001T1","isInPrimaryMainFrame":true,"isOutermostMainFrame":true,"name":"","processId":10001,"url":"generated-trace"}],"persistentIds":true}}},{"cat":"devtools.timeline","pid":10001,"tid":1,"ts":1700000005000000,"ph":"X","name":"[trace padding start]","dur":20000,"args":{}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000100,"name":"write-buffered-j-jl:measure:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000200,"name":"write-buffered-j-jl:measure","ph":"b","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\",\"tooltipText\":\"Buffered sync measurement returned :\\\"sync success\\\"\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000300,"name":"write-buffered-j-jl:measure","ph":"e","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\",\"tooltipText\":\"Buffered sync measurement returned :\\\"sync success\\\"\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000400,"name":"write-buffered-j-jl:measure:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000500,"name":"write-buffered-j-jl:async-measure:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000600,"name":"write-buffered-j-jl:async-measure","ph":"b","id2":{"local":"0x2"},"args":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\",\"tooltipText\":\"sync measurement returned :\\\"async success\\\"\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000700,"name":"write-buffered-j-jl:async-measure","ph":"e","id2":{"local":"0x2"},"args":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\",\"tooltipText\":\"sync measurement returned :\\\"async success\\\"\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000800,"name":"write-buffered-j-jl:async-measure:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"track-entry\",\"track\":\"Buffered Track\",\"trackGroup\":\"Buffered Track\",\"color\":\"tertiary\"}}"}}},{"cat":"devtools.timeline","pid":10001,"tid":1,"ts":1700000005000900,"ph":"X","name":"[trace padding end]","dur":20000,"args":{}}],"metadata":{"source":"DevTools","startTime":"2026-01-28T14:29:27.995Z","hardwareConcurrency":1,"dataOrigin":"TraceEvents"}} \ No newline at end of file diff --git a/packages/utils/src/lib/profiler/__snapshots__/comprehensive-stats-trace-events.jsonl 
b/packages/utils/src/lib/profiler/__snapshots__/comprehensive-stats-trace-events.jsonl deleted file mode 100644 index 5583ed827..000000000 --- a/packages/utils/src/lib/profiler/__snapshots__/comprehensive-stats-trace-events.jsonl +++ /dev/null @@ -1,8 +0,0 @@ -{"cat":"blink.user_timing","ph":"i","name":"stats-profiler:operation-1:start","pid":10001,"tid":1,"ts":1700000005000000,"args":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","ph":"b","name":"stats-profiler:operation-1","id2":{"local":"0x1"},"pid":10001,"tid":1,"ts":1700000005000001,"args":{"data":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}}} -{"cat":"blink.user_timing","ph":"e","name":"stats-profiler:operation-1","id2":{"local":"0x1"},"pid":10001,"tid":1,"ts":1700000005000002,"args":{"data":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}}} -{"cat":"blink.user_timing","ph":"i","name":"stats-profiler:operation-1:end","pid":10001,"tid":1,"ts":1700000005000003,"args":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","ph":"i","name":"stats-profiler:operation-2:start","pid":10001,"tid":1,"ts":1700000005000004,"args":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","ph":"b","name":"stats-profiler:operation-2","id2":{"local":"0x2"},"pid":10001,"tid":1,"ts":1700000005000005,"args":{"data":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}}} -{"cat":"blink.user_timing","ph":"e","name":"stats-profiler:operation-2","id2":{"local":"0x2"},"pid":10001,"tid":1,"ts":1700000005000006,"args":{"data":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}}} -{"cat":"blink.user_timing","ph":"i","name":"stats-profiler:operation-2:end","pid":10001,"tid":1,"ts":1700000005000007,"args":{"detail":"{\"devtools\":{\"track\":\"Stats\",\"dataType\":\"track-entry\"}}"}} diff --git a/packages/utils/src/lib/profiler/__snapshots__/custom-tracks-trace-events.jsonl b/packages/utils/src/lib/profiler/__snapshots__/custom-tracks-trace-events.jsonl deleted file mode 100644 index 43f83dbdb..000000000 --- a/packages/utils/src/lib/profiler/__snapshots__/custom-tracks-trace-events.jsonl +++ /dev/null @@ -1,4 +0,0 @@ -{"cat":"blink.user_timing","ph":"i","name":"api-server:user-lookup:start","pid":10001,"tid":1,"ts":1700000005000000,"args":{"detail":"{\"devtools\":{\"track\":\"cache\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","ph":"b","name":"api-server:user-lookup","id2":{"local":"0x1"},"pid":10001,"tid":1,"ts":1700000005000001,"args":{"data":{"detail":"{\"devtools\":{\"track\":\"cache\",\"dataType\":\"track-entry\"}}"}}} -{"cat":"blink.user_timing","ph":"e","name":"api-server:user-lookup","id2":{"local":"0x1"},"pid":10001,"tid":1,"ts":1700000005000002,"args":{"data":{"detail":"{\"devtools\":{\"track\":\"cache\",\"dataType\":\"track-entry\"}}"}}} -{"cat":"blink.user_timing","ph":"i","name":"api-server:user-lookup:end","pid":10001,"tid":1,"ts":1700000005000003,"args":{"detail":"{\"devtools\":{\"track\":\"cache\",\"dataType\":\"track-entry\"}}"}} diff --git a/packages/utils/src/lib/profiler/__snapshots__/debugMode-test.json b/packages/utils/src/lib/profiler/__snapshots__/debugMode-test.json new file mode 100644 index 000000000..a1791c7d5 --- /dev/null +++ b/packages/utils/src/lib/profiler/__snapshots__/debugMode-test.json @@ -0,0 +1 @@ 
+{"traceEvents":[{"cat":"devtools.timeline","ph":"i","name":"TracingStartedInBrowser","pid":10001,"tid":1,"ts":1700000005000000,"args":{"data":{"frameTreeNodeId":1000101,"frames":[{"frame":"FRAME0P10001T1","isInPrimaryMainFrame":true,"isOutermostMainFrame":true,"name":"","processId":10001,"url":"generated-trace"}],"persistentIds":true}}},{"cat":"devtools.timeline","pid":10001,"tid":1,"ts":1700000005000000,"ph":"X","name":"[trace padding start]","dur":20000,"args":{}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000100,"name":"Enable profiler","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"marker\",\"tooltipText\":\"set enable to true\"}}"}}},{"cat":"devtools.timeline","pid":10001,"tid":1,"ts":1700000005000200,"ph":"X","name":"[trace padding end]","dur":20000,"args":{}}],"metadata":{"source":"DevTools","startTime":"2026-01-28T14:29:27.995Z","hardwareConcurrency":1,"dataOrigin":"TraceEvents"}} \ No newline at end of file diff --git a/packages/utils/src/lib/profiler/__snapshots__/entries-write-to-shard.json b/packages/utils/src/lib/profiler/__snapshots__/entries-write-to-shard.json new file mode 100644 index 000000000..bd6cfdc70 --- /dev/null +++ b/packages/utils/src/lib/profiler/__snapshots__/entries-write-to-shard.json @@ -0,0 +1 @@ +{"traceEvents":[{"cat":"devtools.timeline","ph":"i","name":"TracingStartedInBrowser","pid":10001,"tid":1,"ts":1700000005000000,"args":{"data":{"frameTreeNodeId":1000101,"frames":[{"frame":"FRAME0P10001T1","isInPrimaryMainFrame":true,"isOutermostMainFrame":true,"name":"","processId":10001,"url":"generated-trace"}],"persistentIds":true}}},{"cat":"devtools.timeline","pid":10001,"tid":1,"ts":1700000005000000,"ph":"X","name":"[trace padding start]","dur":20000,"args":{}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000100,"name":"Enable profiler","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"marker\",\"tooltipText\":\"set enable to 
true\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000200,"name":"write-j-jl:sync-measure:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000300,"name":"write-j-jl:sync-measure","ph":"b","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000400,"name":"write-j-jl:sync-measure","ph":"e","id2":{"local":"0x1"},"args":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000500,"name":"write-j-jl:sync-measure:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000600,"name":"write-j-jl:async-measure:start","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000700,"name":"write-j-jl:async-measure","ph":"b","id2":{"local":"0x2"},"args":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000800,"name":"write-j-jl:async-measure","ph":"e","id2":{"local":"0x2"},"args":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005000900,"name":"write-j-jl:async-measure:end","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"track\":\"int-test-track\",\"dataType\":\"track-entry\"}}"}}},{"cat":"blink.user_timing","pid":10001,"tid":1,"ts":1700000005001000,"name":"Disable profiler","ph":"I","args":{"data":{"detail":"{\"devtools\":{\"dataType\":\"marker\",\"tooltipText\":\"set enable to false\"}}"}}},{"cat":"devtools.timeline","pid":10001,"tid":1,"ts":1700000005001100,"ph":"X","name":"[trace padding end]","dur":20000,"args":{}}],"metadata":{"source":"DevTools","startTime":"2026-01-28T14:29:27.995Z","hardwareConcurrency":1,"dataOrigin":"TraceEvents"}} \ No newline at end of file diff --git a/packages/utils/src/lib/profiler/__snapshots__/entries-write-to-shard.jsonl b/packages/utils/src/lib/profiler/__snapshots__/entries-write-to-shard.jsonl new file mode 100644 index 000000000..cee32fc3e --- /dev/null +++ b/packages/utils/src/lib/profiler/__snapshots__/entries-write-to-shard.jsonl @@ -0,0 +1,186 @@ +[ + { + "args": { + "data": { + "detail": { + "devtools": { + "dataType": "marker", + "tooltipText": "set enable to true", + }, + }, + }, + }, + "cat": "blink.user_timing", + "name": "Enable profiler", + "ph": "I", + "pid": 10001, + "tid": 1, + "ts": 1700000005000000, + }, + { + "args": { + "data": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + }, + "cat": "blink.user_timing", + "name": "write-j-jl:sync-measure:start", + "ph": "I", + "pid": 10001, + "tid": 1, + "ts": 1700000005000100, + }, + { + "args": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + "cat": "blink.user_timing", + "id2": { + "local": "0x1", + }, + "name": "write-j-jl:sync-measure", + "ph": "b", + "pid": 10001, + "tid": 1, + "ts": 1700000005000200, + }, + { + "args": { + "detail": { + "devtools": { + "dataType": 
"track-entry", + "track": "int-test-track", + }, + }, + }, + "cat": "blink.user_timing", + "id2": { + "local": "0x1", + }, + "name": "write-j-jl:sync-measure", + "ph": "e", + "pid": 10001, + "tid": 1, + "ts": 1700000005000300, + }, + { + "args": { + "data": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + }, + "cat": "blink.user_timing", + "name": "write-j-jl:sync-measure:end", + "ph": "I", + "pid": 10001, + "tid": 1, + "ts": 1700000005000400, + }, + { + "args": { + "data": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + }, + "cat": "blink.user_timing", + "name": "write-j-jl:async-measure:start", + "ph": "I", + "pid": 10001, + "tid": 1, + "ts": 1700000005000500, + }, + { + "args": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + "cat": "blink.user_timing", + "id2": { + "local": "0x2", + }, + "name": "write-j-jl:async-measure", + "ph": "b", + "pid": 10001, + "tid": 1, + "ts": 1700000005000600, + }, + { + "args": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + "cat": "blink.user_timing", + "id2": { + "local": "0x2", + }, + "name": "write-j-jl:async-measure", + "ph": "e", + "pid": 10001, + "tid": 1, + "ts": 1700000005000700, + }, + { + "args": { + "data": { + "detail": { + "devtools": { + "dataType": "track-entry", + "track": "int-test-track", + }, + }, + }, + }, + "cat": "blink.user_timing", + "name": "write-j-jl:async-measure:end", + "ph": "I", + "pid": 10001, + "tid": 1, + "ts": 1700000005000800, + }, + { + "args": { + "data": { + "detail": { + "devtools": { + "dataType": "marker", + "tooltipText": "set enable to false", + }, + }, + }, + }, + "cat": "blink.user_timing", + "name": "Disable profiler", + "ph": "I", + "pid": 10001, + "tid": 1, + "ts": 1700000005000900, + }, +] \ No newline at end of file diff --git a/packages/utils/src/lib/profiler/__snapshots__/sharded-path-trace-events.jsonl b/packages/utils/src/lib/profiler/__snapshots__/sharded-path-trace-events.jsonl deleted file mode 100644 index 2a30bcd0a..000000000 --- a/packages/utils/src/lib/profiler/__snapshots__/sharded-path-trace-events.jsonl +++ /dev/null @@ -1,4 +0,0 @@ -{"cat":"blink.user_timing","ph":"i","name":"write-test:test-operation:start","pid":10001,"tid":1,"ts":1700000005000000,"args":{"detail":"{\"devtools\":{\"track\":\"Test\",\"dataType\":\"track-entry\"}}"}} -{"cat":"blink.user_timing","ph":"b","name":"write-test:test-operation","id2":{"local":"0x1"},"pid":10001,"tid":1,"ts":1700000005000001,"args":{"data":{"detail":"{\"devtools\":{\"track\":\"Test\",\"dataType\":\"track-entry\"}}"}}} -{"cat":"blink.user_timing","ph":"e","name":"write-test:test-operation","id2":{"local":"0x1"},"pid":10001,"tid":1,"ts":1700000005000002,"args":{"data":{"detail":"{\"devtools\":{\"track\":\"Test\",\"dataType\":\"track-entry\"}}"}}} -{"cat":"blink.user_timing","ph":"i","name":"write-test:test-operation:end","pid":10001,"tid":1,"ts":1700000005000003,"args":{"detail":"{\"devtools\":{\"track\":\"Test\",\"dataType\":\"track-entry\"}}"}} diff --git a/packages/utils/src/lib/profiler/constants.ts b/packages/utils/src/lib/profiler/constants.ts index c0e515787..03c60f7e0 100644 --- a/packages/utils/src/lib/profiler/constants.ts +++ b/packages/utils/src/lib/profiler/constants.ts @@ -12,16 +12,48 @@ export const PROFILER_ENABLED_ENV_VAR = 'CP_PROFILING'; * When set to 'true', profiler state transitions create 
performance marks for debugging. * * @example - * CP_PROFILER_DEBUG=true npm run dev + * DEBUG=true npm run dev */ -export const PROFILER_DEBUG_ENV_VAR = 'CP_PROFILER_DEBUG'; +export const PROFILER_DEBUG_ENV_VAR = 'DEBUG'; /** - * Environment variable name for setting the Sharded WAL Coordinator ID. + * Environment variable name for setting the Profiler Sharder ID. * This ID is used to identify the coordinator instance in a sharded Write-Ahead Logging setup. * * @example - * CP_SHARDED_WAL_COORDINATOR_ID=coordinator-1 npm run dev + * CP_PROFILER_SHARDER_ID=coordinator-1 npm run dev */ -export const SHARDED_WAL_COORDINATOR_ID_ENV_VAR = - 'CP_SHARDED_WAL_COORDINATOR_ID'; +export const PROFILER_SHARDER_ID_ENV_VAR = 'CP_PROFILER_SHARDER_ID'; + +/** + * Default output directory for persisted profiler data. + * Matches the default persist output directory from models. + */ +export const PROFILER_PERSIST_OUT_DIR = 'tmp/profiles'; + +/** + * Environment variable name for setting the output directory for profiler data. + * When set, profiler data is written to the specified directory. + * + * @example + * CP_PROFILER_OUT_DIR=/path/to/output npm run dev + */ +export const PROFILER_OUT_DIR_ENV_VAR = 'CP_PROFILER_OUT_DIR'; + +/** + * Environment variable name for setting the measure name for profiler data. + * When set, the measure name is used to identify the profiler data. + */ +export const PROFILER_MEASURE_NAME_ENV_VAR = 'CP_PROFILER_MEASURE_NAME'; + +/** + * Default base name for WAL files. + * Used as the base name for sharded WAL files (e.g., "trace"). + */ +export const PROFILER_OUT_BASENAME = 'trace'; + +/** + * Default prefix for performance measures the profiler creates in debug mode (e.g., "debug"). + */ +export const PROFILER_DEBUG_MEASURE_PREFIX = 'debug'; diff --git a/packages/utils/src/lib/profiler/profiler-node.int.test.ts b/packages/utils/src/lib/profiler/profiler-node.int.test.ts index 1b903ee5a..f4727e497 100644 --- a/packages/utils/src/lib/profiler/profiler-node.int.test.ts +++ b/packages/utils/src/lib/profiler/profiler-node.int.test.ts @@ -1,317 +1,458 @@ import fs from 'node:fs'; +import fsPromises from 'node:fs/promises'; import path from 'node:path'; +import process from 'node:process'; +import { fileURLToPath } from 'node:url'; +import { afterAll, afterEach, beforeEach, expect } from 'vitest'; +import { awaitObserverCallbackAndFlush } from '@code-pushup/test-utils'; import { - awaitObserverCallbackAndFlush, - omitTraceJson, -} from '@code-pushup/test-utils'; + loadAndOmitTraceJsonl, + loadNormalizedTraceJson, +} from '../../../mocks/omit-trace-json.js'; +import { executeProcess } from '../execute-process.js'; import type { PerformanceEntryEncoder } from '../performance-observer.js'; -import { WAL_ID_PATTERNS } from '../wal.js'; -import { NodejsProfiler } from './profiler-node.js'; +import { + asOptions, + trackEntryPayload, +} from '../user-timing-extensibility-api-utils.js'; +import type { + ActionTrackEntryPayload, + TrackEntryPayload, +} from '../user-timing-extensibility-api.type.js'; +import { + PROFILER_DEBUG_ENV_VAR, + PROFILER_ENABLED_ENV_VAR, + PROFILER_MEASURE_NAME_ENV_VAR, + PROFILER_OUT_BASENAME, + PROFILER_OUT_DIR_ENV_VAR, + PROFILER_SHARDER_ID_ENV_VAR, +} from './constants.js'; +import { NodejsProfiler, type NodejsProfilerOptions } from './profiler-node.js'; import { entryToTraceEvents } from './trace-file-utils.js'; -import type { UserTimingTraceEvent } from './trace-file.type.js'; +import type { TraceEvent } from './trace-file.type.js'; +import {
traceEventWalFormat } from './wal-json-trace.js'; describe('NodeJS Profiler Integration', () => { - const traceEventEncoder: PerformanceEntryEncoder = + const traceEventEncoder: PerformanceEntryEncoder = entryToTraceEvents; + const testSuitDir = path.join(process.cwd(), 'tmp', 'int', 'utils'); + const activeProfilers: NodejsProfiler[] = []; + + const workerScriptPath = path.resolve( + fileURLToPath(path.dirname(import.meta.url)), + '../../../mocks/multiprocess-profiling/profiler-worker.mjs', + ); + + function nodejsProfiler( + optionsOrMeasureName: + | string + | (Partial< + NodejsProfilerOptions< + TraceEvent, + Record + > + > & { measureName: string }), + ): NodejsProfiler { + const options = + typeof optionsOrMeasureName === 'string' + ? { measureName: optionsOrMeasureName } + : optionsOrMeasureName; + const profiler = new NodejsProfiler({ + ...options, + track: options.track ?? 'int-test-track', + format: { + ...traceEventWalFormat(), + encodePerfEntry: traceEventEncoder, + baseName: options.format?.baseName ?? PROFILER_OUT_BASENAME, + }, + outDir: testSuitDir, + enabled: options.enabled ?? true, + debug: options.debug ?? false, + measureName: options.measureName, + }); + // eslint-disable-next-line functional/immutable-data + activeProfilers.push(profiler); + return profiler; + } + + async function create3rdPartyMeasures(prefix: string) { + const defaultPayload: TrackEntryPayload = { + track: 'Buffered Track', + trackGroup: 'Buffered Track', + color: 'tertiary', + }; + + await new Promise(resolve => setTimeout(resolve, 10)); + + expect(() => + performance.mark( + `${prefix}${prefix ? ':' : ''}measure:start`, + asOptions(trackEntryPayload(defaultPayload)), + ), + ).not.toThrowError(); + + const largeArray = Array.from({ length: 100_000 }, (_, i) => i); + const result = largeArray + .map(x => x * x) + .filter(x => x % 2 === 0) + .reduce((sum, x) => sum + x, 0); + expect(result).toBeGreaterThan(0); + expect('sync success').toBe('sync success'); + expect(() => + performance.mark( + `${prefix}${prefix ? ':' : ''}measure:end`, + asOptions(trackEntryPayload(defaultPayload)), + ), + ).not.toThrowError(); + + performance.measure(`${prefix}${prefix ? ':' : ''}measure`, { + start: `${prefix}${prefix ? ':' : ''}measure:start`, + end: `${prefix}${prefix ? 
':' : ''}measure:end`, + ...asOptions( + trackEntryPayload({ + ...defaultPayload, + tooltipText: 'Buffered sync measurement returned :"sync success"', + }), + ), + }); - let nodejsProfiler: NodejsProfiler; + await new Promise(resolve => setTimeout(resolve, 10)); - beforeEach(() => { - performance.clearMarks(); - performance.clearMeasures(); - vi.stubEnv('CP_PROFILING', undefined!); - vi.stubEnv('CP_PROFILER_DEBUG', undefined!); - - // Clean up trace files from previous test runs - const traceFilesDir = path.join(process.cwd(), 'tmp', 'int', 'utils'); - // eslint-disable-next-line n/no-sync - if (fs.existsSync(traceFilesDir)) { - // eslint-disable-next-line n/no-sync - const files = fs.readdirSync(traceFilesDir); - // eslint-disable-next-line functional/no-loop-statements - for (const file of files) { - if (file.endsWith('.json') || file.endsWith('.jsonl')) { - // eslint-disable-next-line n/no-sync - fs.unlinkSync(path.join(traceFilesDir, file)); - } - } - } - - nodejsProfiler = new NodejsProfiler({ - prefix: 'test', - track: 'test-track', - encodePerfEntry: traceEventEncoder, - filename: path.join(process.cwd(), 'tmp', 'int', 'utils', 'trace.json'), - enabled: true, + expect(() => + performance.mark( + `${prefix}:async-measure:start`, + asOptions(trackEntryPayload(defaultPayload)), + ), + ).not.toThrowError(); + // Heavy work: More CPU-intensive operations + const matrix = Array.from({ length: 1000 }, () => + Array.from({ length: 1000 }, (_, i) => i), + ); + const flattened = matrix.flat(); + const sum = flattened.reduce((acc, val) => acc + val, 0); + expect(sum).toBeGreaterThan(0); + await expect(Promise.resolve('async success')).resolves.toBe( + 'async success', + ); + expect(() => + performance.mark( + `${prefix}:async-measure:end`, + asOptions(trackEntryPayload(defaultPayload)), + ), + ).not.toThrowError(); + + performance.measure(`${prefix}:async-measure`, { + start: `${prefix}:async-measure:start`, + end: `${prefix}:async-measure:end`, + ...asOptions( + trackEntryPayload({ + ...defaultPayload, + tooltipText: 'sync measurement returned :"async success"', + }), + ), }); - }); + } - afterEach(() => { - if (nodejsProfiler && nodejsProfiler.state !== 'closed') { - nodejsProfiler.close(); - } - vi.stubEnv('CP_PROFILING', undefined!); - vi.stubEnv('CP_PROFILER_DEBUG', undefined!); - }); + async function createBasicMeasures(profiler: NodejsProfiler) { + expect(() => + profiler.marker(`Enable profiler`, { + tooltipText: 'set enable to true', + }), + ).not.toThrowError(); - it('should initialize with sink opened when enabled', () => { - expect(nodejsProfiler.isEnabled()).toBeTrue(); - expect(nodejsProfiler.stats.walOpen).toBeTrue(); - }); + await new Promise(resolve => setTimeout(resolve, 50)); - it('should create performance entries and write to sink', () => { - expect(nodejsProfiler.measure('test-operation', () => 'success')).toBe( - 'success', - ); - }); + expect(profiler.measure(`sync-measure`, () => 'success')).toBe('success'); + + await new Promise(resolve => setTimeout(resolve, 50)); - it('should handle async operations', async () => { await expect( - nodejsProfiler.measureAsync('async-test', async () => { - await new Promise(resolve => setTimeout(resolve, 1)); - return 'async-result'; + profiler.measureAsync(`async-measure`, () => + Promise.resolve('async success'), + ), + ).resolves.toBe('async success'); + + await new Promise(resolve => setTimeout(resolve, 50)); + + expect(() => + profiler.marker(`Disable profiler`, { + tooltipText: 'set enable to false', }), - 
).resolves.toBe('async-result'); - }); + ).not.toThrowError(); + } - it('should disable profiling and close sink', () => { - nodejsProfiler.setEnabled(false); - expect(nodejsProfiler.isEnabled()).toBeFalse(); - expect(nodejsProfiler.stats.walOpen).toBeFalse(); + beforeEach(async () => { + if (fs.existsSync(testSuitDir)) { + fs.rmSync(testSuitDir, { recursive: true, force: true }); + } + fs.mkdirSync(testSuitDir, { recursive: true }); - expect(nodejsProfiler.measure('disabled-test', () => 'success')).toBe( - 'success', - ); + performance.clearMarks(); + performance.clearMeasures(); + vi.stubEnv(PROFILER_ENABLED_ENV_VAR, undefined!); + vi.stubEnv(PROFILER_DEBUG_ENV_VAR, undefined!); + // eslint-disable-next-line functional/immutable-data, @typescript-eslint/no-dynamic-delete + delete process.env[PROFILER_SHARDER_ID_ENV_VAR]; }); - it('should re-enable profiling correctly', () => { - nodejsProfiler.setEnabled(false); - expect(nodejsProfiler.stats.walOpen).toBeFalse(); - - nodejsProfiler.setEnabled(true); + afterEach(() => { + // eslint-disable-next-line functional/no-loop-statements + for (const profiler of activeProfilers) { + if (profiler.stats.profilerState !== 'closed') { + profiler.close(); + } + } + // eslint-disable-next-line functional/immutable-data + activeProfilers.length = 0; - expect(nodejsProfiler.isEnabled()).toBeTrue(); - expect(nodejsProfiler.stats.walOpen).toBeTrue(); + vi.stubEnv(PROFILER_ENABLED_ENV_VAR, undefined!); + vi.stubEnv(PROFILER_DEBUG_ENV_VAR, undefined!); + // eslint-disable-next-line functional/immutable-data, @typescript-eslint/no-dynamic-delete + delete process.env[PROFILER_SHARDER_ID_ENV_VAR]; + }); - expect(nodejsProfiler.measure('re-enabled-test', () => 42)).toBe(42); + afterAll(async () => { + // Final cleanup of test directory + if (fs.existsSync(testSuitDir)) { + // await fsPromises.rm(testSuitDir, { recursive: true, force: true }); + } }); - it('should support custom tracks', async () => { - const traceTracksFile = path.join( - process.cwd(), - 'tmp', - 'int', - 'utils', - 'trace-tracks.json', + it('should initialize with shard opened when enabled', () => { + const profiler = nodejsProfiler('initialize-shard-opened'); + expect(profiler.isEnabled()).toBe(true); + expect(profiler.stats).toEqual( + expect.objectContaining({ + profilerState: 'running', + shardOpen: true, + isSubscribed: true, + }), ); - const profilerWithTracks = new NodejsProfiler({ - prefix: 'api-server', - track: 'HTTP', - tracks: { - db: { track: 'Database', color: 'secondary' }, - cache: { track: 'Cache', color: 'primary' }, - }, - encodePerfEntry: traceEventEncoder, - filename: traceTracksFile, - enabled: true, - }); + }); - expect(profilerWithTracks.filePath).toBe(traceTracksFile); + it('should create mark and measure performance entries and write to .jsonl and .json', async () => { + const measureName = 'entries-write-to-shard'; + const prefix = 'write-j-jl'; + const profiler = nodejsProfiler({ + prefix, + measureName, + }); - expect( - profilerWithTracks.measure('user-lookup', () => 'user123', { - track: 'cache', - }), - ).toBe('user123'); + await createBasicMeasures(profiler); - await awaitObserverCallbackAndFlush(profilerWithTracks); - profilerWithTracks.close(); + await awaitObserverCallbackAndFlush(profiler); + await expect( + loadAndOmitTraceJsonl(profiler.stats.shardPath as `${string}.jsonl`), + ).resolves.toMatchFileSnapshot(`__snapshots__/${measureName}.jsonl`); + profiler.close(); - // eslint-disable-next-line n/no-sync - const content = fs.readFileSync(traceTracksFile, 
'utf8'); - const normalizedContent = omitTraceJson(content); - await expect(normalizedContent).toMatchFileSnapshot( - '__snapshots__/custom-tracks-trace-events.jsonl', + const snapshotData = await loadNormalizedTraceJson( + profiler.stats.finalFilePath as `${string}.json`, + ); + expect(JSON.stringify(snapshotData)).toMatchFileSnapshot( + `__snapshots__/${measureName}.json`, ); }); - it('should capture buffered entries when buffered option is enabled', () => { - const bufferedProfiler = new NodejsProfiler({ - prefix: 'buffered-test', - track: 'Test', - encodePerfEntry: traceEventEncoder, + it('should capture buffered entries when buffered option is enabled', async () => { + const measureName = 'buffered-test'; + const prefix = 'write-buffered-j-jl'; + await create3rdPartyMeasures(prefix); + + const profiler = nodejsProfiler({ + prefix, + measureName, captureBufferedEntries: true, - filename: path.join( - process.cwd(), - 'tmp', - 'int', - 'utils', - 'trace-buffered.json', - ), - enabled: true, }); + await awaitObserverCallbackAndFlush(profiler); + profiler.close(); - const bufferedStats = bufferedProfiler.stats; - expect(bufferedStats.state).toBe('running'); - expect(bufferedStats.walOpen).toBeTrue(); - expect(bufferedStats.isSubscribed).toBeTrue(); - expect(bufferedStats.queued).toBe(0); - expect(bufferedStats.dropped).toBe(0); - expect(bufferedStats.written).toBe(0); + const snapshotData = await loadNormalizedTraceJson( + profiler.stats.finalFilePath as `${string}.json`, + ); - bufferedProfiler.close(); + expect(JSON.stringify(snapshotData)).toMatchFileSnapshot( + `__snapshots__/${measureName}.json`, + ); }); it('should return correct getStats with dropped and written counts', () => { - const statsProfiler = new NodejsProfiler({ - prefix: 'stats-test', - track: 'Stats', - encodePerfEntry: traceEventEncoder, - maxQueueSize: 2, - flushThreshold: 2, - filename: path.join( - process.cwd(), - 'tmp', - 'int', - 'utils', - 'trace-stats.json', - ), - enabled: true, - }); + const prefix = 'stats-test'; + const statsProfiler = nodejsProfiler(prefix); expect(statsProfiler.measure('test-op', () => 'result')).toBe('result'); const stats = statsProfiler.stats; - expect(stats.state).toBe('running'); - expect(stats.walOpen).toBeTrue(); - expect(stats.isSubscribed).toBeTrue(); - expect(typeof stats.queued).toBe('number'); - expect(typeof stats.dropped).toBe('number'); - expect(typeof stats.written).toBe('number'); + expect(stats).toEqual( + expect.objectContaining({ + profilerState: 'running', + shardOpen: true, + isSubscribed: true, + groupId: prefix, + maxQueueSize: 10_000, + flushThreshold: 20, + buffered: true, + isCoordinator: true, + }), + ); statsProfiler.close(); }); it('should provide comprehensive queue statistics via getStats', async () => { - const traceStatsFile = path.join( - process.cwd(), - 'tmp', - 'int', - 'utils', - 'trace-stats-comprehensive.json', - ); - const profiler = new NodejsProfiler({ - prefix: 'stats-profiler', + const prefix = 'stats-comprehensive'; + const profiler = nodejsProfiler({ + measureName: prefix, track: 'Stats', - encodePerfEntry: traceEventEncoder, - maxQueueSize: 3, flushThreshold: 2, - filename: traceStatsFile, - enabled: true, + maxQueueSize: 3, }); const initialStats = profiler.stats; - expect(initialStats.state).toBe('running'); - expect(initialStats.walOpen).toBeTrue(); - expect(initialStats.isSubscribed).toBeTrue(); - expect(initialStats.queued).toBe(0); - expect(initialStats.dropped).toBe(0); - expect(initialStats.written).toBe(0); + 
expect(initialStats).toEqual( + expect.objectContaining({ + profilerState: 'running', + shardOpen: true, + isSubscribed: true, + groupId: prefix, + queued: 0, + dropped: 0, + written: 0, + maxQueueSize: 3, + flushThreshold: 2, + buffered: true, + isCoordinator: true, + }), + ); profiler.measure('operation-1', () => 'result1'); profiler.measure('operation-2', () => 'result2'); await awaitObserverCallbackAndFlush(profiler); - // Each measure creates 4 events (start marker, begin span, end span, end marker) - // 2 measures × 4 events = 8 events written expect(profiler.stats.written).toBe(8); profiler.setEnabled(false); const finalStats = profiler.stats; - expect(finalStats.state).toBe('idle'); - expect(finalStats.walOpen).toBeFalse(); - expect(finalStats.isSubscribed).toBeFalse(); - expect(finalStats.queued).toBe(0); - - profiler.flush(); - profiler.close(); - - // eslint-disable-next-line n/no-sync - const content = fs.readFileSync(traceStatsFile, 'utf8'); - const normalizedContent = omitTraceJson(content); - await expect(normalizedContent).toMatchFileSnapshot( - '__snapshots__/comprehensive-stats-trace-events.jsonl', + expect(finalStats).toEqual( + expect.objectContaining({ + profilerState: 'idle', + shardOpen: false, + isSubscribed: false, + groupId: prefix, + queued: 0, + written: 8, + maxQueueSize: 3, + flushThreshold: 2, + buffered: true, + isCoordinator: true, + }), ); }); - describe('sharded path structure', () => { - it('should create sharded path structure when filename is not provided', () => { - const profiler = new NodejsProfiler({ - prefix: 'sharded-test', - track: 'Test', - encodePerfEntry: traceEventEncoder, - enabled: true, - }); + it('should create sharded path structure when filename is not provided', async () => { + const prefix = 'sharded-test'; + const measureName = prefix; + const profiler = nodejsProfiler(measureName); - const filePath = profiler.filePath; - expect(filePath).toContainPath('tmp/profiles'); - expect(filePath).toMatch(/\.jsonl$/); + const { finalFilePath, shardPath } = profiler.stats; + expect(finalFilePath).toContainPath('tmp/int/utils'); + expect(finalFilePath).toMatch(/\.json$/); - const pathParts = filePath.split(path.sep); - const groupIdDir = pathParts.at(-2); - const fileName = pathParts.at(-1); + const pathParts = finalFilePath.split(path.sep); + const groupIdDir = pathParts.at(-2); + const fileName = pathParts.at(-1); - expect(groupIdDir).toMatch(WAL_ID_PATTERNS.GROUP_ID); - expect(fileName).toMatch(/^trace\.\d{8}-\d{6}-\d{3}(?:\.\d+){3}\.jsonl$/); + expect(groupIdDir).toStrictEqual(measureName); + expect(fileName).toMatch( + new RegExp(`^${PROFILER_OUT_BASENAME}\\.${measureName}\\.json$`), + ); - const groupIdDirPath = path.dirname(filePath); - // eslint-disable-next-line n/no-sync - expect(fs.existsSync(groupIdDirPath)).toBeTrue(); + expect(shardPath).toContain(measureName); + expect(shardPath).toMatch(/\.jsonl$/); - profiler.close(); - }); + const groupIdDirPath = path.dirname(finalFilePath); + await expect(fsPromises.access(groupIdDirPath)).resolves.not.toThrowError(); - it('should create correct folder structure for sharded paths', () => { - const profiler = new NodejsProfiler({ - prefix: 'folder-test', - track: 'Test', - encodePerfEntry: traceEventEncoder, - enabled: true, - }); + profiler.close(); + }); - const filePath = profiler.filePath; - const dirPath = path.dirname(filePath); - const groupId = path.basename(dirPath); + it('should handle sharding across multiple processes', async () => { + const numProcesses = 3; + const startTime = 
performance.now(); + + const { [PROFILER_MEASURE_NAME_ENV_VAR]: _measureName, ...cleanEnv } = + process.env; + + const processStartTime = performance.now(); + const { stdout, stderr } = await executeProcess({ + command: 'npx', + args: [ + 'tsx', + '--tsconfig', + 'tsconfig.base.json', + path.relative(process.cwd(), workerScriptPath), + String(numProcesses), + ], + cwd: process.cwd(), + env: { + ...cleanEnv, + [PROFILER_ENABLED_ENV_VAR]: 'true', + [PROFILER_DEBUG_ENV_VAR]: 'true', + [PROFILER_OUT_DIR_ENV_VAR]: testSuitDir, + }, + }); + const processDuration = performance.now() - processStartTime; - expect(groupId).toMatch(WAL_ID_PATTERNS.GROUP_ID); - // eslint-disable-next-line n/no-sync - expect(fs.existsSync(dirPath)).toBeTrue(); - // eslint-disable-next-line n/no-sync - expect(fs.statSync(dirPath).isDirectory()).toBeTrue(); + if (!stdout.trim()) { + const stderrMessage = stderr ? ` stderr: ${stderr}` : ''; + throw new Error( + `Worker process produced no stdout output.${stderrMessage}`, + ); + } - profiler.close(); - }); + let coordinatorStats; + try { + coordinatorStats = JSON.parse(stdout.trim()); + } catch (error) { + throw new Error( + `Failed to parse worker output as JSON. stdout: "${stdout}", stderr: "${stderr}"`, + { cause: error }, + ); + } - it('should write trace events to sharded path file', async () => { - const profiler = new NodejsProfiler({ - prefix: 'write-test', - track: 'Test', - encodePerfEntry: traceEventEncoder, - enabled: true, - }); + const validationStartTime = performance.now(); + expect(coordinatorStats).toMatchObject({ + isCoordinator: true, + shardFileCount: numProcesses + 1, // numProcesses child processes + 1 coordinator shard + groupId: expect.stringMatching(/^\d{8}-\d{6}-\d{3}$/), // Auto-generated groupId format + }); - profiler.measure('test-operation', () => 'result'); + // Verify all processes share the same groupId + const groupId = coordinatorStats.groupId; + expect(coordinatorStats.finalFilePath).toContainPath(groupId); - await awaitObserverCallbackAndFlush(profiler); - profiler.close(); + const snapshotData = await loadNormalizedTraceJson( + coordinatorStats.finalFilePath as `${string}.json`, + ); - const filePath = profiler.filePath; - // eslint-disable-next-line n/no-sync - const content = fs.readFileSync(filePath, 'utf8'); - const normalizedContent = omitTraceJson(content); - await expect(normalizedContent).toMatchFileSnapshot( - '__snapshots__/sharded-path-trace-events.jsonl', - ); + const processIds = new Set(); + snapshotData.traceEvents?.forEach((e: TraceEvent) => { + if (e.name?.includes('process-')) { + const match = e.name.match(/process-(\d+)/); + if (match && match[1]) { + processIds.add(match[1]); + } + } }); - }); + + expect(processIds.size).toStrictEqual(numProcesses); + const validationDuration = performance.now() - validationStartTime; + const totalDuration = performance.now() - startTime; + + // Log timing information for debugging + // eslint-disable-next-line no-console + console.log( + `[Timing] Process execution: ${processDuration.toFixed(2)}ms, Validation: ${validationDuration.toFixed(2)}ms, Total: ${totalDuration.toFixed(2)}ms`, + ); + }, 10_000); // Timeout: 10 seconds for multi-process coordination }); diff --git a/packages/utils/src/lib/profiler/profiler-node.ts b/packages/utils/src/lib/profiler/profiler-node.ts index b668b2028..86aad570a 100644 --- a/packages/utils/src/lib/profiler/profiler-node.ts +++ b/packages/utils/src/lib/profiler/profiler-node.ts @@ -1,29 +1,62 @@ -import path from 'node:path'; +import { 
performance } from 'node:perf_hooks'; import { isEnvVarEnabled } from '../env.js'; -import { subscribeProcessExit } from '../exit-process.js'; +import { type FatalKind, subscribeProcessExit } from '../exit-process.js'; import { type PerformanceObserverOptions, PerformanceObserverSink, } from '../performance-observer.js'; import { objectToEntries } from '../transform.js'; -import { errorToMarkerPayload } from '../user-timing-extensibility-api-utils.js'; +import { + asOptions, + errorToMarkerPayload, + markerPayload, +} from '../user-timing-extensibility-api-utils.js'; import type { ActionTrackEntryPayload, MarkerPayload, } from '../user-timing-extensibility-api.type.js'; +import { ShardedWal } from '../wal-sharded.js'; import { - type AppendableSink, + type WalFormat, + type WalRecord, WriteAheadLogFile, - getShardId, - getShardedGroupId, - getShardedPath, + parseWalFormat, } from '../wal.js'; import { - PROFILER_DEBUG_ENV_VAR, + PROFILER_DEBUG_MEASURE_PREFIX, PROFILER_ENABLED_ENV_VAR, + PROFILER_MEASURE_NAME_ENV_VAR, + PROFILER_OUT_DIR_ENV_VAR, + PROFILER_PERSIST_OUT_DIR, + PROFILER_SHARDER_ID_ENV_VAR, } from './constants.js'; import { Profiler, type ProfilerOptions } from './profiler.js'; -import { traceEventWalFormat } from './wal-json-trace.js'; + +export type ProfilerBufferOptions = Omit< + PerformanceObserverOptions, + 'sink' | 'encodePerfEntry' +>; +export type ProfilerFormat = Partial< + WalFormat +> & + Pick, 'encodePerfEntry'>; +export type PersistOptions = { + /** + * Output directory for WAL shards and final files. + * @default 'tmp/profiles' + */ + outDir?: string; + + /** + * Optional name for your measurement that is reflected in path name. If not provided, a new group ID will be generated. + */ + measureName?: string; + /** + * WAL format configuration for sharded write-ahead logging. + * Defines codec, extensions, and finalizer for the WAL files. + */ + format: ProfilerFormat; +}; /** * Options for configuring a NodejsProfiler instance. @@ -33,25 +66,13 @@ import { traceEventWalFormat } from './wal-json-trace.js'; * @template Tracks - Record type defining available track names and their configurations */ export type NodejsProfilerOptions< - DomainEvents extends string | object, - Tracks extends Record, + DomainEvents extends WalRecord, + Tracks extends Record>, > = ProfilerOptions & - Omit, 'sink'> & { - /** - * File path for the WriteAheadLogFile sink. - * If not provided, defaults to `trace.json` in the current working directory. - * - * @default path.join(process.cwd(), 'trace.json') - */ - filename?: string; - /** - * Name of the environment variable to check for debug mode. - * When the env var is set to 'true', profiler state transitions create performance marks for debugging. - * - * @default 'CP_PROFILER_DEBUG' - */ - debugEnvVar?: string; - }; + ProfilerBufferOptions & + PersistOptions; + +export type NodeJsProfilerState = 'idle' | 'running' | 'closed'; /** * Performance profiler with automatic process exit handling for buffered performance data. 
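For orientation, the option types above compose like this in practice. A minimal, hedged sketch modeled on the test helpers later in this diff (import paths, `prefix`, and option values are illustrative, not prescribed by the API):

```ts
import { NodejsProfiler } from './profiler-node.js';
import { entryToTraceEvents } from './trace-file-utils.js';
import { traceEventWalFormat } from './wal-json-trace.js';

const profiler = new NodejsProfiler({
  prefix: 'cp',
  track: 'CLI',
  enabled: true,
  // PersistOptions: where shards are written and how the group is named
  outDir: 'tmp/profiles',
  measureName: 'my-measure',
  // ProfilerFormat: WAL codec plus the performance-entry encoder
  format: {
    ...traceEventWalFormat(),
    encodePerfEntry: entryToTraceEvents,
  },
});

profiler.measure('startup', () => 'done');
profiler.close(); // flushes queued entries and finalizes shards
```
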
@@ -70,16 +91,16 @@ export type NodejsProfilerOptions< * @template Tracks - Record type defining available track names and their configurations */ export class NodejsProfiler< - DomainEvents extends string | object, + DomainEvents extends WalRecord, Tracks extends Record = Record< string, ActionTrackEntryPayload >, > extends Profiler { - #sink: AppendableSink; + #shard: WriteAheadLogFile; + #sharder: ShardedWal; #performanceObserverSink: PerformanceObserverSink; #state: 'idle' | 'running' | 'closed' = 'idle'; - #debug: boolean; #unsubscribeExitHandlers: (() => void) | undefined; /** @@ -87,45 +108,46 @@ export class NodejsProfiler< * A WriteAheadLogFile sink is automatically created for buffering performance data. * @param options - Configuration options */ - // eslint-disable-next-line max-lines-per-function + constructor(options: NodejsProfilerOptions) { +// Pick ProfilerBufferOptions const { - encodePerfEntry, captureBufferedEntries, flushThreshold, maxQueueSize, + ...allButBufferOptions + } = options; + // Pick ProfilerPersistOptions + const { + format: profilerFormat, + measureName, + outDir = PROFILER_PERSIST_OUT_DIR, enabled, - filename, - debugEnvVar = PROFILER_DEBUG_ENV_VAR, + debug, ...profilerOptions - } = options; - const initialEnabled = enabled ?? isEnvVarEnabled(PROFILER_ENABLED_ENV_VAR); - super({ ...profilerOptions, enabled: initialEnabled }); + } = allButBufferOptions; - const walFormat = traceEventWalFormat(); - this.#sink = new WriteAheadLogFile({ - file: - filename ?? - path.join( - process.cwd(), - getShardedPath({ - dir: 'tmp/profiles', - groupId: getShardedGroupId(), - shardId: getShardId(), - format: walFormat, - }), - ), - codec: walFormat.codec, - }) as AppendableSink; - this.#debug = isEnvVarEnabled(debugEnvVar); + super({ ...profilerOptions, enabled, debug }); + const { encodePerfEntry, ...format } = profilerFormat; + + this.#sharder = new ShardedWal({ + debug, + dir: process.env[PROFILER_OUT_DIR_ENV_VAR] ?? outDir, + format: parseWalFormat(format), + coordinatorIdEnvVar: PROFILER_SHARDER_ID_ENV_VAR, + measureNameEnvVar: PROFILER_MEASURE_NAME_ENV_VAR, + groupId: measureName, + }); + + this.#shard = this.#sharder.shard(); this.#performanceObserverSink = new PerformanceObserverSink({ - sink: this.#sink, + sink: this.#shard, encodePerfEntry, captureBufferedEntries, flushThreshold, maxQueueSize, - debugEnvVar, + debug: this.isDebugMode(), }); this.#unsubscribeExitHandlers = subscribeProcessExit({ @@ -140,41 +162,15 @@ export class NodejsProfiler< }, }); + const initialEnabled = + options.enabled ?? isEnvVarEnabled(PROFILER_ENABLED_ENV_VAR); if (initialEnabled) { - this.#transition('running'); + this.transition('running'); } } - /** - * Returns whether debug mode is enabled for profiler state transitions. - * - * Debug mode is initially determined by the environment variable specified by `debugEnvVar` - * (defaults to 'CP_PROFILER_DEBUG') during construction, but can be changed at runtime - * using {@link setDebugMode}. When enabled, profiler state transitions create - * performance marks for debugging. - * - * @returns true if debug mode is enabled, false otherwise - */ - get debug(): boolean { - return this.#debug; - } - - /** - * Sets debug mode for profiler state transitions. - * - * When debug mode is enabled, profiler state transitions create performance marks - * for debugging. This allows runtime control of debug mode without needing to - * restart the application or change environment variables. 
- *
-   * @param enabled - Whether to enable debug mode
-   */
-  setDebugMode(enabled: boolean): void {
-    this.#debug = enabled;
-  }
-
   /**
    * Creates a performance marker for a profiler state transition.
-   * @param transition - The state transition that occurred
    */
   #transitionMarker(transition: string): void {
     const transitionMarkerPayload: MarkerPayload = {
@@ -183,7 +179,10 @@ export class NodejsProfiler<
       tooltipText: `Profiler state transition: ${transition}`,
       properties: [['Transition', transition], ...objectToEntries(this.stats)],
     };
-    this.marker(transition, transitionMarkerPayload);
+    performance.mark(
+      transition,
+      asOptions(markerPayload(transitionMarkerPayload)),
+    );
   }
 
   /**
@@ -195,13 +194,21 @@ export class NodejsProfiler<
     error: unknown,
     kind: 'uncaughtException' | 'unhandledRejection',
   ): void {
+    this.#fatalErrorMarker(error, kind);
+    this.close(); // Ensures buffers flush and sink finalizes
+  }
+
+  /**
+   * Marks a fatal error with an error marker payload so it is captured in the trace.
+   * @param error - The error that occurred
+   * @param kind - The kind of fatal error (uncaughtException or unhandledRejection)
+   */
+  #fatalErrorMarker(error: unknown, kind: FatalKind): void {
     this.marker(
       'Fatal Error',
       errorToMarkerPayload(error, {
         tooltipText: `${kind} caused fatal error`,
       }),
     );
-    this.close(); // Ensures buffers flush and sink finalizes
   }
 
   /**
@@ -210,13 +217,13 @@ export class NodejsProfiler<
    * State transitions enforce lifecycle invariants:
    * - `idle -> running`: Enables profiling, opens sink, and subscribes to performance observer
    * - `running -> idle`: Disables profiling, unsubscribes, and closes sink (sink will be reopened on re-enable)
-   * - `running -> closed`: Disables profiling, unsubscribes, and closes sink (irreversible)
-   * - `idle -> closed`: Closes sink if it was opened (irreversible)
+   * - `running -> closed`: Disables profiling, unsubscribes, closes sink, and finalizes shards (irreversible)
+   * - `idle -> closed`: Closes sink if it was opened and finalizes shards (irreversible)
    *
    * @param next - The target state to transition to
    * @throws {Error} If attempting to transition from 'closed' state or invalid transition
    */
-  #transition(next: 'idle' | 'running' | 'closed'): void {
+  protected transition(next: NodeJsProfilerState): void {
     if (this.#state === next) {
       return;
     }
@@ -225,24 +232,30 @@ export class NodejsProfiler<
     }
     const transition = `${this.#state}->${next}`;
 
+    if (this.isDebugMode()) {
+      this.#transitionMarker(`${PROFILER_DEBUG_MEASURE_PREFIX}:${transition}`);
+    }
     switch (transition) {
       case 'idle->running':
         super.setEnabled(true);
-        this.#sink.open?.();
-        this.#performanceObserverSink.subscribe();
+        this.#shard?.open();
+        this.#performanceObserverSink?.subscribe();
         break;
 
       case 'running->idle':
-      case 'running->closed':
         super.setEnabled(false);
-        this.#performanceObserverSink.unsubscribe();
-        this.#sink.close?.();
+        this.#performanceObserverSink?.unsubscribe();
+        this.#shard?.close();
         break;
 
+      case 'running->closed':
       case 'idle->closed':
-        // Sink may have been opened before, close it
-        this.#sink.close?.();
+        super.setEnabled(false);
+        this.#performanceObserverSink?.unsubscribe();
+        this.#shard?.close();
+        this.#sharder?.finalizeIfCoordinator();
+        this.#unsubscribeExitHandlers?.();
        break;
 
       default:
@@ -250,10 +263,6 @@ export class NodejsProfiler<
     }
 
     this.#state = next;
-
-    if (this.#debug) {
-      this.#transitionMarker(transition);
-    }
   }
 
   /**
@@ -264,13 +273,7 @@ export class NodejsProfiler<
     if (this.#state === 'closed') {
       return;
     }
-    
this.#unsubscribeExitHandlers?.();
-    this.#transition('closed');
-  }
-
-  /** @returns Current profiler state */
-  get state(): 'idle' | 'running' | 'closed' {
-    return this.#state;
+    this.transition('closed');
   }
 
   /** @returns Whether profiler is in 'running' state */
@@ -281,19 +284,34 @@ export class NodejsProfiler<
   /** Enables profiling (start/stop) */
   override setEnabled(enabled: boolean): void {
     if (enabled) {
-      this.#transition('running');
+      this.transition('running');
     } else {
-      this.#transition('idle');
+      this.transition('idle');
     }
   }
 
+  /** @returns Current profiler state */
+  get state(): 'idle' | 'running' | 'closed' {
+    return this.#state;
+  }
+
   /** @returns Queue statistics and profiling state for monitoring */
   get stats() {
+    const {
+      state: sharderState,
+      isCoordinator,
+      ...sharderStats
+    } = this.#sharder?.stats ?? {};
+
     return {
-      ...this.#performanceObserverSink.getStats(),
-      debug: this.#debug,
-      state: this.#state,
-      walOpen: !this.#sink.isClosed(),
+      profilerState: this.#state,
+      debug: this.isDebugMode(),
+      sharderState,
+      ...sharderStats,
+      isCoordinator,
+      shardOpen: !this.#shard?.isClosed(),
+      shardPath: this.#shard?.getPath(),
+      ...this.#performanceObserverSink?.getStats(),
     };
   }
 
@@ -302,11 +320,6 @@ export class NodejsProfiler<
     if (this.#state === 'closed') {
       return; // No-op if closed
     }
-    this.#performanceObserverSink.flush();
-  }
-
-  /** @returns The file path of the WriteAheadLogFile sink */
-  get filePath(): string {
-    return (this.#sink as WriteAheadLogFile).getPath();
+    this.#performanceObserverSink?.flush();
   }
 }
diff --git a/packages/utils/src/lib/profiler/profiler-node.unit.test.ts b/packages/utils/src/lib/profiler/profiler-node.unit.test.ts
index 5357adc37..773ca702d 100644
--- a/packages/utils/src/lib/profiler/profiler-node.unit.test.ts
+++ b/packages/utils/src/lib/profiler/profiler-node.unit.test.ts
@@ -1,97 +1,206 @@
-import path from 'node:path';
 import { performance } from 'node:perf_hooks';
 import { beforeEach, describe, expect, it, vi } from 'vitest';
+import {
+  awaitObserverCallbackAndFlush,
+  osAgnosticPath,
+} from '@code-pushup/test-utils';
+import {
+  loadAndOmitTraceJson,
+  loadAndOmitTraceJsonl,
+} from '../../../mocks/omit-trace-json.js';
 import { MockTraceEventFileSink } from '../../../mocks/sink.mock';
+import { isEnvVarEnabled } from '../env.js';
 import { subscribeProcessExit } from '../exit-process.js';
-import * as PerfObserverModule from '../performance-observer.js';
 import type { PerformanceEntryEncoder } from '../performance-observer.js';
+import { ID_PATTERNS } from '../process-id.js';
 import type {
   ActionTrackEntryPayload,
   UserTimingDetail,
 } from '../user-timing-extensibility-api.type.js';
 import * as WalModule from '../wal.js';
+import {
+  PROFILER_DEBUG_ENV_VAR,
+  PROFILER_OUT_BASENAME,
+  PROFILER_PERSIST_OUT_DIR,
+  PROFILER_SHARDER_ID_ENV_VAR,
+} from './constants.js';
 import { NodejsProfiler, type NodejsProfilerOptions } from './profiler-node.js';
-import { Profiler } from './profiler.js';
+import { Profiler, getProfilerId } from './profiler.js';
+import { entryToTraceEvents } from './trace-file-utils.js';
+import type { TraceEvent } from './trace-file.type.js';
+import { traceEventWalFormat } from './wal-json-trace.js';
 
 vi.mock('../exit-process.js');
 
-const simpleEncoder: PerformanceEntryEncoder<string> = entry => {
+const simpleEncoder: PerformanceEntryEncoder<{ message: string }> = entry => {
   if (entry.entryType === 'measure') {
-    return [`${entry.name}:${entry.duration.toFixed(2)}ms`];
+    return [{ message: 
`${entry.name}:${entry.duration.toFixed(2)}ms` }]; } return []; }; -describe('NodejsProfiler', () => { - const getNodejsProfiler = ( - overrides?: Partial< - NodejsProfilerOptions> - >, - ) => { - const sink = new MockTraceEventFileSink(); - const mockFilePath = - overrides?.filename ?? - '/test/tmp/profiles/20240101-120000-000/trace.20240101-120000-000.12345.1.1.jsonl'; - vi.spyOn(sink, 'open'); - vi.spyOn(sink, 'close'); - vi.spyOn(sink, 'getPath').mockReturnValue(mockFilePath); - - // Mock WriteAheadLogFile constructor to return our mock sink - vi.spyOn(WalModule, 'WriteAheadLogFile').mockImplementation( - () => sink as any, - ); - - const mockPerfObserverSink = { - subscribe: vi.fn(), - unsubscribe: vi.fn(() => { - mockPerfObserverSink.flush(); - }), - isSubscribed: vi.fn().mockReturnValue(false), - encode: vi.fn(), - flush: vi.fn(), - getStats: vi.fn().mockReturnValue({ - isSubscribed: false, - queued: 0, - dropped: 0, - written: 0, - maxQueueSize: 10_000, - flushThreshold: 20, - addedSinceLastFlush: 0, - buffered: true, - }), - }; - vi.spyOn(PerfObserverModule, 'PerformanceObserverSink').mockReturnValue( - mockPerfObserverSink as any, - ); - - const profiler = new NodejsProfiler({ - prefix: 'test', - track: 'test-track', +const resetEnv = () => { + // eslint-disable-next-line functional/immutable-data + delete process.env.DEBUG; + // eslint-disable-next-line functional/immutable-data + delete process.env.CP_PROFILING; + // eslint-disable-next-line functional/immutable-data, @typescript-eslint/no-dynamic-delete + delete process.env[PROFILER_SHARDER_ID_ENV_VAR]; +}; + +const expectRunning = (p: NodejsProfiler) => { + expect(p.state).toBe('running'); + expect(p.stats.shardOpen).toBe(true); + expect(p.stats.isSubscribed).toBe(true); +}; + +const expectIdle = (p: NodejsProfiler) => { + expect(p.state).toBe('idle'); + expect(p.stats.shardOpen).toBe(false); + expect(p.stats.isSubscribed).toBe(false); +}; + +const expectTransitionMarker = (name: string) => { + const marks = performance.getEntriesByType('mark'); + expect(marks.some(m => m.name === name)).toBe(true); +}; + +const expectNoTransitionMarker = (name: string) => { + const marks = performance.getEntriesByType('mark'); + expect(marks.some(m => m.name === name)).toBe(false); +}; + +const createProfiler = ( + options: + | string + | (Partial< + NodejsProfilerOptions< + TraceEvent, + Record + > + > & { measureName: string }), +): NodejsProfiler => { + const opts = typeof options === 'string' ? { measureName: options } : options; + return new NodejsProfiler({ + ...opts, + track: opts.track ?? 'int-test-track', + format: { + ...traceEventWalFormat(), + encodePerfEntry: entryToTraceEvents, + baseName: opts.format?.baseName ?? PROFILER_OUT_BASENAME, + }, + enabled: opts.enabled ?? true, + debug: opts.debug ?? isEnvVarEnabled(PROFILER_DEBUG_ENV_VAR), + measureName: opts.measureName, + }); +}; + +class TestNodejsProfiler extends NodejsProfiler { + forceTransition(next: any) { + this.transition(next); + } +} + +const createTestProfiler = ( + options: + | string + | (Partial< + NodejsProfilerOptions< + TraceEvent, + Record + > + > & { measureName: string }), +): TestNodejsProfiler => { + const opts = typeof options === 'string' ? { measureName: options } : options; + return new TestNodejsProfiler({ + ...opts, + track: opts.track ?? 'int-test-track', + format: { + ...traceEventWalFormat(), + encodePerfEntry: entryToTraceEvents, + baseName: opts.format?.baseName ?? PROFILER_OUT_BASENAME, + }, + enabled: opts.enabled ?? 
true, + debug: opts.debug ?? isEnvVarEnabled(PROFILER_DEBUG_ENV_VAR), + measureName: opts.measureName, + }); +}; + +const createSimpleProfiler = ( + overrides?: Partial< + NodejsProfilerOptions< + { message: string }, + Record + > + >, +): NodejsProfiler<{ message: string }> => { + const sink = new MockTraceEventFileSink(); + vi.spyOn(sink, 'open'); + vi.spyOn(sink, 'close'); + vi.spyOn(WalModule, 'WriteAheadLogFile').mockImplementation( + () => sink as any, + ); + return new NodejsProfiler({ + prefix: 'cp', + track: 'test-track', + measureName: overrides?.measureName ?? 'simple', + enabled: overrides?.enabled ?? true, + debug: overrides?.debug ?? isEnvVarEnabled(PROFILER_DEBUG_ENV_VAR), + format: { encodePerfEntry: simpleEncoder, - ...overrides, - }); + baseName: overrides?.format?.baseName ?? PROFILER_OUT_BASENAME, + walExtension: '.jsonl', + finalExtension: '.json', + ...overrides?.format, + }, + ...overrides, + }); +}; + +const captureExitHandlers = () => { + const mockSubscribeProcessExit = vi.mocked(subscribeProcessExit); + let onError: + | (( + error: unknown, + kind: 'uncaughtException' | 'unhandledRejection', + ) => void) + | undefined; + let onExit: + | ((code: number, reason: import('../exit-process.js').CloseReason) => void) + | undefined; + + mockSubscribeProcessExit.mockImplementation(options => { + onError = options?.onError; + onExit = options?.onExit; + return vi.fn(); + }); - return { sink, perfObserverSink: mockPerfObserverSink, profiler }; + return { + get onError() { + return onError; + }, + get onExit() { + return onExit; + }, }; +}; - const originalEnv = process.env.CP_PROFILER_DEBUG; +describe('NodejsProfiler', () => { + const originalEnv = process.env.DEBUG; beforeEach(() => { performance.clearMarks(); performance.clearMeasures(); - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILING; + resetEnv(); }); afterEach(() => { if (originalEnv === undefined) { // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILER_DEBUG; + delete process.env.DEBUG; } else { // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = originalEnv; + process.env.DEBUG = originalEnv; } }); @@ -101,7 +210,7 @@ describe('NodejsProfiler', () => { }); it('should have required static structure', () => { - const profiler = getNodejsProfiler().profiler; + const profiler = createProfiler('static-structure'); expect(typeof profiler.measure).toBe('function'); expect(typeof profiler.measureAsync).toBe('function'); expect(typeof profiler.marker).toBe('function'); @@ -117,17 +226,60 @@ describe('NodejsProfiler', () => { }); it('should initialize with sink opened when enabled is true', () => { - const { sink, perfObserverSink } = getNodejsProfiler({ enabled: true }); - expect(sink.isClosed()).toBe(false); - expect(sink.open).toHaveBeenCalledTimes(1); - expect(perfObserverSink.subscribe).toHaveBeenCalledTimes(1); + const profiler = createProfiler({ + measureName: 'init-enabled', + }); + expect(profiler.state).toBe('running'); + expect(profiler.stats.shardOpen).toBe(true); + expect(profiler.stats.isSubscribed).toBe(true); }); + // eslint-disable-next-line vitest/expect-expect it('should initialize with sink closed when enabled is false', () => { - const { sink, perfObserverSink } = getNodejsProfiler({ enabled: false }); - expect(sink.isClosed()).toBe(true); - expect(sink.open).not.toHaveBeenCalled(); - 
expect(perfObserverSink.subscribe).not.toHaveBeenCalled();
+    const profiler = createProfiler({
+      measureName: 'init-disabled',
+      enabled: false,
+    });
+    expectIdle(profiler);
+  });
+
+  it('should initialize as coordinator if env var is undefined', () => {
+    const profiler = createProfiler('is-coordinator');
+    expect(profiler.stats.isCoordinator).toBe(true);
+  });
+
+  it('should finalize shard folder as coordinator', async () => {
+    const profiler = createProfiler('is-coordinator');
+    expect(profiler.stats.isCoordinator).toBe(true);
+    profiler.marker('special-marker');
+    profiler.measure('special-measure', () => true);
+    await awaitObserverCallbackAndFlush(profiler);
+    profiler.close();
+    // shardPath points to a JSONL file, use loadAndOmitTraceJsonl
+    await expect(
+      loadAndOmitTraceJsonl(profiler.stats.shardPath as `${string}.jsonl`),
+    ).resolves.not.toThrowError();
+
+    await expect(
+      loadAndOmitTraceJson(profiler.stats.finalFilePath),
+    ).resolves.not.toThrowError();
+  });
+
+  it('should NOT initialize as coordinator if env var is defined', async () => {
+    vi.stubEnv(PROFILER_SHARDER_ID_ENV_VAR, getProfilerId());
+    const profiler = createProfiler('is-coordinator');
+    expect(profiler.stats.isCoordinator).toBe(false);
+    profiler.marker('special-marker');
+    profiler.measure('special-measure', () => true);
+    await awaitObserverCallbackAndFlush(profiler);
+    profiler.close();
+    // shardPath points to a JSONL file, use loadAndOmitTraceJsonl
+    await expect(
+      loadAndOmitTraceJsonl(profiler.stats.shardPath as `${string}.jsonl`),
+    ).resolves.not.toThrowError();
+    await expect(
+      loadAndOmitTraceJson(profiler.stats.finalFilePath),
+    ).rejects.toThrowError('no such file or directory');
   });
 });
 
@@ -136,142 +288,78 @@ describe('NodejsProfiler', () => {
     {
       name: 'idle → running',
       initial: false,
-      action: (
-        p: NodejsProfiler<string, Record<string, ActionTrackEntryPayload>>,
-      ) => p.setEnabled(true),
-      expected: {
-        state: 'running',
-        sinkOpen: 1,
-        sinkClose: 0,
-        subscribe: 1,
-        unsubscribe: 0,
-      },
+      action: (p: NodejsProfiler<TraceEvent>) => p.setEnabled(true),
+      assert: expectRunning,
     },
     {
       name: 'running → idle',
       initial: true,
-      action: (
-        p: NodejsProfiler<string, Record<string, ActionTrackEntryPayload>>,
-      ) => p.setEnabled(false),
-      expected: {
-        state: 'idle',
-        sinkOpen: 1,
-        sinkClose: 1,
-        subscribe: 1,
-        unsubscribe: 1,
-      },
+      action: (p: NodejsProfiler<TraceEvent>) => p.setEnabled(false),
+      assert: expectIdle,
     },
     {
       name: 'idle → closed',
       initial: false,
-      action: (
-        p: NodejsProfiler<string, Record<string, ActionTrackEntryPayload>>,
-      ) => p.close(),
-      expected: {
-        state: 'closed',
-        sinkOpen: 0,
-        sinkClose: 1,
-        subscribe: 0,
-        unsubscribe: 0,
-      },
+      action: (p: NodejsProfiler<TraceEvent>) => p.close(),
+      assert: (p: NodejsProfiler<TraceEvent>) => expect(p.state).toBe('closed'),
     },
     {
       name: 'running → closed',
       initial: true,
-      action: (
-        p: NodejsProfiler<string, Record<string, ActionTrackEntryPayload>>,
-      ) => p.close(),
-      expected: {
-        state: 'closed',
-        sinkOpen: 1,
-        sinkClose: 1,
-        subscribe: 1,
-        unsubscribe: 1,
-      },
+      action: (p: NodejsProfiler<TraceEvent>) => p.close(),
+      assert: (p: NodejsProfiler<TraceEvent>) => expect(p.state).toBe('closed'),
     },
-  ])('should handle $name transition', ({ initial, action, expected }) => {
-    const { sink, perfObserverSink, profiler } = getNodejsProfiler({
+  ])('should handle $name transition', ({ initial, action, assert }) => {
+    const profiler = createProfiler({
+      measureName: `state-transition-${initial ? 
'running' : 'idle'}`, enabled: initial, }); action(profiler); - expect(profiler.state).toBe(expected.state); - expect(sink.open).toHaveBeenCalledTimes(expected.sinkOpen); - expect(sink.close).toHaveBeenCalledTimes(expected.sinkClose); - expect(perfObserverSink.subscribe).toHaveBeenCalledTimes( - expected.subscribe, - ); - expect(perfObserverSink.unsubscribe).toHaveBeenCalledTimes( - expected.unsubscribe, - ); + assert(profiler); }); it('should expose state via getter', () => { - const profiler = getNodejsProfiler({ enabled: false }).profiler; + const profiler = createProfiler({ + measureName: 'state-getter', + enabled: false, + }); - expect(profiler.state).toBe('idle'); + expectIdle(profiler); profiler.setEnabled(true); - expect(profiler.state).toBe('running'); + expectRunning(profiler); profiler.setEnabled(false); - expect(profiler.state).toBe('idle'); + expectIdle(profiler); profiler.close(); expect(profiler.state).toBe('closed'); }); + // eslint-disable-next-line vitest/expect-expect it('should maintain state invariant: running ⇒ sink open + observer subscribed', () => { - const { sink, perfObserverSink, profiler } = getNodejsProfiler({ + const profiler = createProfiler({ + measureName: 'state-invariant', enabled: false, }); - expect(profiler.state).toBe('idle'); - expect(sink.isClosed()).toBe(true); - expect(perfObserverSink.isSubscribed()).toBe(false); + expectIdle(profiler); profiler.setEnabled(true); - expect(profiler.state).toBe('running'); - expect(sink.isClosed()).toBe(false); - expect(sink.open).toHaveBeenCalledTimes(1); - expect(perfObserverSink.subscribe).toHaveBeenCalledTimes(1); + expectRunning(profiler); profiler.setEnabled(false); - expect(profiler.state).toBe('idle'); - expect(sink.isClosed()).toBe(true); - expect(sink.close).toHaveBeenCalledTimes(1); - expect(perfObserverSink.unsubscribe).toHaveBeenCalledTimes(1); + expectIdle(profiler); profiler.setEnabled(true); - expect(profiler.state).toBe('running'); - expect(sink.isClosed()).toBe(false); - expect(sink.open).toHaveBeenCalledTimes(2); - expect(perfObserverSink.subscribe).toHaveBeenCalledTimes(2); - }); - - it('#transition method should execute all operations in running->closed case', () => { - const { sink, perfObserverSink, profiler } = getNodejsProfiler({ - enabled: true, - }); - - const parentSetEnabledSpy = vi.spyOn(Profiler.prototype, 'setEnabled'); - - expect(profiler.state).toBe('running'); - - profiler.close(); - - expect(parentSetEnabledSpy).toHaveBeenCalledWith(false); - expect(perfObserverSink.unsubscribe).toHaveBeenCalledTimes(1); - expect(sink.close).toHaveBeenCalledTimes(1); - expect(profiler.state).toBe('closed'); - - parentSetEnabledSpy.mockRestore(); + expectRunning(profiler); }); it('is idempotent for repeated operations', () => { - const { sink, perfObserverSink, profiler } = getNodejsProfiler({ - enabled: true, + const profiler = createProfiler({ + measureName: 'idempotent-operations', }); profiler.setEnabled(true); @@ -281,140 +369,179 @@ describe('NodejsProfiler', () => { profiler.close(); profiler.close(); - expect(sink.open).toHaveBeenCalledTimes(1); - expect(sink.close).toHaveBeenCalledTimes(1); - expect(perfObserverSink.subscribe).toHaveBeenCalledTimes(1); - expect(perfObserverSink.unsubscribe).toHaveBeenCalledTimes(1); + expect(profiler.state).toBe('closed'); }); it('rejects all lifecycle changes after close', () => { - const { perfObserverSink, profiler } = getNodejsProfiler({ + const profiler = createProfiler({ + measureName: 'lifecycle-after-close', enabled: false, }); profiler.close(); 
- expect(() => profiler.setEnabled(true)).toThrow( + expect(() => profiler.setEnabled(true)).toThrowError( 'Profiler already closed', ); - expect(() => profiler.setEnabled(false)).toThrow( + expect(() => profiler.setEnabled(false)).toThrowError( 'Profiler already closed', ); profiler.flush(); - expect(perfObserverSink.flush).not.toHaveBeenCalled(); + expect(profiler.state).toBe('closed'); }); - it('throws error for invalid state transition (defensive code)', () => { - const profiler = getNodejsProfiler({ enabled: true }).profiler; - - expect(profiler.state).toBe('running'); + it('throws for invalid transitions', () => { + const profiler = createTestProfiler({ + measureName: 'invalid-transition', + enabled: false, + }); - // Test invalid transition through public API - trying to transition to an invalid state - // Since we can't access private methods, we test that the profiler maintains valid state - // Invalid transitions are prevented by the type system and runtime checks - expect(() => { - // This should not throw since we're using the public API correctly - profiler.setEnabled(false); - profiler.setEnabled(true); - }).not.toThrow(); + expect(() => profiler.forceTransition('invalid')).toThrowError( + 'Invalid transition: idle -> invalid', + ); }); }); describe('profiling operations', () => { - it('should expose filePath getter', () => { - const { profiler } = getNodejsProfiler({ enabled: true }); - expect(profiler.filePath).toMatchPath( - '/test/tmp/profiles/20240101-120000-000/trace.20240101-120000-000.12345.1.1.jsonl', + it('should expose shardPath in stats', () => { + const profiler = createProfiler({ + measureName: 'filepath-getter', + }); + // When measureName is provided, it's used as the groupId directory + expect(profiler.stats.shardPath).toContainPath( + 'tmp/profiles/filepath-getter', ); + expect(profiler.stats.shardPath).toMatch(/\.jsonl$/); }); - it('should use provided filename when specified', () => { - const customPath = path.join(process.cwd(), 'custom-trace.json'); - const { profiler } = getNodejsProfiler({ - filename: customPath, + it('should use measureName for final file path', () => { + const profiler = createProfiler({ + measureName: 'custom-filename', }); - expect(profiler.filePath).toBe(customPath); + const shardPath = profiler.stats.shardPath; + // shardPath uses the shard ID format: baseName.shardId.jsonl + expect(shardPath).toContainPath('tmp/profiles/custom-filename'); + expect(shardPath).toMatch(/trace\.\d{8}-\d{6}-\d{3}(?:\.\d+){3}\.jsonl$/); + // finalFilePath uses measureName as the identifier + expect(profiler.stats.finalFilePath).toMatchPath( + `${PROFILER_PERSIST_OUT_DIR}/custom-filename/trace.custom-filename.json`, + ); }); it('should use sharded path when filename is not provided', () => { - const { profiler } = getNodejsProfiler(); - const filePath = profiler.filePath; - expect(filePath).toMatchPath( - '/test/tmp/profiles/20240101-120000-000/trace.20240101-120000-000.12345.1.1.jsonl', - ); + const profiler = createProfiler('sharded-path'); + const filePath = profiler.stats.shardPath; + // When measureName is provided, it's used as the groupId directory + expect(filePath).toContainPath('tmp/profiles/sharded-path'); + expect(filePath).toMatch(/\.jsonl$/); }); it('should perform measurements when enabled', () => { - const { profiler } = getNodejsProfiler({ enabled: true }); + const profiler = createProfiler({ + measureName: 'measurements-enabled', + }); const result = profiler.measure('test-op', () => 'success'); expect(result).toBe('success'); }); 
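The lifecycle tests in this block pin down three invariants: transitions are idempotent, `closed` is terminal, and unknown targets are rejected. A minimal sketch of the guard they imply (names here are illustrative; the actual check lives in `NodejsProfiler.transition` above):

```ts
type ProfilerState = 'idle' | 'running' | 'closed';

const VALID_TRANSITIONS = new Set([
  'idle->running',
  'running->idle',
  'running->closed',
  'idle->closed',
]);

function guardTransition(current: ProfilerState, next: ProfilerState): void {
  if (current === next) {
    return; // idempotent: repeated setEnabled()/close() calls are no-ops
  }
  if (current === 'closed') {
    throw new Error('Profiler already closed'); // closed is terminal
  }
  if (!VALID_TRANSITIONS.has(`${current}->${next}`)) {
    throw new Error(`Invalid transition: ${current} -> ${next}`);
  }
}
```
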
it('should skip sink operations when disabled', () => { - const { sink, profiler } = getNodejsProfiler({ enabled: false }); + const profiler = createProfiler({ + measureName: 'sink-disabled', + enabled: false, + }); const result = profiler.measure('disabled-op', () => 'success'); expect(result).toBe('success'); - expect(sink.getWrittenItems()).toHaveLength(0); + // When disabled, no entries should be written + expect(profiler.stats.written).toBe(0); }); it('get stats() getter should return current stats', () => { - const { profiler } = getNodejsProfiler({ enabled: false }); + const profiler = createProfiler({ + measureName: 'stats-getter', + enabled: false, + }); - expect(profiler.stats).toStrictEqual({ - state: 'idle', - walOpen: false, + const stats = profiler.stats; + // shardPath uses dynamic shard ID format, so we check it matches the pattern + // Remove ^ and $ anchors from INSTANCE_ID pattern since we're embedding it + const instanceIdPattern = ID_PATTERNS.INSTANCE_ID.source.replace( + /^\^|\$$/g, + '', + ); + // Normalize path before regex matching to handle OS-specific separators + expect(osAgnosticPath(stats.shardPath)).toMatch( + new RegExp( + `^tmp/profiles/stats-getter/trace\\.${instanceIdPattern}\\.jsonl$`, + ), + ); + expect(stats).toStrictEqual({ + profilerState: 'idle', + debug: false, + sharderState: 'active', + shardCount: 0, + groupId: 'stats-getter', // When measureName is provided, it's used as groupId + isCoordinator: true, // When no coordinator env var is set, this profiler becomes coordinator + isFinalized: false, + isCleaned: false, + finalFilePath: expect.pathToMatch( + `${PROFILER_PERSIST_OUT_DIR}/stats-getter/trace.stats-getter.json`, + ), + shardFileCount: 0, + shardFiles: [], + shardOpen: false, + shardPath: stats.shardPath, // Use the actual value since it's dynamic isSubscribed: false, queued: 0, dropped: 0, written: 0, + lastRecover: [], maxQueueSize: 10_000, flushThreshold: 20, addedSinceLastFlush: 0, buffered: true, - debug: false, }); }); it('flush() should flush when profiler is running', () => { - const { perfObserverSink, profiler } = getNodejsProfiler({ - enabled: true, + const profiler = createProfiler({ + measureName: 'flush-running', }); - - expect(profiler.state).toBe('running'); - - profiler.flush(); - - expect(perfObserverSink.flush).toHaveBeenCalledTimes(1); + expect(() => profiler.flush()).not.toThrowError(); }); it('should propagate errors from measure work function', () => { - const { profiler } = getNodejsProfiler({ enabled: true }); + const profiler = createProfiler({ + measureName: 'measure-error', + }); const error = new Error('Test error'); expect(() => { profiler.measure('error-test', () => { throw error; }); - }).toThrow(error); + }).toThrowError(error); }); it('should propagate errors from measureAsync work function', async () => { - const { profiler } = getNodejsProfiler({ enabled: true }); + const profiler = createProfiler({ + measureName: 'measure-async-error', + }); const error = new Error('Async test error'); - await expect(async () => { - await profiler.measureAsync('async-error-test', async () => { + await expect(profiler.measureAsync('async-error-test', async () => { throw error; - }); - }).rejects.toThrow(error); + })).rejects.toThrowError(error); }); it('should skip measurement when profiler is not active', () => { - const { profiler } = getNodejsProfiler({ enabled: false }); + const profiler = createProfiler({ + measureName: 'skip-measurement-inactive', + enabled: false, + }); let workCalled = false; const result = 
profiler.measure('inactive-test', () => { @@ -427,7 +554,10 @@ describe('NodejsProfiler', () => { }); it('should skip async measurement when profiler is not active', async () => { - const { profiler } = getNodejsProfiler({ enabled: false }); + const profiler = createProfiler({ + measureName: 'skip-async-inactive', + enabled: false, + }); let workCalled = false; const result = await profiler.measureAsync( @@ -443,11 +573,14 @@ describe('NodejsProfiler', () => { }); it('should skip marker when profiler is not active', () => { - const { profiler } = getNodejsProfiler({ enabled: false }); + const profiler = createProfiler({ + measureName: 'skip-marker-inactive', + enabled: false, + }); expect(() => { profiler.marker('inactive-marker'); - }).not.toThrow(); + }).not.toThrowError(); }); it('base Profiler behavior: should always be active in base profiler', () => { @@ -471,108 +604,83 @@ describe('NodejsProfiler', () => { expect(() => { profiler.marker('base-marker'); - }).not.toThrow(); + }).not.toThrowError(); }); }); describe('debug mode', () => { it('should initialize debug flag to false when env var not set', () => { - const { profiler } = getNodejsProfiler(); + const profiler = createProfiler('debug-flag-false'); const stats = profiler.stats; expect(stats.debug).toBe(false); }); - it('should initialize debug flag from CP_PROFILER_DEBUG env var when set', () => { + it('should initialize debug flag from DEBUG env var when set', () => { // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; + process.env.DEBUG = 'true'; - const { profiler } = getNodejsProfiler(); + const profiler = createProfiler('debug-flag-true'); const stats = profiler.stats; expect(stats.debug).toBe(true); }); it('should expose debug flag via getter', () => { - const { profiler } = getNodejsProfiler(); - expect(profiler.debug).toBe(false); + const profiler = createProfiler('debug-getter-false'); + expect(profiler.isDebugMode()).toBe(false); + expect(profiler.stats.debug).toBe(false); // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler: debugProfiler } = getNodejsProfiler(); - expect(debugProfiler.debug).toBe(true); + process.env.DEBUG = 'true'; + const debugProfiler = createProfiler('debug-getter-true'); + expect(debugProfiler.isDebugMode()).toBe(true); + expect(debugProfiler.stats.debug).toBe(true); }); + // eslint-disable-next-line vitest/expect-expect it('should create transition marker when debug is enabled and transitioning to running', () => { // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler } = getNodejsProfiler({ enabled: false }); + process.env.DEBUG = 'true'; + const profiler = createProfiler({ + measureName: 'debug-transition-marker', + enabled: false, + }); performance.clearMarks(); - profiler.setEnabled(true); - const marks = performance.getEntriesByType('mark'); - const transitionMark = marks.find(mark => mark.name === 'idle->running'); - expect(transitionMark).toBeDefined(); - expect(transitionMark?.name).toBe('idle->running'); - }); - - it('should not create transition marker when transitioning from running to idle (profiler disabled)', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler } = getNodejsProfiler({ enabled: true }); - - performance.clearMarks(); - - profiler.setEnabled(false); - - const marks = performance.getEntriesByType('mark'); - const transitionMark = 
marks.find(mark => mark.name === 'running->idle'); - expect(transitionMark).toBeUndefined(); + expectTransitionMarker('debug:idle->running'); }); + // eslint-disable-next-line vitest/expect-expect it('does not emit transition markers unless debug is enabled', () => { - const { profiler } = getNodejsProfiler(); + const profiler = createProfiler('no-transition-markers'); performance.clearMarks(); - profiler.setEnabled(true); - expect( - performance - .getEntriesByType('mark') - .some(m => m.name.startsWith('idle->running')), - ).toBe(false); + expectNoTransitionMarker('idle->running'); }); it('should include stats in transition marker properties when transitioning to running', () => { // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler, perfObserverSink } = getNodejsProfiler({ + process.env.DEBUG = 'true'; + const profiler = createProfiler({ + measureName: 'debug-transition-stats', enabled: false, }); - perfObserverSink.getStats.mockReturnValue({ - isSubscribed: true, - queued: 5, - dropped: 2, - written: 10, - maxQueueSize: 10_000, - flushThreshold: 20, - addedSinceLastFlush: 3, - buffered: true, - }); - performance.clearMarks(); - profiler.setEnabled(true); const marks = performance.getEntriesByType('mark'); - const transitionMark = marks.find(mark => mark.name === 'idle->running'); + const transitionMark = marks.find( + mark => mark.name === 'debug:idle->running', + ); expect(transitionMark).toBeDefined(); - expect(transitionMark?.name).toBe('idle->running'); + expect(transitionMark?.name).toBe('debug:idle->running'); expect(transitionMark?.detail).toBeDefined(); const detail = transitionMark?.detail as UserTimingDetail; expect(detail.devtools).toBeDefined(); @@ -583,213 +691,31 @@ describe('NodejsProfiler', () => { // eslint-disable-next-line vitest/max-nested-describe describe('setDebugMode', () => { it('should enable debug mode when called with true', () => { - const { profiler } = getNodejsProfiler(); - expect(profiler.debug).toBe(false); - - profiler.setDebugMode(true); - - expect(profiler.debug).toBe(true); - expect(profiler.stats.debug).toBe(true); - }); - - it('should disable debug mode when called with false', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler } = getNodejsProfiler(); - expect(profiler.debug).toBe(true); - - profiler.setDebugMode(false); - - expect(profiler.debug).toBe(false); + const profiler = createProfiler('set-debug-true'); + expect(profiler.isDebugMode()).toBe(false); expect(profiler.stats.debug).toBe(false); - }); - - it('should create transition markers after enabling debug mode', () => { - const { profiler } = getNodejsProfiler({ enabled: false }); - expect(profiler.debug).toBe(false); - - performance.clearMarks(); - profiler.setEnabled(true); - expect( - performance - .getEntriesByType('mark') - .some(m => m.name.startsWith('idle->running')), - ).toBe(false); - - profiler.setEnabled(false); - profiler.setDebugMode(true); - performance.clearMarks(); - - profiler.setEnabled(true); - - const marks = performance.getEntriesByType('mark'); - const transitionMark = marks.find( - mark => mark.name === 'idle->running', - ); - expect(transitionMark).toBeDefined(); - expect(transitionMark?.name).toBe('idle->running'); - }); - - it('should stop creating transition markers after disabling debug mode', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler } = 
getNodejsProfiler({ enabled: false }); - expect(profiler.debug).toBe(true); - - profiler.setDebugMode(false); - performance.clearMarks(); - profiler.setEnabled(true); - - expect( - performance - .getEntriesByType('mark') - .some(m => m.name.startsWith('idle->running')), - ).toBe(false); - }); - - it('should be idempotent when called multiple times with true', () => { - const { profiler } = getNodejsProfiler(); - expect(profiler.debug).toBe(false); - - profiler.setDebugMode(true); - profiler.setDebugMode(true); profiler.setDebugMode(true); - expect(profiler.debug).toBe(true); + expect(profiler.isDebugMode()).toBe(true); expect(profiler.stats.debug).toBe(true); }); - - it('should be idempotent when called multiple times with false', () => { - // eslint-disable-next-line functional/immutable-data - process.env.CP_PROFILER_DEBUG = 'true'; - const { profiler } = getNodejsProfiler(); - expect(profiler.debug).toBe(true); - - profiler.setDebugMode(false); - profiler.setDebugMode(false); - profiler.setDebugMode(false); - - expect(profiler.debug).toBe(false); - expect(profiler.stats.debug).toBe(false); - }); - - it('should work when profiler is in idle state', () => { - const { profiler } = getNodejsProfiler({ enabled: false }); - expect(profiler.state).toBe('idle'); - expect(profiler.debug).toBe(false); - - profiler.setDebugMode(true); - expect(profiler.debug).toBe(true); - expect(profiler.stats.debug).toBe(true); - }); - - it('should work when profiler is in running state', () => { - const { profiler } = getNodejsProfiler({ enabled: true }); - expect(profiler.state).toBe('running'); - expect(profiler.debug).toBe(false); - - profiler.setDebugMode(true); - expect(profiler.debug).toBe(true); - expect(profiler.stats.debug).toBe(true); - - performance.clearMarks(); - profiler.setEnabled(false); - profiler.setEnabled(true); - - const marks = performance.getEntriesByType('mark'); - const transitionMark = marks.find( - mark => mark.name === 'idle->running', - ); - expect(transitionMark).toBeDefined(); - }); - - it('should work when profiler is in closed state', () => { - const { profiler } = getNodejsProfiler({ enabled: false }); - profiler.close(); - expect(profiler.state).toBe('closed'); - expect(profiler.debug).toBe(false); - - profiler.setDebugMode(true); - expect(profiler.debug).toBe(true); - expect(profiler.stats.debug).toBe(true); - }); - - it('should toggle debug mode multiple times', () => { - const { profiler } = getNodejsProfiler({ enabled: false }); - - profiler.setDebugMode(true); - expect(profiler.debug).toBe(true); - - profiler.setDebugMode(false); - expect(profiler.debug).toBe(false); - - profiler.setDebugMode(true); - expect(profiler.debug).toBe(true); - - profiler.setDebugMode(false); - expect(profiler.debug).toBe(false); - }); }); }); describe('exit handlers', () => { const mockSubscribeProcessExit = vi.mocked(subscribeProcessExit); - let capturedOnError: - | (( - error: unknown, - kind: 'uncaughtException' | 'unhandledRejection', - ) => void) - | undefined; - let capturedOnExit: - | (( - code: number, - reason: import('../exit-process.js').CloseReason, - ) => void) - | undefined; - const createProfiler = ( - overrides?: Partial< - NodejsProfilerOptions> - >, - ) => { - const sink = new MockTraceEventFileSink(); - vi.spyOn(sink, 'open'); - vi.spyOn(sink, 'close'); - vi.spyOn(WalModule, 'WriteAheadLogFile').mockImplementation( - () => sink as any, - ); - return new NodejsProfiler({ - prefix: 'cp', - track: 'test-track', - encodePerfEntry: simpleEncoder, - ...overrides, - }); - }; - - let 
profiler: NodejsProfiler< - string, - Record - >; - beforeEach(() => { - capturedOnError = undefined; - capturedOnExit = undefined; - - mockSubscribeProcessExit.mockImplementation(options => { - capturedOnError = options?.onError; - capturedOnExit = options?.onExit; - return vi.fn(); - }); - performance.clearMarks(); performance.clearMeasures(); - // eslint-disable-next-line functional/immutable-data - delete process.env.CP_PROFILING; + resetEnv(); }); it('installs exit handlers on construction', () => { - expect(() => createProfiler()).not.toThrow(); + expect(() => + createSimpleProfiler({ measureName: 'exit-handlers-install' }), + ).not.toThrowError(); expect(mockSubscribeProcessExit).toHaveBeenCalledWith({ onError: expect.any(Function), @@ -798,7 +724,9 @@ describe('NodejsProfiler', () => { }); it('setEnabled toggles profiler state', () => { - profiler = createProfiler({ enabled: true }); + const profiler = createSimpleProfiler({ + measureName: 'exit-set-enabled', + }); expect(profiler.isEnabled()).toBe(true); profiler.setEnabled(false); @@ -809,10 +737,15 @@ describe('NodejsProfiler', () => { }); it('marks fatal errors and shuts down profiler on uncaughtException', () => { - profiler = createProfiler({ enabled: true }); + const handlers = captureExitHandlers(); + expect(() => + createSimpleProfiler({ + measureName: 'exit-uncaught-exception', + }), + ).not.toThrowError(); const testError = new Error('Test fatal error'); - capturedOnError?.call(profiler, testError, 'uncaughtException'); + handlers.onError?.(testError, 'uncaughtException'); expect(performance.getEntriesByType('mark')).toStrictEqual([ { @@ -836,14 +769,13 @@ describe('NodejsProfiler', () => { }); it('marks fatal errors and shuts down profiler on unhandledRejection', () => { - profiler = createProfiler({ enabled: true }); + const handlers = captureExitHandlers(); + const profiler = createSimpleProfiler({ + measureName: 'exit-unhandled-rejection', + }); expect(profiler.isEnabled()).toBe(true); - capturedOnError?.call( - profiler, - new Error('Test fatal error'), - 'unhandledRejection', - ); + handlers.onError?.(new Error('Test fatal error'), 'unhandledRejection'); expect(performance.getEntriesByType('mark')).toStrictEqual([ { @@ -867,11 +799,14 @@ describe('NodejsProfiler', () => { }); it('exit handler shuts down profiler', () => { - profiler = createProfiler({ enabled: true }); + const handlers = captureExitHandlers(); + const profiler = createSimpleProfiler({ + measureName: 'exit-handler-shutdown', + }); const closeSpy = vi.spyOn(profiler, 'close'); expect(profiler.isEnabled()).toBe(true); - capturedOnExit?.(0, { kind: 'exit' }); + handlers.onExit?.(0, { kind: 'exit' }); expect(profiler.isEnabled()).toBe(false); expect(closeSpy).toHaveBeenCalledTimes(1); @@ -881,7 +816,10 @@ describe('NodejsProfiler', () => { const unsubscribeFn = vi.fn(); mockSubscribeProcessExit.mockReturnValue(unsubscribeFn); - profiler = createProfiler({ enabled: false }); + const profiler = createSimpleProfiler({ + measureName: 'exit-close-unsubscribe', + enabled: false, + }); expect(profiler.isEnabled()).toBe(false); expect(mockSubscribeProcessExit).toHaveBeenCalled(); diff --git a/packages/utils/src/lib/profiler/profiler.int.test.ts b/packages/utils/src/lib/profiler/profiler.int.test.ts index 1ee4763d6..a21f07554 100644 --- a/packages/utils/src/lib/profiler/profiler.int.test.ts +++ b/packages/utils/src/lib/profiler/profiler.int.test.ts @@ -1,72 +1,44 @@ -import type { ActionTrackEntryPayload } from '../user-timing-extensibility-api.type.js'; -import 
{ Profiler } from './profiler.js'; +import type { ActionTrackConfigs } from '../user-timing-extensibility-api-utils.js'; +import { Profiler, type ProfilerOptions } from './profiler.js'; describe('Profiler Integration', () => { - let profiler: Profiler>; - - beforeEach(() => { - performance.clearMarks(); - performance.clearMeasures(); - - profiler = new Profiler({ + function profiler(opt?: ProfilerOptions): Profiler { + return new Profiler({ + ...opt, prefix: 'cp', track: 'CLI', trackGroup: 'Code Pushup', - color: 'primary-dark', + enabled: true, tracks: { utils: { track: 'Utils', color: 'primary' }, - core: { track: 'Core', color: 'primary-light' }, }, - enabled: true, }); + } + + beforeEach(() => { + performance.clearMarks(); + performance.clearMeasures(); + // Don't stub env var to undefined - let profiler respect enabled: true option + // The profiler constructor uses: enabled ?? isEnvVarEnabled(PROFILER_ENABLED_ENV_VAR) + // So if enabled is explicitly true, it will use that value }); it('should create complete performance timeline for sync operation', () => { + const p = profiler(); expect( - profiler.measure('sync-test', () => + p.measure('sync-test', () => Array.from({ length: 1000 }, (_, i) => i).reduce( (sum, num) => sum + num, 0, ), ), ).toBe(499_500); - - const marks = performance.getEntriesByType('mark'); - const measures = performance.getEntriesByType('measure'); - - expect(marks).toStrictEqual( - expect.arrayContaining([ - expect.objectContaining({ - name: 'cp:sync-test:start', - detail: expect.objectContaining({ - devtools: expect.objectContaining({ dataType: 'track-entry' }), - }), - }), - expect.objectContaining({ - name: 'cp:sync-test:end', - detail: expect.objectContaining({ - devtools: expect.objectContaining({ dataType: 'track-entry' }), - }), - }), - ]), - ); - - expect(measures).toStrictEqual( - expect.arrayContaining([ - expect.objectContaining({ - name: 'cp:sync-test', - duration: expect.any(Number), - detail: expect.objectContaining({ - devtools: expect.objectContaining({ dataType: 'track-entry' }), - }), - }), - ]), - ); }); it('should create complete performance timeline for async operation', async () => { + const p = profiler(); await expect( - profiler.measureAsync('async-test', async () => { + p.measureAsync('async-test', async () => { await new Promise(resolve => setTimeout(resolve, 10)); return 'async-result'; }), @@ -106,8 +78,9 @@ describe('Profiler Integration', () => { }); it('should handle nested measurements correctly', () => { - profiler.measure('outer', () => { - profiler.measure('inner', () => 'inner-result'); + const p = profiler(); + p.measure('outer', () => { + p.measure('inner', () => 'inner-result'); return 'outer-result'; }); @@ -134,7 +107,8 @@ describe('Profiler Integration', () => { }); it('should create markers with proper metadata', () => { - profiler.marker('test-marker', { + const p = profiler(); + p.marker('test-marker', { color: 'warning', tooltipText: 'Test marker tooltip', properties: [ @@ -165,131 +139,48 @@ describe('Profiler Integration', () => { }); it('should create proper DevTools payloads for tracks', () => { - profiler.measure('track-test', (): string => 'result', { + const p = profiler(); + p.measure('track-test', (): string => 'result', { success: result => ({ - properties: [['result', result]], - tooltipText: 'Track test completed', + track: 'Track 1', + trackGroup: 'Group 1', + color: 'secondary-dark', + properties: [['secondary', result]], + tooltipText: 'Track test secondary', }), }); const measures = 
performance.getEntriesByType('measure'); - expect(measures).toStrictEqual( + expect(measures).toEqual( expect.arrayContaining([ expect.objectContaining({ name: 'cp:track-test', - detail: { - devtools: expect.objectContaining({ - dataType: 'track-entry', - track: 'CLI', - trackGroup: 'Code Pushup', - color: 'primary-dark', - properties: [['result', 'result']], - tooltipText: 'Track test completed', - }), - }, - }), - ]), - ); - }); - - it('should merge track defaults with measurement options', () => { - profiler.measure('sync-op', () => 'sync-result', { - success: result => ({ - properties: [ - ['operation', 'sync'], - ['result', result], - ], - }), - }); - - const measures = performance.getEntriesByType('measure'); - expect(measures).toStrictEqual( - expect.arrayContaining([ - expect.objectContaining({ - name: 'cp:sync-op', - detail: { + detail: expect.objectContaining({ devtools: expect.objectContaining({ dataType: 'track-entry', - track: 'CLI', - trackGroup: 'Code Pushup', - color: 'primary-dark', - properties: [ - ['operation', 'sync'], - ['result', 'sync-result'], - ], - }), - }, - }), - ]), - ); - }); - - it('should mark errors with red color in DevTools', () => { - const error = new Error('Test error'); - - expect(() => { - profiler.measure('error-test', () => { - throw error; - }); - }).toThrow(error); - - const measures = performance.getEntriesByType('measure'); - expect(measures).toStrictEqual( - expect.arrayContaining([ - expect.objectContaining({ - detail: { - devtools: expect.objectContaining({ - color: 'error', - properties: expect.arrayContaining([ - ['Error Type', 'Error'], - ['Error Message', 'Test error'], - ]), - }), - }, - }), - ]), - ); - }); - - it('should include error metadata in DevTools properties', () => { - const customError = new TypeError('Custom type error'); - - expect(() => { - profiler.measure('custom-error-test', () => { - throw customError; - }); - }).toThrow(customError); - - const measures = performance.getEntriesByType('measure'); - expect(measures).toStrictEqual( - expect.arrayContaining([ - expect.objectContaining({ - detail: { - devtools: expect.objectContaining({ - properties: expect.arrayContaining([ - ['Error Type', 'TypeError'], - ['Error Message', 'Custom type error'], - ]), + track: 'Track 1', + trackGroup: 'Group 1', + color: 'secondary-dark', + properties: [['secondary', 'result']], + tooltipText: 'Track test secondary', }), - }, + }), }), ]), ); }); it('should not create performance entries when disabled', async () => { - profiler.setEnabled(false); + const p = profiler(); + p.setEnabled(false); - const syncResult = profiler.measure('disabled-sync', () => 'sync'); + const syncResult = p.measure('disabled-sync', () => 'sync'); expect(syncResult).toBe('sync'); - const asyncResult = profiler.measureAsync( - 'disabled-async', - async () => 'async', - ); + const asyncResult = p.measureAsync('disabled-async', async () => 'async'); await expect(asyncResult).resolves.toBe('async'); - profiler.marker('disabled-marker'); + p.marker('disabled-marker'); expect(performance.getEntriesByType('mark')).toHaveLength(0); expect(performance.getEntriesByType('measure')).toHaveLength(0); diff --git a/packages/utils/src/lib/profiler/profiler.ts b/packages/utils/src/lib/profiler/profiler.ts index e2b2f3b88..322b813d8 100644 --- a/packages/utils/src/lib/profiler/profiler.ts +++ b/packages/utils/src/lib/profiler/profiler.ts @@ -16,7 +16,10 @@ import type { DevToolsColor, EntryMeta, } from '../user-timing-extensibility-api.type.js'; -import { PROFILER_ENABLED_ENV_VAR 
} from './constants.js'; +import { + PROFILER_DEBUG_ENV_VAR, + PROFILER_ENABLED_ENV_VAR, +} from './constants.js'; /** * Generates a unique profiler ID based on performance time origin, process ID, thread ID, and instance count. @@ -35,8 +38,6 @@ type ProfilerMeasureOptions = MeasureCtxOptions & { /** Custom track configurations that will be merged with default settings */ tracks?: Record>; - /** Whether profiling should be enabled (defaults to CP_PROFILING env var) */ - enabled?: boolean; }; /** @@ -44,6 +45,16 @@ type ProfilerMeasureOptions = */ export type MarkerOptions = EntryMeta & { color?: DevToolsColor }; +export type ProfilerStateOptions = { + /** Whether profiling should be enabled (defaults to CP_PROFILING env var) */ + enabled?: boolean; + /** + * When set to true, profiler creates debug logs in traces. + * + * @default false + */ + debug?: boolean; +}; /** * Options for configuring a Profiler instance. * @@ -59,7 +70,7 @@ export type MarkerOptions = EntryMeta & { color?: DevToolsColor }; * @property tracks - Custom track configurations merged with defaults */ export type ProfilerOptions = - ProfilerMeasureOptions; + ProfilerStateOptions & ProfilerMeasureOptions; /** * Performance profiler that creates structured timing measurements with Chrome DevTools Extensibility API payloads. @@ -71,11 +82,24 @@ export type ProfilerOptions = export class Profiler { static instanceCount = 0; readonly id = getProfilerId(); + /** + * Whether debug mode is enabled for profiler state transitions. + * When enabled, profiler state transitions create performance marks for debugging. + */ + #debug: boolean = false; #enabled: boolean = false; readonly #defaults: ActionTrackEntryPayload; readonly tracks: Record | undefined; readonly #ctxOf: ReturnType; + /** + * Protected method to set debug mode state. + * Allows subclasses to update debug state. + */ + protected setDebugState(debugMode: boolean): void { + this.#debug = debugMode; + } + /** * Creates a new Profiler instance with the specified configuration. * @@ -89,10 +113,11 @@ export class Profiler { * */ constructor(options: ProfilerOptions) { - const { tracks, prefix, enabled, ...defaults } = options; + const { tracks, prefix, enabled, debug, ...defaults } = options; const dataType = 'track-entry'; this.#enabled = enabled ?? isEnvVarEnabled(PROFILER_ENABLED_ENV_VAR); + this.#debug = debug ?? isEnvVarEnabled(PROFILER_DEBUG_ENV_VAR); this.#defaults = { ...defaults, dataType }; this.tracks = tracks ? setupTracks({ ...defaults, dataType }, tracks) @@ -128,6 +153,29 @@ export class Profiler { return this.#enabled; } + /** + * Sets debug mode state for this profiler. + * + * This means any future {@link Profiler} instantiations (including child processes) will use the same debug state. + * + * @param debugMode - Whether debug mode should be enabled + */ + setDebugMode(debugMode: boolean): void { + process.env[PROFILER_DEBUG_ENV_VAR] = `${debugMode}`; + this.#debug = debugMode; + } + + /** + * Is debug mode enabled? + * + * (defaults to 'DEBUG'). + * + * @returns Whether debug mode is currently enabled + */ + isDebugMode(): boolean { + return this.#debug; + } + /** * Creates a performance mark including payload for a Chrome DevTools 'marker' item. 
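 * A hypothetical call (mark name and options are illustrative) would be
 * `profiler.marker('cache-hit', { color: 'primary', tooltipText: 'Cache hit' })`,
 * which records a performance mark carrying a DevTools marker payload.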
* diff --git a/packages/utils/src/lib/profiler/profiler.unit.test.ts b/packages/utils/src/lib/profiler/profiler.unit.test.ts index bf047e3fa..216f561eb 100644 --- a/packages/utils/src/lib/profiler/profiler.unit.test.ts +++ b/packages/utils/src/lib/profiler/profiler.unit.test.ts @@ -1,7 +1,6 @@ import { performance } from 'node:perf_hooks'; import { threadId } from 'node:worker_threads'; import { beforeEach, describe, expect, it, vi } from 'vitest'; -import type { ActionTrackEntryPayload } from '../user-timing-extensibility-api.type.js'; import { Profiler, type ProfilerOptions, getProfilerId } from './profiler.js'; vi.mock('../exit-process.js'); @@ -25,15 +24,11 @@ describe('Profiler', () => { ...overrides, }); - let profiler: Profiler>; - beforeEach(() => { performance.clearMarks(); performance.clearMeasures(); // eslint-disable-next-line functional/immutable-data delete process.env.CP_PROFILING; - - profiler = getProfiler(); }); it('should create profiler instances', () => { @@ -48,7 +43,8 @@ describe('Profiler', () => { }); it('constructor should use defaults for measure', () => { - const customProfiler = getProfiler({ color: 'secondary', enabled: true }); + const customProfiler = getProfiler({ color: 'secondary' }); + customProfiler.setEnabled(true); const result = customProfiler.measure('test-operation', () => 'success'); @@ -123,7 +119,17 @@ describe('Profiler', () => { }); }); + it('setDebugState should update debug flag in subclasses', () => { + const testProfiler = getProfiler({ + prefix: 'cp', + track: 'test-track', + debug: true, + }); + expect(testProfiler.isDebugMode()).toBe(true); + }); + it('isEnabled should set and get enabled state', () => { + const profiler = getProfiler(); expect(profiler.isEnabled()).toBe(false); profiler.setEnabled(true); @@ -147,14 +153,15 @@ describe('Profiler', () => { }); it('marker should execute without error when enabled', () => { - const enabledProfiler = getProfiler({ enabled: true }); + const enabledProfiler = getProfiler(); + enabledProfiler.setEnabled(true); expect(() => { enabledProfiler.marker('test-marker', { color: 'primary', tooltipText: 'Test marker', properties: [['key', 'value']], }); - }).not.toThrow(); + }).not.toThrowError(); const marks = performance.getEntriesByType('mark'); expect(marks).toStrictEqual([ @@ -175,13 +182,14 @@ describe('Profiler', () => { it('marker should execute without error when enabled with default color', () => { performance.clearMarks(); - const profilerWithColor = getProfiler({ color: 'primary', enabled: true }); + const profilerWithColor = getProfiler({ color: 'primary' }); + profilerWithColor.setEnabled(true); expect(() => { profilerWithColor.marker('test-marker-default-color', { tooltipText: 'Test marker with default color', }); - }).not.toThrow(); + }).not.toThrowError(); const marks = performance.getEntriesByType('mark'); expect(marks).toStrictEqual([ @@ -199,7 +207,8 @@ describe('Profiler', () => { }); it('marker should execute without error when enabled with no default color', () => { - const profilerNoColor = getProfiler({ enabled: true }); + const profilerNoColor = getProfiler(); + profilerNoColor.setEnabled(true); expect(() => { profilerNoColor.marker('test-marker-no-color', { @@ -207,7 +216,7 @@ describe('Profiler', () => { tooltipText: 'Test marker without default color', properties: [['key', 'value']], }); - }).not.toThrow(); + }).not.toThrowError(); const marks = performance.getEntriesByType('mark'); expect(marks).toStrictEqual([ @@ -233,7 +242,7 @@ describe('Profiler', () => { color: 
'primary', tooltipText: 'This should not create a mark', }); - }).not.toThrow(); + }).not.toThrowError(); const marks = performance.getEntriesByType('mark'); expect(marks).toHaveLength(0); @@ -243,7 +252,8 @@ describe('Profiler', () => { performance.clearMarks(); performance.clearMeasures(); - const enabledProfiler = getProfiler({ enabled: true }); + const enabledProfiler = getProfiler(); + enabledProfiler.setEnabled(true); const workFn = vi.fn(() => 'result'); const result = enabledProfiler.measure('test-event', workFn, { color: 'primary', @@ -291,6 +301,7 @@ describe('Profiler', () => { }); it('measure should always execute work function', () => { + const profiler = getProfiler(); const workFn = vi.fn(() => 'result'); const result = profiler.measure('test-event', workFn); @@ -299,33 +310,36 @@ describe('Profiler', () => { }); it('measure should propagate errors when enabled', () => { + const profiler = getProfiler(); const error = new Error('Test error'); const workFn = vi.fn(() => { throw error; }); - expect(() => profiler.measure('test-event', workFn)).toThrow(error); + expect(() => profiler.measure('test-event', workFn)).toThrowError(error); expect(workFn).toHaveBeenCalled(); }); it('measure should propagate errors', () => { + const profiler = getProfiler(); const error = new Error('Test error'); const workFn = vi.fn(() => { throw error; }); - expect(() => profiler.measure('test-event', workFn)).toThrow(error); + expect(() => profiler.measure('test-event', workFn)).toThrowError(error); expect(workFn).toHaveBeenCalled(); }); it('measure should propagate errors when enabled and call error callback', () => { - const enabledProfiler = getProfiler({ enabled: true }); + const enabledProfiler = getProfiler(); + enabledProfiler.setEnabled(true); const error = new Error('Enabled test error'); const workFn = vi.fn(() => { throw error; }); - expect(() => enabledProfiler.measure('test-event-error', workFn)).toThrow( + expect(() => enabledProfiler.measure('test-event-error', workFn)).toThrowError( error, ); expect(workFn).toHaveBeenCalled(); @@ -357,7 +371,8 @@ describe('Profiler', () => { }); it('measureAsync should handle async operations correctly when enabled', async () => { - const enabledProfiler = getProfiler({ enabled: true }); + const enabledProfiler = getProfiler(); + enabledProfiler.setEnabled(true); const workFn = vi.fn(async () => { await Promise.resolve(); return 'async-result'; @@ -416,6 +431,7 @@ describe('Profiler', () => { }); it('measureAsync should propagate async errors when enabled', async () => { + const profiler = getProfiler(); const error = new Error('Async test error'); const workFn = vi.fn(async () => { await Promise.resolve(); @@ -424,12 +440,13 @@ describe('Profiler', () => { await expect( profiler.measureAsync('test-async-event', workFn), - ).rejects.toThrow(error); + ).rejects.toThrowError(error); expect(workFn).toHaveBeenCalled(); }); it('measureAsync should propagate async errors when enabled and call error callback', async () => { - const enabledProfiler = getProfiler({ enabled: true }); + const enabledProfiler = getProfiler(); + enabledProfiler.setEnabled(true); const error = new Error('Enabled async test error'); const workFn = vi.fn(async () => { await Promise.resolve(); @@ -438,7 +455,7 @@ describe('Profiler', () => { await expect( enabledProfiler.measureAsync('test-async-event-error', workFn), - ).rejects.toThrow(error); + ).rejects.toThrowError(error); expect(workFn).toHaveBeenCalled(); // Verify that performance marks were created even though error 
occurred
diff --git a/packages/utils/src/lib/profiler/trace-file-utils.ts b/packages/utils/src/lib/profiler/trace-file-utils.ts
index 1061062d3..bd5533d67 100644
--- a/packages/utils/src/lib/profiler/trace-file-utils.ts
+++ b/packages/utils/src/lib/profiler/trace-file-utils.ts
@@ -5,24 +5,18 @@ import type {
 } from 'node:perf_hooks';
 import { threadId } from 'node:worker_threads';
 import { defaultClock } from '../clock-epoch.js';
-import type { UserTimingDetail } from '../user-timing-extensibility-api.type.js';
 import type {
-  BeginEvent,
-  CompleteEvent,
-  EndEvent,
-  InstantEvent,
-  InstantEventArgs,
-  InstantEventTracingStartedInBrowser,
-  SpanEvent,
-  SpanEventArgs,
   TraceEvent,
   TraceEventContainer,
-  TraceEventRaw,
   TraceMetadata,
+  TracingStartedInBrowserOptions,
 } from './trace-file.type.js';
 
-/** Global counter for generating unique span IDs within a trace */
+/**
+ * Trace-local monotonic span id counter.
+ * Chrome only requires uniqueness within a single trace file.
+ * Resetting per trace is intentional - we're not aiming for global uniqueness.
+ */
 // eslint-disable-next-line functional/no-let
 let id2Count = 0;
 
@@ -33,69 +27,87 @@ let id2Count = 0;
 export const nextId2 = () => ({ local: `0x${++id2Count}` });
 
 /**
- * Provides default values for trace event properties.
- * @param opt - Optional overrides for process ID, thread ID, and timestamp
- * @param opt.pid - Process ID override, defaults to current process PID
- * @param opt.tid - Thread ID override, defaults to current thread ID
- * @param opt.ts - Timestamp override in microseconds, defaults to current epoch time
- * @returns Object containing pid, tid, and ts with defaults applied
- */
-const defaults = (opt?: { pid?: number; tid?: number; ts?: number }) => ({
-  pid: opt?.pid ?? process.pid,
-  tid: opt?.tid ?? threadId,
-  ts: opt?.ts ?? defaultClock.epochNowUs(),
-});
-
-/**
- * Generates a unique frame tree node ID from process and thread IDs.
+ * Generates a frame tree node ID from process and thread IDs.
  * @param pid - Process ID
  * @param tid - Thread ID
- * @returns Combined numeric ID
+ * @returns Frame tree node ID as a number
  */
 export const frameTreeNodeId = (pid: number, tid: number) =>
   Number.parseInt(`${pid}0${tid}`, 10);
 
 /**
- * Generates a frame name string from process and thread IDs.
+ * Generates a frame name from process and thread IDs.
  * @param pid - Process ID
  * @param tid - Thread ID
- * @returns Formatted frame name
+ * @returns Frame name string in format FRAME0P{pid}T{tid}
  */
 export const frameName = (pid: number, tid: number) => `FRAME0P${pid}T${tid}`;
 
 /**
- * Creates an instant trace event for marking a point in time.
- * @param opt - Event configuration options
- * @returns InstantEvent object
+ * Core factory for creating trace events with defaults.
+ * @param opt - Partial trace event with required name and ph
+ * @returns Complete TraceEvent with defaults applied
  */
-export const getInstantEvent = (opt: {
-  name: string;
-  ts?: number;
-  pid?: number;
-  tid?: number;
-  args?: InstantEventArgs;
-}): InstantEvent => ({
-  cat: 'blink.user_timing',
-  ph: 'i',
-  name: opt.name,
-  ...defaults(opt),
-  args: opt.args ?? {},
+const baseEvent = (
+  opt: Partial<TraceEvent> & { name: string; ph: string },
+): TraceEvent => ({
+  // Spread first so fields that are present but undefined (e.g. an omitted ts)
+  // still fall back to the defaults below instead of clobbering them.
+  ...opt,
+  cat: opt.cat ?? 'blink.user_timing',
+  pid: opt.pid ?? process.pid,
+  tid: opt.tid ?? threadId,
+  ts: opt.ts ?? defaultClock.epochNowUs(),
 });
 
+/**
+ * Creates an instant trace event for marking a point in time.
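+ *
+ * Minimal usage sketch (name and timestamp are illustrative):
+ * `instant('app:boot', 1_000_000)` yields
+ * `{ cat: 'blink.user_timing', ph: 'I', name: 'app:boot', ts: 1_000_000, pid, tid }`.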
+ * @param name - Event name
+ * @param ts - Optional timestamp in microseconds
+ * @param opt - Optional event configuration
+ * @returns Instant trace event (ph: 'I')
+ */
+export const instant = (
+  name: string,
+  ts?: number,
+  opt?: Partial<TraceEvent>,
+): TraceEvent => baseEvent({ name, ph: 'I', ts, ...opt });
+
+/**
+ * Creates a pair of begin and end span events.
+ * @param name - Span name
+ * @param tsB - Begin timestamp in microseconds
+ * @param tsE - End timestamp in microseconds
+ * @param opt - Optional event configuration
+ * @param opt.tsMarkerPadding - Padding to apply to timestamps (default: 1)
+ * @returns Array of [begin event, end event]
+ */
+export const span = (
+  name: string,
+  tsB: number,
+  tsE: number,
+  opt?: Partial<TraceEvent> & { tsMarkerPadding?: number },
+): TraceEvent[] => {
+  const id2 = opt?.id2 ?? nextId2();
+  // Padding shrinks the span slightly so overlaid markers align with the measure.
+  const pad = opt?.tsMarkerPadding ?? 1;
+  const { tsMarkerPadding, ...eventOpt } = opt ?? {};
+  const args = eventOpt.args ?? {};
+  return [
+    baseEvent({ name, ph: 'b', ts: tsB + pad, id2, ...eventOpt, args }),
+    baseEvent({ name, ph: 'e', ts: tsE - pad, id2, ...eventOpt, args }),
+  ];
+};
+
 /**
  * Creates a start tracing event with frame information.
  * This event is needed at the beginning of the traceEvents array to tell the UI that profiling has started and that it should visualize the data.
  * @param opt - Tracing configuration options
  * @returns StartTracingEvent object
  */
-export const getInstantEventTracingStartedInBrowser = (opt: {
-  url: string;
-  ts?: number;
-  pid?: number;
-  tid?: number;
-}): InstantEventTracingStartedInBrowser => {
-  const { pid, tid, ts } = defaults(opt);
-  const id = frameTreeNodeId(pid, tid);
+export const getInstantEventTracingStartedInBrowser = (
+  opt: TracingStartedInBrowserOptions,
+): TraceEvent => {
+  const pid = opt.pid ?? process.pid;
+  const tid = opt.tid ?? threadId;
+  const ts = opt.ts ?? defaultClock.epochNowUs();
 
   return {
     cat: 'devtools.timeline',
@@ -106,7 +118,7 @@ export const getInstantEventTracingStartedInBrowser = (opt: {
     ts,
     args: {
       data: {
-        frameTreeNodeId: id,
+        frameTreeNodeId: frameTreeNodeId(pid, tid),
         frames: [
           {
             frame: frameName(pid, tid),
@@ -118,157 +130,78 @@
           },
         ],
         persistentIds: true,
-      },
+      } satisfies Record<string, unknown>,
     },
   };
 };
 
 /**
  * Creates a complete trace event with duration.
- * @param opt - Event configuration with name and duration
- * @returns CompleteEvent object
- */
-export const getCompleteEvent = (opt: {
-  name: string;
-  dur: number;
-  ts?: number;
-  pid?: number;
-  tid?: number;
-}): CompleteEvent => ({
-  cat: 'devtools.timeline',
-  ph: 'X',
-  name: opt.name,
-  dur: opt.dur,
-  ...defaults(opt),
-  args: {},
-});
-
-/** Options for creating span events */
-type SpanOpt = {
-  name: string;
-  id2: { local: string };
-  ts?: number;
-  pid?: number;
-  tid?: number;
-  args?: SpanEventArgs;
-};
-
-/**
- * Creates a begin span event.
- * @param ph - Phase ('b' for begin)
- * @param opt - Span event options
- * @returns BeginEvent object
- */
-export function getSpanEvent(ph: 'b', opt: SpanOpt): BeginEvent;
-/**
- * Creates an end span event.
- * @param ph - Phase ('e' for end)
- * @param opt - Span event options
- * @returns EndEvent object
+ * @param name - Event name
+ * @param dur - Duration in microseconds
+ * @param opt - Optional event configuration
+ * @returns Complete trace event (ph: 'X')
  */
-export function getSpanEvent(ph: 'e', opt: SpanOpt): EndEvent;
-/**
- * Creates a span event (begin or end).
- * @param ph - Phase ('b' or 'e')
- * @param opt - Span event options
- * @returns SpanEvent object
- */
-export function getSpanEvent(ph: 'b' | 'e', opt: SpanOpt): SpanEvent {
-  return {
-    cat: 'blink.user_timing',
-    ph,
-    name: opt.name,
-    id2: opt.id2,
-    ...defaults(opt),
-    args: opt.args?.data?.detail
-      ? { data: { detail: opt.args.data.detail } }
-      : {},
-  };
-}
-
-/**
- * Creates a pair of begin and end span events.
- * @param opt - Span configuration with start/end timestamps
- * @returns Tuple of BeginEvent and EndEvent
- */
-export const getSpan = (opt: {
-  name: string;
-  tsB: number;
-  tsE: number;
-  id2?: { local: string };
-  pid?: number;
-  tid?: number;
-  args?: SpanEventArgs;
-  tsMarkerPadding?: number;
-}): [BeginEvent, EndEvent] => {
-  // tsMarkerPadding is here to make the measure slightly smaller so the markers align perfectly.
-  // Otherwise, the marker is visible at the start of the measure below the frame
-  //     No padding     Padding
-  // spans: ========    |======|
-  // marks: |              |
-  const pad = opt.tsMarkerPadding ?? 1;
-  // b|e need to share the same id2
-  const id2 = opt.id2 ?? nextId2();
-
-  return [
-    getSpanEvent('b', {
-      ...opt,
-      id2,
-      ts: opt.tsB + pad,
-    }),
-    getSpanEvent('e', {
-      ...opt,
-      id2,
-      ts: opt.tsE - pad,
-    }),
-  ];
-};
+export const complete = (
+  name: string,
+  dur: number,
+  opt?: Partial<TraceEvent>,
+): TraceEvent =>
+  baseEvent({
+    cat: 'devtools.timeline',
+    ph: 'X',
+    name,
+    dur,
+    args: {},
+    ...opt,
+  });
 
 /**
  * Converts a PerformanceMark to an instant trace event.
  * @param entry - Performance mark entry
  * @param opt - Optional overrides for name, pid, and tid
- * @returns InstantEvent object
+ * @returns Instant trace event
  */
 export const markToInstantEvent = (
   entry: PerformanceMark,
   opt?: { name?: string; pid?: number; tid?: number },
-): InstantEvent =>
-  getInstantEvent({
-    ...opt,
-    name: opt?.name ?? entry.name,
-    ts: defaultClock.fromEntry(entry),
-    args: entry.detail ? { detail: entry.detail } : undefined,
-  });
+): TraceEvent =>
+  instant(
+    opt?.name ?? entry.name,
+    defaultClock.fromEntry(entry),
+    entry.detail
+      ? { args: { data: { detail: entry.detail } }, ...opt }
+      : { args: {}, ...opt },
+  );
 
 /**
  * Converts a PerformanceMeasure to a pair of span events.
  * @param entry - Performance measure entry
  * @param opt - Optional overrides for name, pid, and tid
- * @returns Tuple of BeginEvent and EndEvent
+ * @returns Array of [begin event, end event]
  */
 export const measureToSpanEvents = (
   entry: PerformanceMeasure,
   opt?: { name?: string; pid?: number; tid?: number },
-): [BeginEvent, EndEvent] =>
-  getSpan({
-    ...opt,
-    name: opt?.name ?? entry.name,
-    tsB: defaultClock.fromEntry(entry),
-    tsE: defaultClock.fromEntry(entry, true),
-    args: entry.detail ? { data: { detail: entry.detail } } : undefined,
-  });
+): TraceEvent[] =>
+  span(
+    opt?.name ?? entry.name,
+    defaultClock.fromEntry(entry),
+    defaultClock.fromEntry(entry, true),
+    {
+      ...opt,
+      args: entry.detail ? { detail: entry.detail } : {},
+    },
+  );
 
 /**
- * Converts a PerformanceEntry to an array of UserTimingTraceEvents.
+ * Converts a PerformanceEntry to an array of trace events.
  * A mark is converted to an instant event, and a measure is converted to a pair of span events.
  * Other entry types are ignored.
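 * For example (illustrative): a `mark` entry becomes `[instantEvent]`, while a
 * `measure` entry becomes `[beginEvent, endEvent]` sharing one `id2`.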
 * @param entry - Performance entry
- * @returns UserTimingTraceEvent[]
+ * @returns Array of trace events
 */
-export function entryToTraceEvents(
-  entry: PerformanceEntry,
-): UserTimingTraceEvent[] {
+export function entryToTraceEvents(entry: PerformanceEntry): TraceEvent[] {
  if (entry.entryType === 'mark') {
    return [markToInstantEvent(entry as PerformanceMark)];
  }
@@ -278,6 +211,70 @@
  return [];
}

+/**
+ * Creates a mapper function for transforming detail properties in args.
+ * @param fn - Transformation function to apply to detail values
+ * @returns Function that maps args object
+ */
+const mapArgs = (fn: (v: unknown) => unknown) => (args?: TraceEvent['args']) =>
+  args && {
+    ...args,
+    ...(args.detail != null && { detail: fn(args.detail) }),
+    ...(args.data?.detail != null && {
+      data: { ...args.data, detail: fn(args.data.detail) },
+    }),
+  };
+
+/**
+ * Encodes a trace event by converting object details to JSON strings.
+ * @param e - Trace event with potentially object details
+ * @returns Trace event with string-encoded details
+ */
+export const encodeEvent = (e: TraceEvent): TraceEvent => {
+  const mappedArgs = mapArgs(d =>
+    typeof d === 'object' ? JSON.stringify(d) : d,
+  )(e.args);
+  return {
+    ...e,
+    ...(mappedArgs && { args: mappedArgs }),
+  };
+};
+
+/**
+ * Decodes a trace event by parsing JSON string details back to objects.
+ * @param e - Trace event with potentially string-encoded details
+ * @returns Trace event with decoded object details
+ */
+export const decodeEvent = (e: TraceEvent): TraceEvent => {
+  const mappedArgs = mapArgs(d => (typeof d === 'string' ? JSON.parse(d) : d))(
+    e.args,
+  );
+  return {
+    ...e,
+    ...(mappedArgs && { args: mappedArgs }),
+  };
+};
+
+/**
+ * Serializes a trace event to a JSON string for storage.
+ * First encodes the event structure (converting object details to JSON strings),
+ * then stringifies the entire event.
+ * @param event - Trace event to serialize
+ * @returns JSON string representation of the encoded trace event
+ */
+export const serializeTraceEvent = (event: TraceEvent): string =>
+  JSON.stringify(encodeEvent(event));
+
+/**
+ * Deserializes a JSON string back to a trace event.
+ * First parses the JSON string, then decodes the event structure
+ * (parsing JSON string details back to objects).
+ * @param json - JSON string representation of a trace event
+ * @returns Decoded trace event
+ */
+export const deserializeTraceEvent = (json: string): TraceEvent =>
+  decodeEvent(JSON.parse(json));
+
 /**
  * Creates trace metadata object with standard DevTools fields and custom metadata.
  * @param startDate - Optional start date for the trace, defaults to current date
@@ -287,7 +284,7 @@ export function entryToTraceEvents(
 */
 export function getTraceMetadata(
   startDate?: Date,
   metadata?: Record<string, unknown>,
-) {
+): TraceMetadata {
  return {
    source: 'DevTools',
    startTime: startDate?.toISOString() ?? new Date().toISOString(),
@@ -302,121 +299,15 @@
 * @param opt - Trace file configuration
 * @returns TraceEventContainer with events and metadata
 */
-export const getTraceFile = (opt: {
+export const createTraceFile = (opt: {
  traceEvents: TraceEvent[];
  startTime?: string;
  metadata?: Partial<TraceMetadata>;
}): TraceEventContainer => ({
-  traceEvents: opt.traceEvents,
+  traceEvents: opt.traceEvents.map(encodeEvent),
  displayTimeUnit: 'ms',
  metadata: getTraceMetadata(
    opt.startTime ?
new Date(opt.startTime) : new Date(), opt.metadata, ), }); - -/** - * Processes the detail property of an object using a custom processor function. - * @template T - Object type that may contain a detail property - * @param target - Object containing the detail property to process - * @param processor - Function to transform the detail value - * @returns New object with processed detail property, or original object if no detail - */ -function processDetail( - target: T, - processor: (detail: string | object) => string | object, -): T { - if ( - target.detail != null && - (typeof target.detail === 'string' || typeof target.detail === 'object') - ) { - return { ...target, detail: processor(target.detail) }; - } - return target; -} - -/** - * Decodes a JSON string detail property back to its original object form. - * @param target - Object containing a detail property as a JSON string - * @returns UserTimingDetail with the detail property parsed from JSON - */ -export function decodeDetail(target: { detail: string }): UserTimingDetail { - return processDetail(target, detail => - typeof detail === 'string' - ? (JSON.parse(detail) as string | object) - : detail, - ) as UserTimingDetail; -} - -/** - * Encodes object detail properties to JSON strings for storage/transmission. - * @param target - UserTimingDetail object with detail property to encode - * @returns UserTimingDetail with object details converted to JSON strings - */ -export function encodeDetail(target: UserTimingDetail): UserTimingDetail { - return processDetail( - target as UserTimingDetail & { detail?: unknown }, - (detail: string | object) => - typeof detail === 'object' - ? JSON.stringify(detail as UserTimingDetail) - : detail, - ) as UserTimingDetail; -} - -/** - * Decodes a raw trace event with JSON string details back to typed UserTimingTraceEvent. - * Parses detail properties from JSON strings to objects. - * @param event - Raw trace event with string-encoded details - * @returns UserTimingTraceEvent with parsed detail objects - */ -export function decodeTraceEvent({ - args, - ...rest -}: TraceEventRaw): UserTimingTraceEvent { - if (!args) { - return rest as UserTimingTraceEvent; - } - - const processedArgs = decodeDetail(args as { detail: string }); - if ('data' in args && args.data && typeof args.data === 'object') { - // eslint-disable-next-line @typescript-eslint/consistent-type-assertions - return { - ...rest, - args: { - ...processedArgs, - data: decodeDetail(args.data as { detail: string }), - }, - } as UserTimingTraceEvent; - } - // eslint-disable-next-line @typescript-eslint/consistent-type-assertions - return { ...rest, args: processedArgs } as UserTimingTraceEvent; -} - -/** - * Encodes a UserTimingTraceEvent to raw format with JSON string details. - * Converts object details to JSON strings for storage/transmission. 
- * @param event - UserTimingTraceEvent with object details - * @returns TraceEventRaw with string-encoded details - */ -export function encodeTraceEvent({ - args, - ...rest -}: UserTimingTraceEvent): TraceEventRaw { - if (!args) { - return rest as TraceEventRaw; - } - - const processedArgs = encodeDetail(args as UserTimingDetail); - if ('data' in args && args.data && typeof args.data === 'object') { - const result: TraceEventRaw = { - ...rest, - args: { - ...processedArgs, - data: encodeDetail(args.data as UserTimingDetail), - }, - }; - return result; - } - const result: TraceEventRaw = { ...rest, args: processedArgs }; - return result; -} diff --git a/packages/utils/src/lib/profiler/trace-file-utils.unit.test.ts b/packages/utils/src/lib/profiler/trace-file-utils.unit.test.ts index aa21887af..f7172e23c 100644 --- a/packages/utils/src/lib/profiler/trace-file-utils.unit.test.ts +++ b/packages/utils/src/lib/profiler/trace-file-utils.unit.test.ts @@ -1,27 +1,24 @@ import type { PerformanceMark, PerformanceMeasure } from 'node:perf_hooks'; import { - decodeDetail, - decodeTraceEvent, - encodeDetail, - encodeTraceEvent, + complete, + createTraceFile, + decodeEvent, + deserializeTraceEvent, + encodeEvent, entryToTraceEvents, - frameName, - frameTreeNodeId, - getCompleteEvent, - getInstantEvent, getInstantEventTracingStartedInBrowser, - getSpan, - getSpanEvent, - getTraceFile, getTraceMetadata, + instant, markToInstantEvent, measureToSpanEvents, nextId2, + serializeTraceEvent, + span, } from './trace-file-utils.js'; describe('getTraceFile', () => { it('should create trace file with empty events array', () => { - expect(getTraceFile({ traceEvents: [] })).toStrictEqual({ + expect(createTraceFile({ traceEvents: [] })).toStrictEqual({ traceEvents: [], displayTimeUnit: 'ms', metadata: { @@ -35,11 +32,9 @@ describe('getTraceFile', () => { it('should create trace file with events', () => { expect( - getTraceFile({ + createTraceFile({ traceEvents: [ - getInstantEvent({ - name: 'test-event', - ts: 1_234_567_890, + instant('test-event', 1_234_567_890, { pid: 123, tid: 456, }), @@ -65,7 +60,7 @@ describe('getTraceFile', () => { }); it('should use custom startTime when provided', () => { - const result = getTraceFile({ + const result = createTraceFile({ traceEvents: [], startTime: '2023-01-01T00:00:00.000Z', }); @@ -79,7 +74,7 @@ describe('getTraceFile', () => { }); it('should include hardware concurrency', () => { - expect(getTraceFile({ traceEvents: [] })).toHaveProperty( + expect(createTraceFile({ traceEvents: [] })).toHaveProperty( 'metadata', expect.objectContaining({ hardwareConcurrency: expect.any(Number), @@ -88,26 +83,6 @@ describe('getTraceFile', () => { }); }); -describe('frameTreeNodeId', () => { - it.each([ - [123, 456, 1_230_456], - [1, 2, 102], - [999, 999, 9_990_999], - ])('should generate correct frame tree node ID', (pid, tid, expected) => { - expect(frameTreeNodeId(pid, tid)).toBe(expected); - }); -}); - -describe('frameName', () => { - it.each([ - [123, 456], - [1, 2], - [999, 999], - ])('should generate correct frame name', (pid, tid) => { - expect(frameName(pid, tid)).toBe(`FRAME0P${pid}T${tid}`); - }); -}); - describe('getInstantEventTracingStartedInBrowser', () => { it('should create start tracing event with required url', () => { expect( @@ -172,14 +147,9 @@ describe('getInstantEventTracingStartedInBrowser', () => { }); }); -describe('getCompleteEvent', () => { +describe('complete', () => { it('should create complete event with required fields', () => { - expect( - getCompleteEvent({ 
- name: 'test-complete', - dur: 1000, - }), - ).toStrictEqual({ + expect(complete('test-complete', 1000)).toStrictEqual({ cat: 'devtools.timeline', ph: 'X', name: 'test-complete', @@ -193,9 +163,7 @@ describe('getCompleteEvent', () => { it('should use custom pid, tid, and ts', () => { expect( - getCompleteEvent({ - name: 'custom-complete', - dur: 500, + complete('custom-complete', 500, { pid: 111, tid: 222, ts: 1_234_567_890, @@ -223,12 +191,12 @@ describe('markToInstantEvent', () => { } as PerformanceMark), ).toStrictEqual({ cat: 'blink.user_timing', - ph: 'i', + ph: 'I', name: 'test-mark', pid: expect.any(Number), tid: expect.any(Number), ts: expect.any(Number), - args: { detail: { customData: 'test' } }, + args: { data: { detail: { customData: 'test' } } }, }); }); @@ -241,7 +209,7 @@ describe('markToInstantEvent', () => { } as PerformanceMark), ).toStrictEqual({ cat: 'blink.user_timing', - ph: 'i', + ph: 'I', name: 'test-mark', pid: expect.any(Number), tid: expect.any(Number), @@ -266,12 +234,12 @@ describe('markToInstantEvent', () => { ), ).toStrictEqual({ cat: 'blink.user_timing', - ph: 'i', + ph: 'I', name: 'custom-name', pid: 999, tid: 888, ts: expect.any(Number), - args: { detail: { customData: 'test' } }, + args: { data: { detail: { customData: 'test' } } }, }); }); }); @@ -294,7 +262,7 @@ describe('measureToSpanEvents', () => { tid: expect.any(Number), ts: expect.any(Number), id2: { local: expect.stringMatching(/^0x\d+$/) }, - args: { data: { detail: { measurement: 'data' } } }, + args: { detail: { measurement: 'data' } }, }, { cat: 'blink.user_timing', @@ -304,7 +272,7 @@ describe('measureToSpanEvents', () => { tid: expect.any(Number), ts: expect.any(Number), id2: { local: expect.stringMatching(/^0x\d+$/) }, - args: { data: { detail: { measurement: 'data' } } }, + args: { detail: { measurement: 'data' } }, }, ]); }); @@ -361,63 +329,21 @@ describe('measureToSpanEvents', () => { name: 'custom-measure', pid: 777, tid: 666, - args: { data: { detail: { measurement: 'data' } } }, + args: { detail: { measurement: 'data' } }, }), expect.objectContaining({ name: 'custom-measure', pid: 777, tid: 666, - args: { data: { detail: { measurement: 'data' } } }, + args: { detail: { measurement: 'data' } }, }), ]); }); }); -describe('getSpanEvent', () => { - it('should create begin event with args detail', () => { - expect( - getSpanEvent('b', { - name: 'test-span', - id2: { local: '0x1' }, - args: { data: { detail: { customData: 'test' } as any } }, - }), - ).toStrictEqual({ - cat: 'blink.user_timing', - ph: 'b', - name: 'test-span', - pid: expect.any(Number), - tid: expect.any(Number), - ts: expect.any(Number), - id2: { local: '0x1' }, - args: { data: { detail: { customData: 'test' } } }, - }); - }); - - it('should create end event without args detail', () => { - expect( - getSpanEvent('e', { - name: 'test-span', - id2: { local: '0x2' }, - }), - ).toStrictEqual({ - cat: 'blink.user_timing', - ph: 'e', - name: 'test-span', - pid: expect.any(Number), - tid: expect.any(Number), - ts: expect.any(Number), - id2: { local: '0x2' }, - args: {}, - }); - }); -}); - -describe('getSpan', () => { +describe('span', () => { it('should create span events with custom tsMarkerPadding', () => { - const result = getSpan({ - name: 'test-span', - tsB: 1000, - tsE: 1500, + const result = span('test-span', 1000, 1500, { tsMarkerPadding: 5, args: {}, }); @@ -447,23 +373,16 @@ describe('getSpan', () => { }); it('should generate id2 when not provided', () => { - const result = getSpan({ - name: 'test-span', - tsB: 1000, 
- tsE: 1500, - }); + const result = span('test-span', 1000, 1500); expect(result).toHaveLength(2); - expect(result[0].id2?.local).toMatch(/^0x\d+$/); - expect(result[1].id2).toEqual(result[0].id2); + expect(result.at(0)?.id2?.local).toMatch(/^0x\d+$/); + expect(result.at(1)?.id2).toEqual(result.at(0)?.id2); }); it('should use provided id2', () => { expect( - getSpan({ - name: 'test-span', - tsB: 1000, - tsE: 1500, + span('test-span', 1000, 1500, { id2: { local: 'custom-id' }, }), ).toStrictEqual([ @@ -529,12 +448,12 @@ describe('entryToTraceEvents', () => { expect(result).toHaveLength(1); expect(result[0]).toStrictEqual({ cat: 'blink.user_timing', - ph: 'i', + ph: 'I', name: 'test-mark', pid: expect.any(Number), tid: expect.any(Number), ts: expect.any(Number), - args: { detail: { customData: 'test' } }, + args: { data: { detail: { customData: 'test' } } }, }); }); @@ -559,7 +478,7 @@ describe('entryToTraceEvents', () => { tid: expect.any(Number), ts: expect.any(Number), id2: { local: expect.stringMatching(/^0x\d+$/) }, - args: { data: { detail: { measurement: 'data' } } }, + args: { detail: { measurement: 'data' } }, }); expect(result[1]).toStrictEqual({ cat: 'blink.user_timing', @@ -569,7 +488,7 @@ describe('entryToTraceEvents', () => { tid: expect.any(Number), ts: expect.any(Number), id2: { local: expect.stringMatching(/^0x\d+$/) }, - args: { data: { detail: { measurement: 'data' } } }, + args: { detail: { measurement: 'data' } }, }); }); @@ -621,86 +540,104 @@ describe('getTraceMetadata', () => { }); }); -describe('decodeDetail', () => { - it('should decode string detail back to object', () => { - const input = { detail: '{"key": "value"}' }; - const result = decodeDetail(input); +describe('decodeEvent', () => { + it('should decode trace event with string details', () => { + const encodedEvent = { + cat: 'blink.user_timing', + ph: 'i', + name: 'test-event', + pid: 123, + tid: 456, + ts: 1000, + args: { + detail: '{"custom": "data"}', + data: { detail: '{"nested": "value"}' }, + }, + }; + + const result = decodeEvent(encodedEvent); expect(result).toStrictEqual({ - detail: { key: 'value' }, + cat: 'blink.user_timing', + ph: 'i', + name: 'test-event', + pid: 123, + tid: 456, + ts: 1000, + args: { + detail: { custom: 'data' }, + data: { detail: { nested: 'value' } }, + }, }); }); - it('should return object detail unchanged', () => { - const input = { detail: { key: 'value' } }; - const result = decodeDetail(input); - - expect(result).toStrictEqual(input); - }); - - it('should return input unchanged when detail is not string or object', () => { - const input = { detail: 123 }; - const result = decodeDetail(input as any); - - expect(result).toStrictEqual(input); - }); - - it('should return input unchanged when no detail property', () => { - const input = { other: 'value' }; - const result = decodeDetail(input as any); - - expect(result).toStrictEqual(input); - }); -}); + it('should handle trace event without args', () => { + const encodedEvent = { + cat: 'blink.user_timing', + ph: 'i', + name: 'test-event', + pid: 123, + tid: 456, + ts: 1000, + }; -describe('encodeDetail', () => { - it('should encode object detail to JSON string', () => { - const input = { detail: { key: 'value' } }; - const result = encodeDetail(input); + const result = decodeEvent(encodedEvent); expect(result).toStrictEqual({ - detail: '{"key":"value"}', + cat: 'blink.user_timing', + ph: 'i', + name: 'test-event', + pid: 123, + tid: 456, + ts: 1000, }); }); - it('should return string detail unchanged', () => { - const input 
= { detail: 'already a string' }; - const result = encodeDetail(input); - - expect(result).toStrictEqual(input); - }); - - it('should return input unchanged when detail is not string or object', () => { - const input = { detail: 123 }; - const result = encodeDetail(input as any); - - expect(result).toStrictEqual(input); - }); + it('should handle args without data property', () => { + const encodedEvent = { + cat: 'blink.user_timing', + ph: 'i', + name: 'test-event', + pid: 123, + tid: 456, + ts: 1000, + args: { + detail: '{"custom": "data"}', + }, + }; - it('should return input unchanged when no detail property', () => { - const input = { other: 'value' }; - const result = encodeDetail(input as any); + const result = decodeEvent(encodedEvent); - expect(result).toStrictEqual(input); + expect(result).toStrictEqual({ + cat: 'blink.user_timing', + ph: 'i', + name: 'test-event', + pid: 123, + tid: 456, + ts: 1000, + args: { + detail: { custom: 'data' }, + }, + }); }); }); -describe('decodeTraceEvent', () => { - it('should decode trace event with string details', () => { - const rawEvent = { - cat: 'blink.user_timing' as const, - ph: 'i' as const, +describe('encodeEvent', () => { + it('should encode trace event with object details', () => { + const event = { + cat: 'blink.user_timing', + ph: 'i', name: 'test-event', pid: 123, tid: 456, ts: 1000, args: { - detail: '{"custom": "data"}', - data: { detail: '{"nested": "value"}' }, + detail: { custom: 'data' }, + data: { detail: { nested: 'value' } }, }, }; - const result = decodeTraceEvent(rawEvent); + const result = encodeEvent(event); expect(result).toStrictEqual({ cat: 'blink.user_timing', @@ -710,23 +647,23 @@ describe('decodeTraceEvent', () => { tid: 456, ts: 1000, args: { - detail: { custom: 'data' }, - data: { detail: { nested: 'value' } }, + detail: '{"custom":"data"}', + data: { detail: '{"nested":"value"}' }, }, }); }); it('should handle trace event without args', () => { - const rawEvent = { - cat: 'blink.user_timing' as const, - ph: 'i' as const, + const event = { + cat: 'blink.user_timing', + ph: 'i', name: 'test-event', pid: 123, tid: 456, ts: 1000, }; - const result = decodeTraceEvent(rawEvent); + const result = encodeEvent(event); expect(result).toStrictEqual({ cat: 'blink.user_timing', @@ -739,19 +676,19 @@ describe('decodeTraceEvent', () => { }); it('should handle args without data property', () => { - const rawEvent = { - cat: 'blink.user_timing' as const, - ph: 'i' as const, + const event = { + cat: 'blink.user_timing', + ph: 'i', name: 'test-event', pid: 123, tid: 456, ts: 1000, args: { - detail: '{"custom": "data"}', + detail: { custom: 'data' }, }, }; - const result = decodeTraceEvent(rawEvent); + const result = encodeEvent(event); expect(result).toStrictEqual({ cat: 'blink.user_timing', @@ -761,30 +698,32 @@ describe('decodeTraceEvent', () => { tid: 456, ts: 1000, args: { - detail: { custom: 'data' }, + detail: '{"custom":"data"}', }, }); }); }); -describe('encodeTraceEvent', () => { - it('should encode trace event with object details', () => { +describe('serializeTraceEvent', () => { + it('should serialize trace event to JSON string', () => { const event = { - cat: 'blink.user_timing' as const, - ph: 'i' as const, + cat: 'blink.user_timing', + ph: 'i', name: 'test-event', pid: 123, tid: 456, ts: 1000, args: { detail: { custom: 'data' }, - data: { detail: { nested: 'value' } }, }, }; - const result = encodeTraceEvent(event); + const result = serializeTraceEvent(event); - expect(result).toStrictEqual({ + expect(typeof 
result).toBe('string'); + expect(() => JSON.parse(result)).not.toThrowError(); + const parsed = JSON.parse(result); + expect(parsed).toStrictEqual({ cat: 'blink.user_timing', ph: 'i', name: 'test-event', @@ -793,24 +732,25 @@ describe('encodeTraceEvent', () => { ts: 1000, args: { detail: '{"custom":"data"}', - data: { detail: '{"nested":"value"}' }, }, }); }); it('should handle trace event without args', () => { const event = { - cat: 'blink.user_timing' as const, - ph: 'i' as const, + cat: 'blink.user_timing', + ph: 'i', name: 'test-event', pid: 123, tid: 456, ts: 1000, }; - const result = encodeTraceEvent(event); + const result = serializeTraceEvent(event); - expect(result).toStrictEqual({ + expect(typeof result).toBe('string'); + const parsed = JSON.parse(result); + expect(parsed).toStrictEqual({ cat: 'blink.user_timing', ph: 'i', name: 'test-event', @@ -820,22 +760,91 @@ describe('encodeTraceEvent', () => { }); }); - it('should handle args without data property', () => { + it('should handle nested object details in args', () => { const event = { - cat: 'blink.user_timing' as const, - ph: 'i' as const, + cat: 'blink.user_timing', + ph: 'i', name: 'test-event', pid: 123, tid: 456, ts: 1000, args: { detail: { custom: 'data' }, + data: { detail: { nested: 'value' } }, }, }; - const result = encodeTraceEvent(event); + const result = serializeTraceEvent(event); - expect(result).toStrictEqual({ + expect(typeof result).toBe('string'); + const parsed = JSON.parse(result); + expect(parsed.args).toStrictEqual({ + detail: '{"custom":"data"}', + data: { detail: '{"nested":"value"}' }, + }); + }); +}); + +describe('deserializeTraceEvent', () => { + it('should deserialize JSON string back to trace event', () => { + const originalEvent = { + cat: 'blink.user_timing', + ph: 'i', + name: 'test-event', + pid: 123, + tid: 456, + ts: 1000, + args: { + detail: { custom: 'data' }, + }, + }; + + const serialized = serializeTraceEvent(originalEvent); + const deserialized = deserializeTraceEvent(serialized); + + expect(deserialized).toStrictEqual(originalEvent); + }); + + it('should handle round-trip serialization', () => { + const originalEvent = { + cat: 'blink.user_timing', + ph: 'i', + name: 'round-trip-test', + pid: 789, + tid: 101, + ts: 987_654_321, + args: { + detail: { custom: 'data', nested: { value: 42 } }, + data: { detail: { nested: 'value' } }, + }, + }; + + const serialized = serializeTraceEvent(originalEvent); + const deserialized = deserializeTraceEvent(serialized); + const reSerialized = serializeTraceEvent(deserialized); + const reDeserialized = deserializeTraceEvent(reSerialized); + + expect(reDeserialized).toStrictEqual(originalEvent); + }); + + it('should handle trace event without args', () => { + const originalEvent = { + cat: 'blink.user_timing', + ph: 'i', + name: 'test-event', + pid: 123, + tid: 456, + ts: 1000, + }; + + const serialized = serializeTraceEvent(originalEvent); + const deserialized = deserializeTraceEvent(serialized); + + expect(deserialized).toStrictEqual(originalEvent); + }); + + it('should decode string-encoded details back to objects', () => { + const jsonString = JSON.stringify({ cat: 'blink.user_timing', ph: 'i', name: 'test-event', @@ -844,7 +853,15 @@ describe('encodeTraceEvent', () => { ts: 1000, args: { detail: '{"custom":"data"}', + data: { detail: '{"nested":"value"}' }, }, }); + + const deserialized = deserializeTraceEvent(jsonString); + + expect(deserialized.args).toStrictEqual({ + detail: { custom: 'data' }, + data: { detail: { nested: 'value' } }, + }); 
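+
+    // Round-trip sketch (illustrative): re-encoding restores the stored string form.
+    expect(encodeEvent(deserialized).args).toStrictEqual({
+      detail: '{"custom":"data"}',
+      data: { detail: '{"nested":"value"}' },
+    });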
}); }); diff --git a/packages/utils/src/lib/profiler/trace-file.type.ts b/packages/utils/src/lib/profiler/trace-file.type.ts index 839c06884..4af577841 100644 --- a/packages/utils/src/lib/profiler/trace-file.type.ts +++ b/packages/utils/src/lib/profiler/trace-file.type.ts @@ -1,212 +1,55 @@ -import type { UserTimingDetail } from '../user-timing-extensibility-api.type.js'; +import type { + MarkerPayload, + TrackEntryPayload, +} from '../user-timing-extensibility-api.type.js'; -/** - * Arguments for instant trace events. - * @property {UserTimingDetail} [detail] - Optional user timing detail with DevTools payload - */ -export type InstantEventArgs = { - detail?: UserTimingDetail; -} & { [key: string]: unknown }; +/** DevTools payload type for trace events. */ +export type DevToolsPayload = TrackEntryPayload | MarkerPayload; -/** - * Arguments for span trace events (begin/end events). - * @property {object} [data] - Optional data object - * @property {UserTimingDetail} [data.detail] - Optional user timing detail with DevTools payload - */ -export type SpanEventArgs = { - data?: { detail?: UserTimingDetail }; -} & { [key: string]: unknown }; - -/** - * Arguments for complete trace events. - * @property {Record} [detail] - Optional detail object with arbitrary properties - */ -export type CompleteEventArgs = { detail?: Record }; - -/** - * Arguments for start tracing events. - * @property {object} data - Tracing initialization data - * @property {number} data.frameTreeNodeId - Frame tree node identifier - * @property {Array} data.frames - Array of frame information - * @property {boolean} data.persistentIds - Whether IDs are persistent - */ -export type InstantEventTracingStartedInBrowserArgs = { - data: { - frameTreeNodeId: number; - frames: { - frame: string; - isInPrimaryMainFrame: boolean; - isOutermostMainFrame: boolean; - name: string; - processId: number; - url: string; - }[]; - persistentIds: boolean; - }; -}; - -/** - * Union type of all possible trace event arguments. - */ -export type TraceArgs = - | InstantEventArgs - | SpanEventArgs - | CompleteEventArgs - | InstantEventTracingStartedInBrowserArgs; - -/** - * Base properties shared by all trace events. - * @property {string} cat - Event category - * @property {string} name - Event name - * @property {number} pid - Process ID - * @property {number} tid - Thread ID - * @property {number} ts - Timestamp in epoch microseconds - * @property {TraceArgs} [args] - Optional event arguments - */ -export type BaseTraceEvent = { +/** Unified trace event type for Chrome DevTools trace format. */ +export type TraceEvent = { cat: string; + ph: string; name: string; pid: number; tid: number; ts: number; - args: TraceArgs; -}; - -/** - * Start tracing event for Chrome DevTools tracing. - */ -export type InstantEventTracingStartedInBrowser = BaseTraceEvent & { - cat: 'devtools.timeline'; - ph: 'i'; - name: 'TracingStartedInBrowser'; - args: InstantEventTracingStartedInBrowserArgs; -}; - -/** - * Complete trace event with duration. - * Represents a complete operation with start time and duration. - * @property {'X'} ph - Phase indicator for complete events - * @property {number} dur - Duration in microseconds - */ -export type CompleteEvent = BaseTraceEvent & { ph: 'X'; dur: number }; - -/** - * Instant trace event representing a single point in time. - * Used for user timing marks and other instantaneous events. 
- * @property {'blink.user_timing'} cat - Fixed category for user timing events - * @property {'i'} ph - Phase indicator for instant events - * @property {never} [dur] - Duration is not applicable for instant events - * @property {InstantEventArgs} [args] - Optional event arguments - */ -export type InstantEvent = Omit & { - cat: 'blink.user_timing'; - ph: 'i'; - dur?: never; - args: InstantEventArgs; -}; - -/** - * Core properties for span trace events (begin/end pairs). - * @property {object} id2 - Span identifier - * @property {string} id2.local - Local span ID (unique to the process, same for b and e events) - * @property {SpanEventArgs} [args] - Optional event arguments - */ -type SpanCore = Omit & { - id2: { local: string }; - args: SpanEventArgs; -}; -/** - * Begin event for a span (paired with an end event). - * @property {'b'} ph - Phase indicator for begin events - * @property {never} [dur] - Duration is not applicable for begin events - */ -export type BeginEvent = SpanCore & { - ph: 'b'; - dur?: never; + dur?: number; + id2?: { local: string }; + args?: { + detail?: unknown; + data?: { detail?: unknown; [key: string]: unknown }; + devtools?: DevToolsPayload; + [key: string]: unknown; + }; }; -/** - * End event for a span (paired with a begin event). - * @property {'e'} ph - Phase indicator for end events - * @property {never} [dur] - Duration is not applicable for end events - */ -export type EndEvent = SpanCore & { ph: 'e'; dur?: never }; - -/** - * Union type for span events (begin or end). - */ -export type SpanEvent = BeginEvent | EndEvent; - -/** - * Union type of all trace event types. - */ -export type UserTimingTraceEvent = InstantEvent | SpanEvent; +// ───────────────────────────────────────────────────────────── +// DevTools metadata and annotations +// ───────────────────────────────────────────────────────────── -/** - * All trace events including system events added during finalization. - */ -export type TraceEvent = - | UserTimingTraceEvent - | CompleteEvent - | InstantEventTracingStartedInBrowser; - -/** - * Raw arguments format for trace events before processing. - * Either contains a detail string directly or nested in a data object. - */ -type RawArgs = - | { detail?: string; [key: string]: unknown } - | { data?: { detail?: string }; [key: string]: unknown }; - -/** - * Raw trace event format before type conversion. - * Similar to TraceEvent but with unprocessed arguments. - */ -export type TraceEventRaw = Omit & { args: RawArgs }; - -/** - * Time window bounds (min, max) in trace time units (e.g. microseconds). - * @property {number} min - Minimum timestamp in the window - * @property {number} max - Maximum timestamp in the window - * @property {number} range - Calculated range (max - min) - */ +/** Time window bounds in trace time units. */ export type BreadcrumbWindow = { min: number; max: number; range: number; }; -/** - * Custom label for a specific trace entry. - * @property {number | string} entryId - ID or index of the trace entry - * @property {string} label - Label text for the entry - * @property {string} [color] - Optional display color for the label - */ +/** Custom label for a trace entry. */ export type EntryLabel = { entryId: number | string; label: string; color?: string; }; -/** - * Link or relation between two trace entries. 
- * @property {number | string} fromEntryId - Source entry ID for the link - * @property {number | string} toEntryId - Target entry ID for the link - * @property {string} [linkType] - Optional type or description of the link - */ +/** Link between two trace entries. */ export type EntryLink = { fromEntryId: number | string; toEntryId: number | string; linkType?: string; }; -/** - * A time range annotated with a label. - * @property {number} startTime - Start timestamp of the range (microseconds) - * @property {number} endTime - End timestamp of the range (microseconds) - * @property {string} label - Annotation label for the time range - * @property {string} [color] - Optional display color for the range - */ +/** Time range annotated with a label. */ export type LabelledTimeRange = { startTime: number; endTime: number; @@ -214,51 +57,33 @@ export type LabelledTimeRange = { color?: string; }; -/** - * Hidden or expandable entries information. - * @property {unknown[]} hiddenEntries - IDs or indexes of hidden entries - * @property {unknown[]} expandableEntries - IDs or indexes of expandable entries - */ +/** Hidden or expandable entries information. */ export type EntriesModifications = { hiddenEntries: unknown[]; expandableEntries: unknown[]; }; -/** - * Initial breadcrumb information for time ranges and window. - * @property {BreadcrumbWindow} window - Time window bounds - * @property {unknown | null} child - Child breadcrumb or null - */ +/** Initial breadcrumb information. */ export type InitialBreadcrumb = { window: BreadcrumbWindow; child: unknown | null; }; -/** - * Annotations such as labels and links between entries. - * @property {EntryLabel[]} entryLabels - Custom labels for entries - * @property {LabelledTimeRange[]} labelledTimeRanges - Time ranges annotated with labels - * @property {EntryLink[]} linksBetweenEntries - Links or relations between entries - */ +/** Annotations (labels, links, time ranges). */ export type Annotations = { entryLabels: EntryLabel[]; labelledTimeRanges: LabelledTimeRange[]; linksBetweenEntries: EntryLink[]; }; -/** - * Modifications made to trace data or UI in DevTools export - */ +/** Modifications made to trace data in DevTools export. */ export type Modifications = { entriesModifications: EntriesModifications; initialBreadcrumb: InitialBreadcrumb; annotations: Annotations; }; -/** - * Top-level metadata for a trace file exported by Chrome DevTools. - * DevTools may add new fields over time. - */ +/** Top-level metadata for Chrome DevTools trace files. */ export type TraceMetadata = { /** Usually "DevTools" for exports from the Performance panel */ source: string; @@ -274,23 +99,24 @@ export type TraceMetadata = { networkThrottling?: string; enhancedTraceVersion?: number; - /** Allow additional custom metadata properties */ + /** DevTools may add new fields over time */ [key: string]: unknown; }; -/** - * Structured container for trace events with metadata. - * @property {TraceEvent[]} traceEvents - Array of trace events - * @property {'ms' | 'ns'} [displayTimeUnit] - Time unit for display (milliseconds or nanoseconds) - * @property {TraceMetadata} [metadata] - Optional metadata about the trace - */ +/** Structured container for trace events with metadata. */ export type TraceEventContainer = { traceEvents: TraceEvent[]; displayTimeUnit?: 'ms' | 'ns'; metadata?: TraceMetadata; }; -/** - * Trace file format - either an array of events or a structured container. 
- */ -export type TraceFile = TraceEvent[] | TraceEventContainer; +/** Trace file format: a structured container of trace events. */ +export type TraceFile = TraceEventContainer; + +/** Options for creating a tracing started in browser event. */ +export type TracingStartedInBrowserOptions = { + url: string; + ts?: number; + pid?: number; + tid?: number; +}; diff --git a/packages/utils/src/lib/profiler/wal-json-trace.ts b/packages/utils/src/lib/profiler/wal-json-trace.ts index fcdfec4b1..e834ff755 100644 --- a/packages/utils/src/lib/profiler/wal-json-trace.ts +++ b/packages/utils/src/lib/profiler/wal-json-trace.ts @@ -1,13 +1,15 @@ import { defaultClock } from '../clock-epoch.js'; import type { InvalidEntry, WalFormat } from '../wal.js'; +import { PROFILER_OUT_BASENAME } from './constants.js'; import { - decodeTraceEvent, - encodeTraceEvent, - getCompleteEvent, + complete, + createTraceFile, + deserializeTraceEvent, + encodeEvent, getInstantEventTracingStartedInBrowser, - getTraceFile, + serializeTraceEvent, } from './trace-file-utils.js'; -import type { TraceEvent, UserTimingTraceEvent } from './trace-file.type.js'; +import type { TraceEvent } from './trace-file.type.js'; /** Name for the trace start margin event */ const TRACE_START_MARGIN_NAME = '[trace padding start]'; @@ -18,18 +20,11 @@ const TRACE_MARGIN_US = 1_000_000; /** Duration in microseconds for margin events (20ms = 20,000μs) */ const TRACE_MARGIN_DURATION_US = 20_000; -/** - * Generates a complete Chrome DevTools trace file content as JSON string. - * Adds margin events around the trace events and includes metadata. - * @param events - Array of user timing trace events to include - * @param metadata - Optional custom metadata to include in the trace file - * @returns JSON string representation of the complete trace file - */ export function generateTraceContent( - events: UserTimingTraceEvent[], + events: TraceEvent[], metadata?: Record<string, unknown>, ): string { - const traceContainer = getTraceFile({ + const traceContainer = createTraceFile({ traceEvents: events, startTime: new Date().toISOString(), metadata: { @@ -38,66 +33,58 @@ }, }); - const marginUs = TRACE_MARGIN_US; - const marginDurUs = TRACE_MARGIN_DURATION_US; - - const sortedEvents = [...events].sort((a, b) => a.ts - b.ts); const fallbackTs = defaultClock.epochNowUs(); - const firstTs: number = sortedEvents.at(0)?.ts ?? fallbackTs; - const lastTs: number = sortedEvents.at(-1)?.ts ?? fallbackTs; + const sortedEvents = + events.length > 0 ? [...events].sort((a, b) => a.ts - b.ts) : []; - const startTs = firstTs - marginUs; - const endTs = lastTs + marginUs; + const firstTs = sortedEvents.at(0)?.ts ?? fallbackTs; + const lastTs = sortedEvents.at(-1)?.ts ?? fallbackTs; - const traceEvents: TraceEvent[] = [ - getInstantEventTracingStartedInBrowser({ - ts: startTs, - url: events.length === 0 ? 'empty-trace' : 'generated-trace', - }), - getCompleteEvent({ - name: TRACE_START_MARGIN_NAME, - ts: startTs, - dur: marginDurUs, - }), - ...sortedEvents, - getCompleteEvent({ - name: TRACE_END_MARGIN_NAME, - ts: endTs, - dur: marginDurUs, - }), - ]; - - return JSON.stringify({ ...traceContainer, traceEvents }); + return JSON.stringify({ + ...traceContainer, + traceEvents: [ + getInstantEventTracingStartedInBrowser({ + ts: firstTs - TRACE_MARGIN_US, + url: events.length > 0 ?
'generated-trace' : 'empty-trace', + }), + complete(TRACE_START_MARGIN_NAME, TRACE_MARGIN_DURATION_US, { + ts: firstTs - TRACE_MARGIN_US, + }), + ...sortedEvents.map(encodeEvent), + complete(TRACE_END_MARGIN_NAME, TRACE_MARGIN_DURATION_US, { + ts: lastTs + TRACE_MARGIN_US, + }), + ], + }); } +/** + * Codec for encoding and decoding trace events. + * Encodes nested objects in args.detail and args.data.detail to JSON strings for storage. + */ +export const traceEventCodec = { + encode: serializeTraceEvent, + decode: deserializeTraceEvent, +}; + /** * Creates a WAL (Write-Ahead Logging) format configuration for Chrome DevTools trace files. * Automatically finalizes shards into complete trace files with proper metadata and margin events. * @returns WalFormat configuration object with baseName, codec, extensions, and finalizer */ export function traceEventWalFormat() { - const baseName = 'trace'; - const walExtension = '.jsonl'; - const finalExtension = '.json'; return { - baseName, - walExtension, - finalExtension, - codec: { - encode: (event: UserTimingTraceEvent) => - JSON.stringify(encodeTraceEvent(event)), - decode: (json: string) => - decodeTraceEvent(JSON.parse(json)) as UserTimingTraceEvent, - }, + baseName: PROFILER_OUT_BASENAME, + walExtension: '.jsonl', + finalExtension: '.json', + codec: traceEventCodec, finalizer: ( - records: (UserTimingTraceEvent | InvalidEntry)[], + records: (TraceEvent | InvalidEntry)[], metadata?: Record, - ) => { - const validRecords = records.filter( - (r): r is UserTimingTraceEvent => - !(typeof r === 'object' && r != null && '__invalid' in r), - ); - return generateTraceContent(validRecords, metadata); - }, - } satisfies WalFormat; + ) => + generateTraceContent( + records.filter((r): r is TraceEvent => !('__invalid' in (r as object))), + metadata, + ), + } satisfies WalFormat; } diff --git a/packages/utils/src/lib/profiler/wal-json-trace.unit.test.ts b/packages/utils/src/lib/profiler/wal-json-trace.unit.test.ts index 3f40293ec..0bfcc06f3 100644 --- a/packages/utils/src/lib/profiler/wal-json-trace.unit.test.ts +++ b/packages/utils/src/lib/profiler/wal-json-trace.unit.test.ts @@ -1,9 +1,226 @@ -import type { UserTimingTraceEvent } from './trace-file.type.js'; -import { generateTraceContent, traceEventWalFormat } from './wal-json-trace.js'; +import type { TraceEvent } from './trace-file.type.js'; +import { + generateTraceContent, + traceEventCodec, + traceEventWalFormat, +} from './wal-json-trace.js'; + +describe('traceEventCodec', () => { + // Memory representation: TraceEvent objects with nested objects in args.detail and args.data.detail + // This is the format we process and hold in memory + const instantEvent: TraceEvent = { + name: 'cp:test-event', + ph: 'I', + ts: 123_456_789, + pid: 123, + tid: 456, + cat: 'blink.user_timing', + args: { + detail: { + custom: 'data', + }, + data: { + detail: { + nested: 'value', + }, + }, + devtools: { + dataType: 'track-entry', + track: 'test-track', + color: 'primary', + tooltipText: 'Test event tooltip', + }, + }, + } satisfies TraceEvent; + + const spanBeginEvent: TraceEvent = { + name: 'cp:test-span', + ph: 'b', + ts: 200_000_000, + pid: 123, + tid: 456, + cat: 'blink.user_timing', + id2: { local: '0x1' }, + args: { + devtools: { + dataType: 'track-entry', + track: 'span-track', + color: 'secondary', + tooltipText: 'Test span begin', + }, + }, + } satisfies TraceEvent; + + const spanEndEvent: TraceEvent = { + name: 'cp:test-span', + ph: 'e', + ts: 250_000_000, + pid: 123, + tid: 456, + cat: 'blink.user_timing', + 
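+    // pairs with the begin event above via the shared id2.local span id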
id2: { local: '0x1' }, + args: { + devtools: { + dataType: 'track-entry', + track: 'span-track', + color: 'secondary', + tooltipText: 'Test span end', + }, + }, + } satisfies TraceEvent; + + // Encoded JSON string representation: nested objects in args.detail and args.data.detail are JSON strings + // This is the format stored in WAL files (.jsonl) + const instantEventJsonString = JSON.stringify({ + name: 'cp:test-event', + ph: 'I', + ts: 123_456_789, + pid: 123, + tid: 456, + cat: 'blink.user_timing', + args: { + detail: JSON.stringify({ custom: 'data' }), + data: { + detail: JSON.stringify({ nested: 'value' }), + }, + devtools: { + dataType: 'track-entry', + track: 'test-track', + color: 'primary', + tooltipText: 'Test event tooltip', + }, + }, + }); + + const spanBeginEventJsonString = JSON.stringify({ + name: 'cp:test-span', + ph: 'b', + ts: 200_000_000, + pid: 123, + tid: 456, + cat: 'blink.user_timing', + id2: { local: '0x1' }, + args: { + devtools: { + dataType: 'track-entry', + track: 'span-track', + color: 'secondary', + tooltipText: 'Test span begin', + }, + }, + }); + + const spanEndEventJsonString = JSON.stringify({ + name: 'cp:test-span', + ph: 'e', + ts: 250_000_000, + pid: 123, + tid: 456, + cat: 'blink.user_timing', + id2: { local: '0x1' }, + args: { + devtools: { + dataType: 'track-entry', + track: 'span-track', + color: 'secondary', + tooltipText: 'Test span end', + }, + }, + }); + + describe('decode direction (JSON string → memory object)', () => { + it('should decode instant event from JSON string', () => { + const decoded = traceEventCodec.decode(instantEventJsonString); + expect(decoded).toStrictEqual(instantEvent); + }); + + it('should decode span begin event from JSON string', () => { + const decoded = traceEventCodec.decode(spanBeginEventJsonString); + expect(decoded).toStrictEqual(spanBeginEvent); + }); + + it('should decode span end event from JSON string', () => { + const decoded = traceEventCodec.decode(spanEndEventJsonString); + expect(decoded).toStrictEqual(spanEndEvent); + }); + + it('should decode events with nested detail objects correctly', () => { + const decoded = traceEventCodec.decode(instantEventJsonString); + expect(decoded.args?.detail).toStrictEqual({ custom: 'data' }); + expect(decoded.args?.data?.detail).toStrictEqual({ nested: 'value' }); + }); + }); + + describe('encode direction (memory object → JSON string)', () => { + it('should encode instant event to JSON string', () => { + const encoded = traceEventCodec.encode(instantEvent); + expect(typeof encoded).toBe('string'); + const parsed = JSON.parse(encoded); + expect(parsed.args.detail).toBe(JSON.stringify({ custom: 'data' })); + expect(parsed.args.data.detail).toBe(JSON.stringify({ nested: 'value' })); + }); + + it('should encode span begin event to JSON string', () => { + const encoded = traceEventCodec.encode(spanBeginEvent); + expect(typeof encoded).toBe('string'); + const decoded = traceEventCodec.decode(encoded); + expect(decoded).toStrictEqual(spanBeginEvent); + }); + + it('should encode span end event to JSON string', () => { + const encoded = traceEventCodec.encode(spanEndEvent); + expect(typeof encoded).toBe('string'); + const decoded = traceEventCodec.decode(encoded); + expect(decoded).toStrictEqual(spanEndEvent); + }); + + it('should encode nested detail objects as JSON strings', () => { + const encoded = traceEventCodec.encode(instantEvent); + const parsed = JSON.parse(encoded); + expect(typeof parsed.args.detail).toBe('string'); + expect(typeof 
parsed.args.data.detail).toBe('string'); + expect(JSON.parse(parsed.args.detail)).toStrictEqual({ custom: 'data' }); + expect(JSON.parse(parsed.args.data.detail)).toStrictEqual({ + nested: 'value', + }); + }); + }); + + describe('round-trip (memory → string → memory)', () => { + it('should maintain consistency for instant event', () => { + const encoded = traceEventCodec.encode(instantEvent); + const decoded = traceEventCodec.decode(encoded); + expect(decoded).toStrictEqual(instantEvent); + }); + + it('should maintain consistency for span begin event', () => { + const encoded = traceEventCodec.encode(spanBeginEvent); + const decoded = traceEventCodec.decode(encoded); + expect(decoded).toStrictEqual(spanBeginEvent); + }); + + it('should maintain consistency for span end event', () => { + const encoded = traceEventCodec.encode(spanEndEvent); + const decoded = traceEventCodec.decode(encoded); + expect(decoded).toStrictEqual(spanEndEvent); + }); + + it('should handle multiple round-trips correctly', () => { + let current = instantEvent; + // eslint-disable-next-line functional/no-loop-statements + for (let i = 0; i < 3; i++) { + const encoded = traceEventCodec.encode(current); + const decoded = traceEventCodec.decode(encoded); + expect(decoded).toStrictEqual(instantEvent); + current = decoded; + } + }); + }); +}); describe('generateTraceContent', () => { it('should generate trace content for empty events array', () => { - const events: UserTimingTraceEvent[] = []; + const events: TraceEvent[] = []; const metadata = { version: '1.0.0', generatedAt: '2024-01-01T00:00:00Z' }; const result = generateTraceContent(events, metadata); @@ -51,10 +268,10 @@ describe('generateTraceContent', () => { }); it('should generate trace content for non-empty events array', () => { - const events: UserTimingTraceEvent[] = [ + const events: TraceEvent[] = [ { name: 'cp:test-operation:start', - ph: 'i', + ph: 'I', ts: 1000, pid: 123, tid: 456, @@ -67,7 +284,7 @@ describe('generateTraceContent', () => { }, { name: 'cp:test-operation:end', - ph: 'i', + ph: 'I', ts: 2000, pid: 123, tid: 456, @@ -125,10 +342,10 @@ describe('generateTraceContent', () => { }); it('should sort events by timestamp', () => { - const events: UserTimingTraceEvent[] = [ + const events: TraceEvent[] = [ { name: 'cp:second-operation', - ph: 'i', + ph: 'I', ts: 2000, pid: 123, tid: 456, @@ -137,7 +354,7 @@ describe('generateTraceContent', () => { }, { name: 'cp:first-operation', - ph: 'i', + ph: 'I', ts: 1000, pid: 123, tid: 456, @@ -158,10 +375,10 @@ describe('generateTraceContent', () => { }); it('should handle single event with proper margin calculation', () => { - const events: UserTimingTraceEvent[] = [ + const events: TraceEvent[] = [ { name: 'cp:single-event', - ph: 'i', + ph: 'I', ts: 5000, pid: 123, tid: 456, @@ -240,9 +457,9 @@ describe('traceEventWalFormat', () => { it('should encode and decode trace events correctly', () => { const format = traceEventWalFormat(); - const testEvent: UserTimingTraceEvent = { + const testEvent: TraceEvent = { name: 'cp:test-event', - ph: 'i', + ph: 'I', ts: 123_456_789, pid: 123, tid: 456, @@ -260,12 +477,38 @@ describe('traceEventWalFormat', () => { expect(decoded).toStrictEqual(testEvent); }); + it('should maintain consistency through decode -> encode -> decode round-trip', () => { + const format = traceEventWalFormat(); + const originalEvent: TraceEvent = { + name: 'cp:round-trip-test', + ph: 'I', + ts: 987_654_321, + pid: 789, + tid: 101, + cat: 'blink.user_timing', + args: { + dataType: 
'track-entry', + track: 'Round Trip Track', + trackGroup: 'Test Group', + customField: 'custom value', + }, + }; + + const initialEncoded = format.codec.encode(originalEvent); + const firstDecoded = format.codec.decode(initialEncoded); + const secondEncoded = format.codec.encode(firstDecoded); + const secondDecoded = format.codec.decode(secondEncoded); + + expect(secondDecoded).toStrictEqual(firstDecoded); + expect(secondDecoded).toStrictEqual(originalEvent); + }); + it('should finalize records into trace content', () => { const format = traceEventWalFormat(); - const records: UserTimingTraceEvent[] = [ + const records: TraceEvent[] = [ { name: 'cp:operation:start', - ph: 'i', + ph: 'I', ts: 1000, pid: 123, tid: 456, @@ -285,7 +528,7 @@ describe('traceEventWalFormat', () => { it('should include generatedAt in finalizer metadata', () => { const format = traceEventWalFormat(); - const records: UserTimingTraceEvent[] = []; + const records: TraceEvent[] = []; const result = format.finalizer(records); const parsed = JSON.parse(result); diff --git a/packages/utils/src/lib/reports/load-report.unit.test.ts b/packages/utils/src/lib/reports/load-report.unit.test.ts index acafeb61c..747b6721b 100644 --- a/packages/utils/src/lib/reports/load-report.unit.test.ts +++ b/packages/utils/src/lib/reports/load-report.unit.test.ts @@ -59,6 +59,6 @@ describe('loadReport', () => { filename: 'report', format: 'json', }), - ).rejects.toThrow('slug has to follow the pattern'); + ).rejects.toThrowError('slug has to follow the pattern'); }); }); diff --git a/packages/utils/src/lib/reports/scoring.unit.test.ts b/packages/utils/src/lib/reports/scoring.unit.test.ts index 0533492e0..fc4926238 100644 --- a/packages/utils/src/lib/reports/scoring.unit.test.ts +++ b/packages/utils/src/lib/reports/scoring.unit.test.ts @@ -60,7 +60,7 @@ describe('calculateScore', () => { it('should throw for an empty reference array', () => { expect(() => calculateScore<{ weight: number }>([], ref => ref.weight), - ).toThrow('Reference array cannot be empty.'); + ).toThrowError('Reference array cannot be empty.'); }); it('should throw negative weight', () => { @@ -69,7 +69,7 @@ describe('calculateScore', () => { [{ slug: 'first-contentful-paint', weight: -1, score: 0.5 }], ref => ref.score, ), - ).toThrow('Weight cannot be negative.'); + ).toThrowError('Weight cannot be negative.'); }); it('should throw for a reference array full of zero weights', () => { @@ -81,7 +81,7 @@ describe('calculateScore', () => { ], ref => ref.score, ), - ).toThrow('All references cannot have zero weight.'); + ).toThrowError('All references cannot have zero weight.'); }); it('should throw for a negative score', () => { @@ -90,7 +90,7 @@ describe('calculateScore', () => { [{ slug: 'first-contentful-paint', weight: 1, score: -0.8 }], ref => ref.score, ), - ).toThrow('All scores must be in range 0-1.'); + ).toThrowError('All scores must be in range 0-1.'); }); it('should throw for score above 1', () => { @@ -99,7 +99,7 @@ describe('calculateScore', () => { [{ slug: 'first-contentful-paint', weight: 1, score: 2 }], ref => ref.score, ), - ).toThrow('All scores must be in range 0-1.'); + ).toThrowError('All scores must be in range 0-1.'); }); }); diff --git a/packages/utils/src/lib/reports/sorting.unit.test.ts b/packages/utils/src/lib/reports/sorting.unit.test.ts index 9491a63b3..0f9725316 100644 --- a/packages/utils/src/lib/reports/sorting.unit.test.ts +++ b/packages/utils/src/lib/reports/sorting.unit.test.ts @@ -72,7 +72,7 @@ describe('getSortableAuditByRef', () => { }, 
], ), - ).toThrow('Audit pancake-coverage is not present in coverage'); + ).toThrowError('Audit pancake-coverage is not present in coverage'); }); }); @@ -174,7 +174,7 @@ describe('getSortableGroupByRef', () => { }, ], ), - ).toThrow('Group test-coverage is not present in coverage'); + ).toThrowError('Group test-coverage is not present in coverage'); }); }); diff --git a/packages/utils/src/lib/text-formats/table.unit.test.ts b/packages/utils/src/lib/text-formats/table.unit.test.ts index 308f52791..e95e4279b 100644 --- a/packages/utils/src/lib/text-formats/table.unit.test.ts +++ b/packages/utils/src/lib/text-formats/table.unit.test.ts @@ -14,7 +14,7 @@ describe('rowToStringArray', () => { columns: [{ key: 'prop' }], rows: [[1, 2, 3]], } as unknown as Table), - ).toThrow('Column can`t be object when rows are primitive values'); + ).toThrowError('Column can`t be object when rows are primitive values'); }); it('should transform row of primitive values row to a string array', () => { @@ -208,6 +208,6 @@ describe('getColumnAlignments', () => { it('throws for a undefined row', () => { expect(() => getColumnAlignments({ rows: [undefined as unknown as TableRowObject] }), - ).toThrow('first row can`t be undefined.'); + ).toThrowError('first row can`t be undefined.'); }); }); diff --git a/packages/utils/src/lib/transform.unit.test.ts b/packages/utils/src/lib/transform.unit.test.ts index 4dd262fbc..e4c2f58ef 100644 --- a/packages/utils/src/lib/transform.unit.test.ts +++ b/packages/utils/src/lib/transform.unit.test.ts @@ -233,7 +233,7 @@ describe('objectToCliArgs', () => { it('should throw error for unsupported type', () => { const params = { unsupported: Symbol('test') as any }; - expect(() => objectToCliArgs(params)).toThrow('Unsupported type'); + expect(() => objectToCliArgs(params)).toThrowError('Unsupported type'); }); }); diff --git a/packages/utils/src/lib/user-timing-extensibility-api-utils.ts b/packages/utils/src/lib/user-timing-extensibility-api-utils.ts index fedae9fa3..2eca4f3bf 100644 --- a/packages/utils/src/lib/user-timing-extensibility-api-utils.ts +++ b/packages/utils/src/lib/user-timing-extensibility-api-utils.ts @@ -332,7 +332,7 @@ export function mergeDevtoolsPayload< } export type ActionTrackConfigs = Record< T, - ActionTrackEntryPayload + Omit >; /** * Sets up tracks with default values merged into each track. 
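The new `wal-sharded` module introduced below coordinates many per-writer WAL files ("shards") and merges them on finalize. As orientation before the tests, here is a minimal sketch of the flow they verify; the directory, base name, group ID, and env var name are illustrative, and real callers use `PROFILER_SHARDER_ID_ENV_VAR` from `profiler/constants.js`:

```ts
import { ShardedWal } from './wal-sharded.js';
import { parseWalFormat } from './wal.js';

// Illustrative configuration; only the methods shown are part of the API.
const wal = new ShardedWal({
  dir: 'tmp/demo',
  format: parseWalFormat({
    baseName: 'demo',
    finalizer: records => `${JSON.stringify(records)}\n`,
  }),
  coordinatorIdEnvVar: 'CP_SHARDED_WAL_COORDINATOR_ID',
  groupId: 'demo-group',
});

const shard = wal.shard(); // one WriteAheadLogFile per writer
shard.open();
shard.append('record-1'); // one codec-encoded line per record
shard.close();

wal.finalizeIfCoordinator(); // coordinator merges all shards into the final file
wal.cleanupIfCoordinator(); // coordinator removes the shard files it created
```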
diff --git a/packages/utils/src/lib/wal-sharded.int.test.ts b/packages/utils/src/lib/wal-sharded.int.test.ts new file mode 100644 index 000000000..18e811072 --- /dev/null +++ b/packages/utils/src/lib/wal-sharded.int.test.ts @@ -0,0 +1,254 @@ +import fs from 'node:fs'; +import path from 'node:path'; +import { afterEach, beforeEach, describe, expect, it } from 'vitest'; +import { PROFILER_SHARDER_ID_ENV_VAR } from './profiler/constants.js'; +import { ShardedWal } from './wal-sharded.js'; +import { + type WalFormat, + type WalRecord, + stringCodec, +} from './wal.js'; + +describe('ShardedWal Integration', () => { + const testDir = path.join( + process.cwd(), + 'tmp', + 'int', + 'utils', + 'wal-sharded', + ); + const makeMockFormat = ( + overrides: Partial<WalFormat<WalRecord>>, + ): WalFormat<WalRecord> => { + const { + baseName = 'wal', + walExtension = '.log', + finalExtension = '.json', + codec = stringCodec(), + finalizer = records => `${JSON.stringify(records)}\n`, + } = overrides; + + return { + baseName, + walExtension, + finalExtension, + codec, + finalizer, + }; + }; + let shardedWal: ShardedWal; + + beforeEach(() => { + if (fs.existsSync(testDir)) { + fs.rmSync(testDir, { recursive: true, force: true }); + } + fs.mkdirSync(testDir, { recursive: true }); + }); + + afterEach(() => { + if (shardedWal) { + shardedWal.cleanupIfCoordinator(); + } + if (fs.existsSync(testDir)) { + fs.rmSync(testDir, { recursive: true, force: true }); + } + }); + + it('should create and finalize shards correctly', () => { + shardedWal = new ShardedWal({ + debug: false, + dir: testDir, + format: makeMockFormat({ + baseName: 'trace', + }), + coordinatorIdEnvVar: PROFILER_SHARDER_ID_ENV_VAR, + groupId: 'create-finalize', + }); + + const shard1 = shardedWal.shard(); + shard1.open(); + shard1.append('record1'); + shard1.append('record2'); + shard1.close(); + + const shard2 = shardedWal.shard(); + shard2.open(); + shard2.append('record3'); + shard2.close(); + + shardedWal.finalize(); + + const finalFile = path.join( + testDir, + shardedWal.groupId, + `trace.create-finalize.json`, + ); + expect(fs.existsSync(finalFile)).toBeTrue(); + + const content = fs.readFileSync(finalFile, 'utf8'); + const records = JSON.parse(content.trim()); + expect(records).toEqual(['record1', 'record2', 'record3']); + }); + + it('should merge multiple shards correctly', () => { + shardedWal = new ShardedWal({ + debug: false, + dir: testDir, + format: makeMockFormat({ + baseName: 'merged', + }), + coordinatorIdEnvVar: PROFILER_SHARDER_ID_ENV_VAR, + groupId: 'merge-shards', + }); + + // eslint-disable-next-line functional/no-loop-statements + for (let i = 1; i <= 5; i++) { + const shard = shardedWal.shard(); + shard.open(); + shard.append(`record-from-shard-${i}`); + shard.close(); + } + + shardedWal.finalize(); + + const finalFile = path.join( + testDir, + shardedWal.groupId, + `merged.merge-shards.json`, + ); + const content = fs.readFileSync(finalFile, 'utf8'); + const records = JSON.parse(content.trim()); + expect(records).toHaveLength(5); + expect(records[0]).toBe('record-from-shard-1'); + expect(records[4]).toBe('record-from-shard-5'); + }); + + it('should handle invalid entries when debug is true', () => { + shardedWal = new ShardedWal({ + debug: true, + dir: testDir, + format: makeMockFormat({ + baseName: 'test', + }), + coordinatorIdEnvVar: PROFILER_SHARDER_ID_ENV_VAR, + groupId: 'invalid-entries', + }); + + const shard = shardedWal.shard(); + shard.open(); + shard.append('valid1'); + shard.append('invalid'); + shard.append('valid2'); + shard.close(); +
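+    // 'invalid' is an ordinary string for stringCodec, so it round-trips like any other record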
shardedWal.finalize(); + expect(shardedWal.stats.lastRecover).toStrictEqual([]); + + const finalFile = path.join( + testDir, + shardedWal.groupId, + `test.invalid-entries.json`, + ); + const content = fs.readFileSync(finalFile, 'utf8'); + const records = JSON.parse(content.trim()); + expect(records).toEqual(['valid1', 'invalid', 'valid2']); + }); + + it('should cleanup shard files after finalization', () => { + shardedWal = new ShardedWal({ + debug: false, + dir: testDir, + format: makeMockFormat({ + baseName: 'cleanup-test', + }), + coordinatorIdEnvVar: PROFILER_SHARDER_ID_ENV_VAR, + groupId: 'cleanup-test', + }); + + const shard1 = shardedWal.shard(); + shard1.open(); + shard1.append('record1'); + shard1.close(); + + const shard2 = shardedWal.shard(); + shard2.open(); + shard2.append('record2'); + shard2.close(); + + shardedWal.finalize(); + + const finalFile = path.join( + testDir, + shardedWal.groupId, + `cleanup-test.cleanup-test.json`, + ); + expect(fs.existsSync(finalFile)).toBeTrue(); + + shardedWal.cleanupIfCoordinator(); + + const groupDir = path.join(testDir, shardedWal.groupId); + const files = fs.readdirSync(groupDir); + expect(files).not.toContain(expect.stringMatching(/cleanup-test.*\.log$/)); + expect(files).toContain(`cleanup-test.cleanup-test.json`); + }); + + it('should use custom options in finalizer', () => { + shardedWal = new ShardedWal({ + debug: false, + dir: testDir, + format: makeMockFormat({ + baseName: 'custom', + finalizer: (records, opt) => + `${JSON.stringify({ records, metadata: opt })}\n`, + }), + coordinatorIdEnvVar: PROFILER_SHARDER_ID_ENV_VAR, + groupId: 'custom-finalizer', + }); + + const shard = shardedWal.shard(); + shard.open(); + shard.append('record1'); + shard.close(); + + shardedWal.finalize({ version: '2.0', timestamp: Date.now() }); + + const finalFile = path.join( + testDir, + shardedWal.groupId, + `custom.custom-finalizer.json`, + ); + const content = fs.readFileSync(finalFile, 'utf8'); + const result = JSON.parse(content.trim()); + expect(result.records).toEqual(['record1']); + expect(result.metadata).toEqual({ + version: '2.0', + timestamp: expect.any(Number), + }); + }); + + it('should handle empty shards correctly', () => { + shardedWal = new ShardedWal({ + debug: false, + dir: testDir, + format: makeMockFormat({ + baseName: 'empty', + }), + coordinatorIdEnvVar: PROFILER_SHARDER_ID_ENV_VAR, + groupId: 'empty-shards', + }); + + const groupDir = path.join(testDir, shardedWal.groupId); + fs.mkdirSync(groupDir, { recursive: true }); + + shardedWal.finalize(); + + const finalFile = path.join( + testDir, + shardedWal.groupId, + `empty.${shardedWal.groupId}.json`, + ); + expect(fs.existsSync(finalFile)).toBeTrue(); + const content = fs.readFileSync(finalFile, 'utf8'); + expect(content.trim()).toBe('[]'); + }); +}); diff --git a/packages/utils/src/lib/wal-sharded.ts b/packages/utils/src/lib/wal-sharded.ts new file mode 100644 index 000000000..113b03d66 --- /dev/null +++ b/packages/utils/src/lib/wal-sharded.ts @@ -0,0 +1,407 @@ +import * as fs from 'node:fs'; +import path from 'node:path'; +import process from 'node:process'; +import { threadId } from 'node:worker_threads'; +import { extendError } from './errors.js'; +import { + type Counter, + getUniqueInstanceId, + getUniqueTimeId, +} from './process-id.js'; +import { + type InvalidEntry, + type RecoverResult, + type WalFormat, + type WalRecord, + WriteAheadLogFile, + filterValidRecords, +} from './wal.js'; + +/** + * NOTE: this helper is only used in this file. 
The rest of the repo avoids sync methods, so this helper stays private rather than being exported for reuse. + * Ensures a directory exists, creating it recursively if necessary using sync methods. + * @param dirPath - The directory path to ensure exists + */ +function ensureDirectoryExistsSync(dirPath: string): void { + if (!fs.existsSync(dirPath)) { + fs.mkdirSync(dirPath, { recursive: true }); + } +} + +// eslint-disable-next-line functional/no-let +let shardCount = 0; + +/** + * Counter for generating sequential shard IDs. + * Encapsulates the shard count increment logic. + */ +export const ShardedWalCounter: Counter = { + next() { + return ++shardCount; + }, +}; + +/** + * Generates a unique readable instance ID. + * This ID uniquely identifies a shard/file per process/thread combination with a human-readable timestamp. + * Format: readable-timestamp.pid.threadId.counter + * Example: "20240101-120000-000.12345.1.1" + * + * @returns A unique ID string with readable timestamp, process ID, thread ID, and counter + */ +export function getShardId(): string { + return `${getUniqueTimeId()}.${process.pid}.${threadId}.${ShardedWalCounter.next()}`; +} + +/** + * Sharded Write-Ahead Log manager for coordinating multiple WAL shards. + * Handles distributed logging across multiple processes/files with atomic finalization. + */ +export class ShardedWal { + static instanceCount = 0; + + readonly #id: string = getUniqueInstanceId({ + next() { + return ++ShardedWal.instanceCount; + }, + }); + readonly groupId = getUniqueTimeId(); + readonly #debug: boolean = false; + readonly #format: WalFormat; + readonly #dir: string = process.cwd(); + readonly #coordinatorIdEnvVar: string; + #state: 'active' | 'finalized' | 'cleaned' = 'active'; + #lastRecovery: { + file: string; + result: RecoverResult>; + }[] = []; + #createdShardFiles: string[] = []; + + /** + * Initialize the coordinator ID environment variable if not already set. + * This must be done as early as possible before any user code runs. + * Sets envVarName to the given profiler ID if not already defined. + * + * @param envVarName - Environment variable name for storing coordinator ID + * @param profilerID - The profiler ID to set as coordinator + */ + static setCoordinatorProcess(envVarName: string, profilerID: string): void { + if (!process.env[envVarName]) { + process.env[envVarName] = profilerID; + } + } + + /** + * Determines if this process is the leader WAL process using the coordinator ID heuristic. + * + * The leader is the process that first enabled profiling (the one that first set the coordinator ID env var). + * All descendant processes inherit the environment but have different PIDs. + * + * @param envVarName - Environment variable name for storing coordinator ID + * @param profilerID - The profiler ID to check + * @returns true if this is the leader WAL process, false otherwise + */ + static isCoordinatorProcess(envVarName: string, profilerID: string): boolean { + return process.env[envVarName] === profilerID; + } + + /** + * Create a sharded WAL manager.
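+   * Shards are written one per writer (process/thread) and merged into a single file by the coordinator at finalize time.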
+ * + * @param opt.dir - Base directory to store shard files (defaults to process.cwd()) + * @param opt.format - WAL format configuration + * @param opt.groupId - Group ID for sharding (defaults to generated group ID) + * @param opt.coordinatorIdEnvVar - Environment variable name for storing coordinator ID (defaults to CP_SHARDED_WAL_COORDINATOR_ID) + * @param opt.autoCoordinator - Whether to auto-set the coordinator ID on construction (defaults to true) + * @param opt.measureNameEnvVar - Environment variable name for coordinating groupId across processes (optional) + */ + constructor(opt: { + debug?: boolean; + dir?: string; + format: WalFormat; + groupId?: string; + coordinatorIdEnvVar: string; + autoCoordinator?: boolean; + measureNameEnvVar?: string; + }) { + const { + dir, + format, + debug, + groupId, + coordinatorIdEnvVar, + autoCoordinator = true, + measureNameEnvVar, + } = opt; + + if (debug != null) { + this.#debug = debug; + } + + // Determine groupId: use provided, then env var, or generate + // eslint-disable-next-line functional/no-let + let resolvedGroupId: string; + if (groupId) { + // User explicitly provided groupId - use it + resolvedGroupId = groupId; + } else if (measureNameEnvVar && process.env[measureNameEnvVar]) { + // Env var is set (by coordinator or previous process) - use it + resolvedGroupId = process.env[measureNameEnvVar]; + } else if (measureNameEnvVar) { + // Env var not set - we're likely the first/coordinator, generate and set it + resolvedGroupId = getUniqueTimeId(); + + process.env[measureNameEnvVar] = resolvedGroupId; + } else { + // No measureNameEnvVar provided - generate unique one (backward compatible) + resolvedGroupId = getUniqueTimeId(); + } + + this.groupId = resolvedGroupId; + + if (dir) { + this.#dir = dir; + } + this.#format = format; + this.#coordinatorIdEnvVar = coordinatorIdEnvVar; + + if (autoCoordinator) { + ShardedWal.setCoordinatorProcess(this.#coordinatorIdEnvVar, this.#id); + } + } + + /** + * Gets the unique instance ID for this ShardedWal. + * + * @returns The unique instance ID + */ + get id(): string { + return this.#id; + } + + /** + * Is this instance the coordinator? + * + * Coordinator status is determined from the coordinatorIdEnvVar environment variable. + * The coordinator handles finalization and cleanup of shard files. + * Checks dynamically to allow coordinator to be set after construction. + * + * @returns true if this instance is the coordinator, false otherwise + */ + isCoordinator(): boolean { + return ShardedWal.isCoordinatorProcess(this.#coordinatorIdEnvVar, this.#id); + } + + /** + * Asserts that the WAL is in 'active' state. + * Throws an error if the WAL has been finalized or cleaned. + * + * @throws Error if WAL is not in 'active' state + */ + private assertActive(): void { + if (this.#state !== 'active') { + throw new Error(`WAL is ${this.#state}, cannot modify`); + } + } + + /** + * Gets the current lifecycle state of the WAL. + * + * @returns Current lifecycle state: 'active', 'finalized', or 'cleaned' + */ + getState(): 'active' | 'finalized' | 'cleaned' { + return this.#state; + } + + /** + * Checks if the WAL has been finalized. + * + * @returns true if WAL is in 'finalized' state, false otherwise + */ + isFinalized(): boolean { + return this.#state === 'finalized'; + } + + /** + * Checks if the WAL has been cleaned. 
+ * + * @returns true if WAL is in 'cleaned' state, false otherwise + */ + isCleaned(): boolean { + return this.#state === 'cleaned'; + } + + /** + * Generates a filename for a shard file using a shard ID. + * Both groupId and shardId are already in readable date format. + * + * Example with baseName "trace" and shardId "20240101-120000-000.12345.1.1": + * Filename: trace.20240101-120000-000.12345.1.1.log + * + * @param shardId - The human-readable shard ID (readable-timestamp.pid.threadId.count format) + * @returns The filename for the shard file + */ + getShardedFileName(shardId: string) { + const { baseName, walExtension } = this.#format; + return `${baseName}.${shardId}${walExtension}`; + } + + /** + * Generates a filename for the final merged output file. + * Uses the groupId as the identifier in the final filename. + * + * Example with baseName "trace" and groupId "20240101-120000-000": + * Filename: trace.20240101-120000-000.json + * + * Example with baseName "trace" and groupId "measureName": + * Filename: trace.measureName.json + * + * @returns The filename for the final merged output file + */ + getFinalFilePath() { + const groupIdDir = path.join(this.#dir, this.groupId); + const { baseName, finalExtension } = this.#format; + + return path.join( + groupIdDir, + `${baseName}.${this.groupId}${finalExtension}`, + ); + } + + shard() { + this.assertActive(); + const filePath = path.join( + this.#dir, + this.groupId, + this.getShardedFileName(getShardId()), + ); + this.#createdShardFiles.push(filePath); + return new WriteAheadLogFile({ + file: filePath, + codec: this.#format.codec, + }); + } + + /** Get all shard file paths matching this WAL's base name */ + private shardFiles() { + if (!fs.existsSync(this.#dir)) { + return []; + } + + const groupDir = path.join(this.#dir, this.groupId); + // create dir if not existing + ensureDirectoryExistsSync(groupDir); + + return fs + .readdirSync(groupDir) + .filter(entry => entry.endsWith(this.#format.walExtension)) + .filter(entry => entry.startsWith(`${this.#format.baseName}`)) + .map(entry => path.join(groupDir, entry)); + } + + /** Get shard file paths created by this instance */ + private getCreatedShardFiles() { + return this.#createdShardFiles.filter(f => fs.existsSync(f)); + } + + /** + * Finalize all shards by merging them into a single output file. + * Recovers all records from all shards, validates no errors, and writes merged result. + * Idempotent: returns early if already finalized or cleaned. + * @throws Error if custom finalizer method throws + */ + finalize(opt?: Record) { + if (this.#state !== 'active') { + return; + } + + // Ensure base directory exists before calling shardFiles() + ensureDirectoryExistsSync(this.#dir); + + const fileRecoveries = this.shardFiles().map(f => ({ + file: f, + result: new WriteAheadLogFile({ + file: f, + codec: this.#format.codec, + }).recover(), + })); + + const records = fileRecoveries.flatMap(({ result }) => result.records); + + if (this.#debug) { + this.#lastRecovery = fileRecoveries; + } + + ensureDirectoryExistsSync(path.dirname(this.getFinalFilePath())); + + try { + fs.writeFileSync( + this.getFinalFilePath(), + this.#format.finalizer(filterValidRecords(records), opt), + ); + } catch (error) { + throw extendError( + error, + 'Could not finalize sharded wal. Finalizer method in format throws.', + { appendMessage: true }, + ); + } + + this.#state = 'finalized'; + } + + /** + * Cleanup shard files by removing them from disk. 
+ * Coordinator-only: throws error if not coordinator to prevent race conditions. + * Idempotent: returns early if already cleaned. + */ + cleanup() { + if (!this.isCoordinator()) { + throw new Error('cleanup() can only be called by coordinator'); + } + + if (this.#state === 'cleaned') { + return; + } + + this.getCreatedShardFiles() + .filter(f => fs.existsSync(f)) + .forEach(f => { + fs.unlinkSync(f); + }); + + this.#state = 'cleaned'; + } + + get stats() { + return { + lastRecover: this.#lastRecovery, + state: this.#state, + groupId: this.groupId, + shardCount: this.getCreatedShardFiles().length, + isCoordinator: this.isCoordinator(), + isFinalized: this.isFinalized(), + isCleaned: this.isCleaned(), + finalFilePath: this.getFinalFilePath(), + shardFileCount: this.getCreatedShardFiles().length, + shardFiles: this.getCreatedShardFiles(), + }; + } + + finalizeIfCoordinator(opt?: Record) { + if (this.isCoordinator()) { + this.finalize(opt); + } + } + + /** + * Cleanup shard files if this instance is the coordinator. + * Safe to call from any process - only coordinator will execute cleanup. + */ + cleanupIfCoordinator() { + if (this.isCoordinator()) { + this.cleanup(); + } + } +} diff --git a/packages/utils/src/lib/wal-sharded.unit.test.ts b/packages/utils/src/lib/wal-sharded.unit.test.ts new file mode 100644 index 000000000..a1472bfdf --- /dev/null +++ b/packages/utils/src/lib/wal-sharded.unit.test.ts @@ -0,0 +1,512 @@ +import { vol } from 'memfs'; +import { beforeEach, describe, expect, it } from 'vitest'; +import { MEMFS_VOLUME, osAgnosticPath } from '@code-pushup/test-utils'; +import { getUniqueInstanceId } from './process-id.js'; +import { PROFILER_SHARDER_ID_ENV_VAR } from './profiler/constants.js'; +import { ShardedWal } from './wal-sharded.js'; +import { + type WalFormat, + WriteAheadLogFile, + parseWalFormat, + stringCodec, +} from './wal.js'; + +const read = (p: string) => vol.readFileSync(p, 'utf8') as string; + +const getShardedWal = (overrides?: { + dir?: string; + format?: Partial; + measureNameEnvVar?: string; + autoCoordinator?: boolean; +}) => { + const { format, ...rest } = overrides ?? 
{}; + return new ShardedWal({ + debug: false, + dir: '/test/shards', + format: parseWalFormat({ + baseName: 'test-wal', + ...format, + }), + coordinatorIdEnvVar: PROFILER_SHARDER_ID_ENV_VAR, + ...rest, + }); +}; + +describe('ShardedWal', () => { + beforeEach(() => { + vol.reset(); + vol.fromJSON({}, MEMFS_VOLUME); + // Clear coordinator env var for fresh state + // eslint-disable-next-line functional/immutable-data, @typescript-eslint/no-dynamic-delete + delete process.env[PROFILER_SHARDER_ID_ENV_VAR]; + }); + + describe('initialization', () => { + it('should create instance with directory and format', () => { + const sw = getShardedWal(); + expect(sw).toBeInstanceOf(ShardedWal); + }); + + it('should expose a stable id via getter', () => { + const sw = getShardedWal(); + const firstId = sw.id; + expect(sw.id).toBe(firstId); + }); + + it('should use groupId from env var when measureNameEnvVar is set', () => { + // eslint-disable-next-line functional/immutable-data + process.env.CP_PROFILER_MEASURE_NAME = 'from-env'; + const sw = getShardedWal({ + measureNameEnvVar: 'CP_PROFILER_MEASURE_NAME', + }); + expect(sw.groupId).toBe('from-env'); + expect(process.env.CP_PROFILER_MEASURE_NAME).toBe('from-env'); + }); + + it('should set env var when measureNameEnvVar is provided and unset', () => { + // eslint-disable-next-line functional/immutable-data + delete process.env.CP_PROFILER_MEASURE_NAME; + const sw = getShardedWal({ + measureNameEnvVar: 'CP_PROFILER_MEASURE_NAME', + }); + expect(process.env.CP_PROFILER_MEASURE_NAME).toBe(sw.groupId); + }); + }); + + describe('shard management', () => { + it('should create shard with correct file path', () => { + const sw = getShardedWal({ + format: { baseName: 'trace', walExtension: '.log' }, + }); + const shard = sw.shard(); + expect(shard).toBeInstanceOf(WriteAheadLogFile); + // Shard files use getShardId() format (timestamp.pid.threadId.counter) + // The groupId is auto-generated and used in the shard path + // Normalize path before regex matching to handle OS-specific separators + expect(osAgnosticPath(shard.getPath())).toMatch( + /^\/shards\/\d{8}-\d{6}-\d{3}\/trace\.\d{8}-\d{6}-\d{3}(?:\.\d+){3}\.log$/, + ); + expect(shard.getPath()).toEndWithPath('.log'); + }); + + it('should create shard with default shardId when no argument provided', () => { + const sw = getShardedWal({ + format: { baseName: 'trace', walExtension: '.log' }, + }); + const shard = sw.shard(); + expect(shard.getPath()).toStartWithPath( + '/shards/20231114-221320-000/trace.20231114-221320-000.10001', + ); + expect(shard.getPath()).toEndWithPath('.log'); + }); + }); + + describe('file operations', () => { + it('should list no shard files when directory does not exist', () => { + const sw = getShardedWal({ dir: '/nonexistent' }); + const files = (sw as any).shardFiles(); + expect(files).toEqual([]); + }); + + it('should list no shard files when directory is empty', () => { + const sw = getShardedWal({ dir: '/empty' }); + vol.mkdirSync('/empty/20231114-221320-000', { recursive: true }); + const files = (sw as any).shardFiles(); + expect(files).toEqual([]); + }); + + it('should list shard files matching extension', () => { + vol.fromJSON({ + '/shards/20231114-221320-000/trace.19700101-000820-001.1.log': + 'content1', + '/shards/20231114-221320-000/trace.19700101-000820-002.2.log': + 'content2', + '/shards/other.txt': 'not a shard', + }); + + const sw = getShardedWal({ + dir: '/shards', + format: { baseName: 'trace', walExtension: '.log' }, + }); + const files = (sw as any).shardFiles(); + 
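+      // shardFiles() lists only entries matching the configured baseName and walExtension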
+ expect(files).toHaveLength(2); + expect(files).toEqual( + expect.arrayContaining([ + expect.pathToMatch( + '/shards/20231114-221320-000/trace.19700101-000820-001.1.log', + ), + expect.pathToMatch( + '/shards/20231114-221320-000/trace.19700101-000820-002.2.log', + ), + ]), + ); + }); + }); + + describe('finalization', () => { + it('should finalize empty shards to empty result', () => { + const sw = getShardedWal({ + dir: '/shards', + format: { + baseName: 'final', + finalExtension: '.json', + finalizer: records => `${JSON.stringify(records)}\n`, + }, + }); + + vol.mkdirSync('/shards/20231114-221320-000', { recursive: true }); + sw.finalize(); + + expect( + read('/shards/20231114-221320-000/final.20231114-221320-000.json'), + ).toBe('[]\n'); + }); + + it('should finalize multiple shards into single file', () => { + vol.fromJSON({ + '/shards/20231114-221320-000/merged.20240101-120000-001.1.log': + 'record1\n', + '/shards/20231114-221320-000/merged.20240101-120000-002.2.log': + 'record2\n', + }); + + const sw = getShardedWal({ + dir: '/shards', + format: { + baseName: 'merged', + walExtension: '.log', + finalExtension: '.json', + finalizer: records => `${JSON.stringify(records)}\n`, + }, + }); + + sw.finalize(); + + const result = JSON.parse( + read( + '/shards/20231114-221320-000/merged.20231114-221320-000.json', + ).trim(), + ); + expect(result).toEqual(['record1', 'record2']); + }); + + it('should handle invalid entries during finalize', () => { + vol.fromJSON({ + '/shards/20231114-221320-000/final.20240101-120000-001.1.log': + 'valid\n', + '/shards/20231114-221320-000/final.20240101-120000-002.2.log': + 'invalid\n', + }); + + const sw = getShardedWal({ + dir: '/shards', + format: { + baseName: 'final', + walExtension: '.log', + finalExtension: '.json', + codec: stringCodec(), + finalizer: records => `${JSON.stringify(records)}\n`, + }, + }); + + sw.finalize(); + + const result = JSON.parse( + read( + '/shards/20231114-221320-000/final.20231114-221320-000.json', + ).trim(), + ); + expect(result).toHaveLength(2); + expect(result[0]).toBe('valid'); + expect(result[1]).toBe('invalid'); + }); + + it('should use custom options in finalizer', () => { + vol.fromJSON({ + '/shards/20231114-221320-000/final.20231114-221320-000.10001.2.1.log': + 'record1\n', + }); + + const sw = getShardedWal({ + dir: '/shards', + format: { + baseName: 'final', + walExtension: '.log', + finalExtension: '.json', + finalizer: (records, opt) => + `${JSON.stringify({ records, meta: opt })}\n`, + }, + }); + + sw.finalize({ version: '1.0', compressed: true }); + + const result = JSON.parse( + read('/shards/20231114-221320-000/final.20231114-221320-000.json'), + ); + expect(result.records).toEqual(['record1']); + expect(result.meta).toEqual({ version: '1.0', compressed: true }); + }); + }); + + describe('cleanup', () => { + it('should throw error when cleanup is called by non-coordinator', () => { + vol.fromJSON({ + '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log': + 'content1', + }); + const sw = getShardedWal({ + dir: '/shards', + format: { baseName: 'test', walExtension: '.log' }, + autoCoordinator: false, + }); + + // Instance won't be coordinator, so cleanup() should throw + expect(() => sw.cleanup()).toThrowError( + 'cleanup() can only be called by coordinator', + ); + }); + + it('should handle cleanupIfCoordinator when not coordinator', () => { + vol.fromJSON({ + '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log': + 'content1', + }); + + const sw = getShardedWal({ + dir: 
'/shards', + format: { baseName: 'test', walExtension: '.log' }, + autoCoordinator: false, + }); + + // cleanupIfCoordinator should be no-op when not coordinator + sw.cleanupIfCoordinator(); + + // Files should still exist + expect(vol.toJSON()).not.toStrictEqual({}); + expect(sw.getState()).toBe('active'); + }); + + it('should handle cleanup when some shard files do not exist', () => { + vol.fromJSON({ + '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log': + 'content1', + }); + + const sw = getShardedWal({ + dir: '/shards', + format: { baseName: 'test', walExtension: '.log' }, + }); + + vol.unlinkSync( + '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log', + ); + + // cleanupIfCoordinator won't throw even if files don't exist + expect(() => sw.cleanupIfCoordinator()).not.toThrowError(); + }); + + it('should ignore directory removal failures during cleanup', () => { + vol.fromJSON({ + '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log': + 'content1', + '/shards/20231114-221320-000/keep.txt': 'keep', + }); + + const sw = getShardedWal({ + dir: '/shards', + format: { baseName: 'test', walExtension: '.log' }, + }); + + expect(() => sw.cleanup()).not.toThrowError(); + expect( + vol.readFileSync('/shards/20231114-221320-000/keep.txt', 'utf8'), + ).toBe('keep'); + }); + }); + + describe('lifecycle state', () => { + it('throws with appended finalizer error when finalize fails', () => { + const sw = getShardedWal({ + dir: '/shards', + format: { + baseName: 'test', + finalExtension: '.json', + finalizer: () => { + throw new Error('finalizer boom'); + }, + }, + }); + + expect(() => sw.finalize()).toThrowError( + /Could not finalize sharded wal\. Finalizer method in format throws\./, + ); + expect(() => sw.finalize()).toThrowError(/finalizer boom/); + expect(sw.getState()).toBe('active'); + }); + + it('should start in active state', () => { + const sw = getShardedWal(); + expect(sw.getState()).toBe('active'); + expect(sw.isFinalized()).toBeFalse(); + expect(sw.isCleaned()).toBeFalse(); + }); + + it('should transition to finalized state after finalize', () => { + vol.mkdirSync('/shards/20231114-221320-000', { recursive: true }); + const sw = getShardedWal({ + dir: '/shards', + format: { + baseName: 'test', + finalExtension: '.json', + finalizer: records => `${JSON.stringify(records)}\n`, + }, + }); + + sw.finalize(); + + expect(sw.getState()).toBe('finalized'); + expect(sw.isFinalized()).toBeTrue(); + expect(sw.isCleaned()).toBeFalse(); + }); + + it('should transition to cleaned state after cleanup (when coordinator)', () => { + vol.fromJSON({ + '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log': + 'content1', + }); + + const sw = getShardedWal({ + dir: '/shards', + format: { baseName: 'test', walExtension: '.log' }, + }); + + sw.cleanupIfCoordinator(); + + const state = sw.getState(); + expect(['active', 'cleaned']).toContain(state); + }); + + it('should make cleanup idempotent for coordinator', () => { + vol.fromJSON({ + '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log': + 'content1', + }); + + const sw = getShardedWal({ + dir: '/shards', + format: { baseName: 'test', walExtension: '.log' }, + }); + + sw.cleanup(); + expect(sw.getState()).toBe('cleaned'); + + expect(() => sw.cleanup()).not.toThrowError(); + expect(sw.getState()).toBe('cleaned'); + }); + + it('should prevent shard creation after finalize', () => { + vol.mkdirSync('/shards/20231114-221320-000', { recursive: true }); + const sw = getShardedWal({ + 
dir: '/shards', + format: { + baseName: 'test', + finalExtension: '.json', + finalizer: records => `${JSON.stringify(records)}\n`, + }, + }); + + sw.finalize(); + + expect(() => sw.shard()).toThrowError('WAL is finalized, cannot modify'); + }); + + it('should prevent shard creation after cleanup', () => { + vol.fromJSON({ + '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log': + 'content1', + }); + + // Generate the instance ID that will be used by the constructor + // The constructor increments ShardedWal.instanceCount, so we need to + // generate the ID using the value that will be used (current + 1) + // without actually modifying ShardedWal.instanceCount + const nextCount = ShardedWal.instanceCount + 1; + const instanceId = getUniqueInstanceId({ + next() { + return nextCount; + }, + }); + + // Set coordinator BEFORE creating instance + ShardedWal.setCoordinatorProcess(PROFILER_SHARDER_ID_ENV_VAR, instanceId); + + const sw = getShardedWal({ + dir: '/shards', + format: { baseName: 'test', walExtension: '.log' }, + }); + + sw.cleanupIfCoordinator(); + + expect(() => sw.shard()).toThrowError('WAL is cleaned, cannot modify'); + }); + + it('should make finalize idempotent', () => { + vol.mkdirSync('/shards/20231114-221320-000', { recursive: true }); + const sw = getShardedWal({ + dir: '/shards', + format: { + baseName: 'test', + finalExtension: '.json', + finalizer: records => `${JSON.stringify(records)}\n`, + }, + }); + + sw.finalize(); + expect(sw.getState()).toBe('finalized'); + + // Call again - should not throw and should remain finalized + sw.finalize(); + expect(sw.getState()).toBe('finalized'); + }); + + it('should prevent finalize after cleanup', () => { + vol.fromJSON({ + '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log': + 'content1', + }); + + // Generate the instance ID that will be used by the constructor + // The constructor increments ShardedWal.instanceCount, so we need to + // generate the ID using the value that will be used (current + 1) + // without actually modifying ShardedWal.instanceCount + const nextCount = ShardedWal.instanceCount + 1; + const instanceId = getUniqueInstanceId({ + next() { + return nextCount; + }, + }); + + // Set coordinator BEFORE creating instance + ShardedWal.setCoordinatorProcess(PROFILER_SHARDER_ID_ENV_VAR, instanceId); + + const sw = getShardedWal({ + dir: '/shards', + format: { + baseName: 'test', + walExtension: '.log', + finalExtension: '.json', + finalizer: records => `${JSON.stringify(records)}\n`, + }, + }); + + expect(sw.stats.shardFiles).toHaveLength(0); + sw.shard(); + expect(sw.stats.shardFiles).toHaveLength(0); + + sw.cleanupIfCoordinator(); + expect(sw.getState()).toBe('cleaned'); + expect(sw.stats.shardFiles).toHaveLength(0); + }); + }); +}); diff --git a/packages/utils/src/lib/wal.int.test.ts b/packages/utils/src/lib/wal.int.test.ts new file mode 100644 index 000000000..81c71709b --- /dev/null +++ b/packages/utils/src/lib/wal.int.test.ts @@ -0,0 +1,127 @@ +import fs from 'node:fs/promises'; +import path from 'node:path'; +import { afterEach, beforeEach, describe, expect, it } from 'vitest'; +import { + type Codec, + type WalRecord, + WriteAheadLogFile, + stringCodec, +} from './wal.js'; + +describe('WriteAheadLogFile Integration', () => { + const testDir = path.join(process.cwd(), 'tmp', 'int', 'utils', 'wal'); + let walFile: WriteAheadLogFile; + + beforeEach(async () => { + // Clean up test directory + await fs.rm(testDir, { recursive: true, force: true }); + await fs.mkdir(testDir, { 
recursive: true }); + }); + + afterEach(async () => { + if (walFile && !walFile.isClosed()) { + walFile.close(); + } + await fs.rm(testDir, { recursive: true, force: true }); + }); + + it('should recover from file with partial write', async () => { + const filePath = path.join(testDir, 'partial.log'); + walFile = new WriteAheadLogFile({ file: filePath, codec: stringCodec() }); + + walFile.open(); + walFile.append('complete1'); + walFile.append('complete2'); + walFile.close(); + + // Simulate partial write by appending incomplete line + await fs.appendFile(filePath, '"partial'); + + const recovered = walFile.recover(); + expect(recovered.records).toEqual(['complete1', 'complete2']); + expect(recovered.partialTail).toBe('"partial'); + }); + + it('should repack file removing invalid entries', () => { + const filePath = path.join(testDir, 'repack.log'); + const tolerantCodec: Codec = { + encode: v => (typeof v === 'string' ? v : JSON.stringify(v)), + decode: (s: string) => { + if (s === 'invalid') throw new Error('Invalid record'); + return s; + }, + }; + + walFile = new WriteAheadLogFile({ file: filePath, codec: tolerantCodec }); + walFile.open(); + walFile.append('valid1'); + walFile.append('invalid'); + walFile.append('valid2'); + walFile.close(); + + walFile.repack(); + + const recovered = walFile.recover(); + expect(recovered.records).toEqual(['valid1', 'valid2']); + }); + + it('should handle error recovery scenarios', () => { + const filePath = path.join(testDir, 'errors.log'); + const failingCodec: Codec = { + encode: v => (typeof v === 'string' ? v : JSON.stringify(v)), + decode: (s: string) => { + if (s === 'bad') throw new Error('Bad record'); + return s; + }, + }; + + walFile = new WriteAheadLogFile({ file: filePath, codec: failingCodec }); + walFile.open(); + walFile.append('good'); + walFile.append('bad'); + walFile.append('good'); + walFile.close(); + + const recovered = walFile.recover(); + expect(recovered.records).toEqual([ + 'good', + { __invalid: true, raw: 'bad' }, + 'good', + ]); + expect(recovered.errors).toEqual([]); + }); + + it('should handle object records correctly', () => { + const filePath = path.join(testDir, 'objects.log'); + walFile = new WriteAheadLogFile({ + file: filePath, + codec: stringCodec(), + }); + + walFile.open(); + walFile.append({ id: 1, name: 'test1' }); + walFile.append({ id: 2, name: 'test2' }); + walFile.close(); + + const recovered = walFile.recover(); + expect(recovered.records).toEqual([ + { id: 1, name: 'test1' }, + { id: 2, name: 'test2' }, + ]); + }); + + it('should perform complete write/recover cycle', () => { + const filePath = path.join(testDir, 'test.log'); + walFile = new WriteAheadLogFile({ file: filePath, codec: stringCodec() }); + + walFile.open(); + walFile.append('record1'); + walFile.append('record2'); + walFile.close(); + + const recovered = walFile.recover(); + expect(recovered.records).toEqual(['record1', 'record2']); + expect(recovered.errors).toEqual([]); + expect(recovered.partialTail).toBeNull(); + }); +}); diff --git a/packages/utils/src/lib/wal.ts b/packages/utils/src/lib/wal.ts index f0dc87a83..b504beabf 100644 --- a/packages/utils/src/lib/wal.ts +++ b/packages/utils/src/lib/wal.ts @@ -1,8 +1,5 @@ -/* eslint-disable max-lines */ import * as fs from 'node:fs'; import path from 'node:path'; -import process from 'node:process'; -import { threadId } from 'node:worker_threads'; /** * Codec for encoding/decoding values to/from strings for WAL storage. 
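Before the type-level changes in the next hunks, here is a minimal sketch of the append/recover cycle the `wal.ts` primitives provide; the file path is illustrative, and the behavior mirrors the integration tests above:

```ts
import { WriteAheadLogFile, filterValidRecords, stringCodec } from './wal.js';

// Write path: append-only, one codec-encoded record per line.
const wal = new WriteAheadLogFile({ file: 'tmp/demo.log', codec: stringCodec() });
wal.open();
wal.append({ id: 1 });
wal.append({ id: 2 });
wal.close();

// Read path: recovery tolerates torn writes. Undecodable lines surface as
// { __invalid: true, raw } markers; a truncated final line lands in partialTail.
const { records, partialTail } = wal.recover();
const valid = filterValidRecords(records); // [{ id: 1 }, { id: 2 }]
```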
@@ -17,6 +14,18 @@ export type Codec<I, O = string> = {
 
 export type InvalidEntry<O = string> = { __invalid: true; raw: O };
 
+// eslint-disable-next-line @typescript-eslint/no-unused-vars
+type CodecInput<C> = C extends Codec<infer I, infer O> ? I : never;
+// eslint-disable-next-line @typescript-eslint/no-unused-vars
+type CodecOutput<C> = C extends Codec<infer I, infer O> ? O : never;
+
+export type TolerantCodec<C> = Codec<
+  CodecInput<C> | InvalidEntry<CodecOutput<C>>,
+  CodecOutput<C>
+>;
+
+export type WalRecord = object | string;
+
 /**
  * Interface for sinks that can append items.
  * Allows for different types of appendable storage (WAL, in-memory, etc.)
@@ -28,29 +37,29 @@ export type AppendableSink<T> = Recoverable<T> & {
   close?: () => void;
 };
 
-/**
- * Interface for sinks that support recovery operations.
- * Represents the recoverable subset of AppendableSink functionality.
- */
-export type Recoverable<T> = {
-  recover: () => RecoverResult<T>;
-  repack: (out?: string) => void;
-  finalize?: (opt?: Record<string, unknown>) => void;
-};
-
 /**
  * Result of recovering records from a WAL file.
  * Contains successfully recovered records and any errors encountered during parsing.
  */
 export type RecoverResult<T> = {
   /** Successfully recovered records */
-  records: T[];
+  records: (T | InvalidEntry<string>)[];
   /** Errors encountered during recovery with line numbers and context */
   errors: { lineNo: number; line: string; error: Error }[];
   /** Last incomplete line if file was truncated (null if clean) */
   partialTail: string | null;
 };
 
+/**
+ * Interface for sinks that support recovery operations.
+ * Represents the recoverable subset of AppendableSink functionality.
+ */
+export type Recoverable<T> = {
+  recover: () => RecoverResult<T>;
+  repack: (out?: string) => void;
+  finalize?: (opt?: Record<string, unknown>) => void;
+};
+
 /**
  * Statistics about the WAL file state and last recovery operation.
  */
@@ -59,10 +68,6 @@ export type WalStats<T> = {
   filePath: string;
   /** Whether the WAL file is currently closed */
   isClosed: boolean;
-  /** Whether the WAL file exists on disk */
-  fileExists: boolean;
-  /** File size in bytes (0 if file doesn't exist) */
-  fileSize: number;
   /** Last recovery state from the most recent {@link recover} or {@link repack} operation */
   lastRecovery: RecoverResult<T | InvalidEntry<string>> | null;
 };
@@ -145,7 +150,9 @@ export function recoverFromContent<T>(
  * Write-Ahead Log implementation for crash-safe append-only logging.
  * Provides atomic operations for writing, recovering, and repacking log entries.
 */
-export class WriteAheadLogFile<T> implements AppendableSink<T> {
+export class WriteAheadLogFile<T extends WalRecord = WalRecord>
+  implements AppendableSink<T>
+{
   #fd: number | null = null;
   readonly #file: string;
   readonly #decode: Codec<T | InvalidEntry<string>>['decode'];
@@ -157,8 +164,9 @@ export class WriteAheadLogFile<T> implements AppendableSink<T> {
    * @param options - Configuration options
    */
   constructor(options: { file: string; codec: Codec<T> }) {
-    this.#file = options.file;
-    const c = createTolerantCodec(options.codec);
+    const { file, codec } = options;
+    this.#file = file;
+    const c = createTolerantCodec(codec);
     this.#decode = c.decode;
     this.#encode = c.encode;
   }
@@ -239,9 +247,8 @@ export class WriteAheadLogFile<T> implements AppendableSink<T> {
       // eslint-disable-next-line no-console
       console.log('Found invalid entries during WAL repack');
     }
-    const recordsToWrite = hasInvalidEntries
-      ? (r.records as T[])
-      : filterValidRecords(r.records);
+    // Always filter out invalid entries when repacking
+    const recordsToWrite = filterValidRecords(r.records);
     ensureDirectoryExistsSync(path.dirname(out));
     fs.writeFileSync(out, `${recordsToWrite.map(this.#encode).join('\n')}\n`);
   }
@@ -252,12 +259,9 @@
   * @returns Statistics object with file info and last recovery state
   */
  getStats(): WalStats<T> {
-    const fileExists = fs.existsSync(this.#file);
     return {
       filePath: this.#file,
       isClosed: this.#fd == null,
-      fileExists,
-      fileSize: fileExists ? fs.statSync(this.#file).size : 0,
       lastRecovery: this.#lastRecoveryState,
     };
   }
@@ -267,7 +271,7 @@
  * Format descriptor that binds codec and file extension together.
  * Prevents misconfiguration by keeping related concerns in one object.
 */
-export type WalFormat<T extends string | object = string> = {
+export type WalFormat<T extends WalRecord = WalRecord> = {
   /** Base name for the WAL (e.g., "trace") */
   baseName: string;
   /** Shard file extension (e.g., ".jsonl") */
@@ -277,21 +281,27 @@
   /** Codec for encoding/decoding records */
   codec: Codec<T>;
   /** Finalizer for converting records to a string */
+  finalizer: (records: T[], opt?: Record<string, unknown>) => string;
+};
+
+export type WalFormatWithInvalids<T extends WalRecord = WalRecord> = Omit<
+  WalFormat<T>,
+  'codec' | 'finalizer'
+> & {
+  codec: TolerantCodec<Codec<T>>;
   finalizer: (
     records: (T | InvalidEntry<string>)[],
     opt?: Record<string, unknown>,
   ) => string;
 };
 
-export const stringCodec = <
-  T extends string | object = string,
->(): Codec<T> => ({
-  encode: v => (typeof v === 'string' ? v : JSON.stringify(v)),
+export const stringCodec = <T extends WalRecord = WalRecord>(): Codec<T> => ({
+  encode: v => JSON.stringify(v),
   decode: v => {
     try {
       return JSON.parse(v) as T;
     } catch {
-      return v as T;
+      return v as unknown as T;
     }
   },
 });
@@ -309,7 +319,7 @@
 * @param format - Partial WalFormat configuration
 * @returns Parsed WalFormat with defaults filled in
 */
-export function parseWalFormat<T extends string | object = string>(
+export function parseWalFormat<T extends WalRecord = WalRecord>(
   format: Partial<WalFormat<T>>,
 ): WalFormat<T> {
   const {
@@ -317,127 +327,23 @@
     walExtension = '.log',
     finalExtension = walExtension,
     codec = stringCodec<T>(),
+    finalizer,
   } = format;
 
-  const finalizer =
-    format.finalizer ??
-    ((records: (T | InvalidEntry<string>)[]) => {
-      // Encode each record using the codec before joining.
-      // For object types, codec.encode() will JSON-stringify them properly.
-      // InvalidEntry records use their raw string value directly.
-      const encoded = records.map(record =>
-        typeof record === 'object' && record != null && '__invalid' in record
-          ? (record as InvalidEntry<string>).raw
-          : codec.encode(record as T),
-      );
-      return `${encoded.join('\n')}\n`;
-    });
-
   return {
     baseName,
     walExtension,
     finalExtension,
     codec,
-    finalizer,
-  } satisfies WalFormat<T>;
-}
-
-/**
- * Determines if this process is the leader WAL process using the origin PID heuristic.
- *
- * The leader is the process that first enabled profiling (the one that set CP_PROFILER_ORIGIN_PID).
- * All descendant processes inherit the environment but have different PIDs.
- *
- * @returns true if this is the leader WAL process, false otherwise
- */
-export function isCoordinatorProcess(
-  envVarName: string,
-  profilerID: string,
-): boolean {
-  return process.env[envVarName] === profilerID;
-}
-
-/**
- * Initialize the origin PID environment variable if not already set.
- * This must be done as early as possible before any user code runs.
- * Sets envVarName to the current process ID if not already defined.
- */ -export function setCoordinatorProcess( - envVarName: string, - profilerID: string, -): void { - if (!process.env[envVarName]) { - // eslint-disable-next-line functional/immutable-data - process.env[envVarName] = profilerID; - } -} - -// eslint-disable-next-line functional/no-let -let shardCount = 0; - -/** - * Generates a unique sharded WAL ID based on performance time origin, process ID, thread ID, and instance count. - */ -function getShardedWalId() { - // eslint-disable-next-line functional/immutable-data - return `${Math.round(performance.timeOrigin)}.${process.pid}.${threadId}.${++ShardedWal.instanceCount}`; -} - -/** - * Generates a human-readable shard ID. - * This ID is unique per process/thread/shard combination and used in the file name. - * Format: readable-timestamp.pid.threadId.shardCount - * Example: "20240101-120000-000.12345.1.1" - * Becomes file: trace.20240101-120000-000.12345.1.1.log - */ -export function getShardId(): string { - const timestamp = Math.round(performance.timeOrigin + performance.now()); - const readableTimestamp = sortableReadableDateString(`${timestamp}`); - return `${readableTimestamp}.${process.pid}.${threadId}.${++shardCount}`; -} - -/** - * Generates a human-readable sharded group ID. - * This ID is a globally unique, sortable, human-readable date string per run. - * Used directly as the folder name to group shards. - * Format: yyyymmdd-hhmmss-ms - * Example: "20240101-120000-000" - */ -export function getShardedGroupId(): string { - return sortableReadableDateString( - `${Math.round(performance.timeOrigin + performance.now())}`, - ); -} - -/** - * Regex patterns for validating WAL ID formats - */ -export const WAL_ID_PATTERNS = { - /** Readable date format: yyyymmdd-hhmmss-ms */ - READABLE_DATE: /^\d{8}-\d{6}-\d{3}$/, - /** Group ID format: yyyymmdd-hhmmss-ms */ - GROUP_ID: /^\d{8}-\d{6}-\d{3}$/, - /** Shard ID format: readable-date.pid.threadId.count */ - SHARD_ID: /^\d{8}-\d{6}-\d{3}(?:\.\d+){3}$/, -} as const; - -export function sortableReadableDateString(timestampMs: string): string { - const timestamp = Number.parseInt(timestampMs, 10); - const date = new Date(timestamp); - const MILLISECONDS_PER_SECOND = 1000; - const yyyy = date.getFullYear(); - const mm = String(date.getMonth() + 1).padStart(2, '0'); - const dd = String(date.getDate()).padStart(2, '0'); - const hh = String(date.getHours()).padStart(2, '0'); - const min = String(date.getMinutes()).padStart(2, '0'); - const ss = String(date.getSeconds()).padStart(2, '0'); - // eslint-disable-next-line @typescript-eslint/no-magic-numbers - const ms = String(timestamp % MILLISECONDS_PER_SECOND).padStart(3, '0'); - - return `${yyyy}${mm}${dd}-${hh}${min}${ss}-${ms}`; + finalizer: + finalizer ?? + ((records, _opt) => + `${records.map(record => codec.encode(record)).join('\n')}\n`), + }; } /** + * NOTE: this helper is only used in this file. The rest of the repo avoids sync methods so it is not reusable. * Ensures a directory exists, creating it recursively if necessary using sync methods. * @param dirPath - The directory path to ensure exists */ @@ -446,177 +352,3 @@ function ensureDirectoryExistsSync(dirPath: string): void { fs.mkdirSync(dirPath, { recursive: true }); } } - -/** - * Generates a path to a shard file using human-readable IDs. - * Both groupId and shardId are already in readable date format. 
- * - * Example with groupId "20240101-120000-000" and shardId "20240101-120000-000.12345.1.1": - * Full path: /base/20240101-120000-000/trace.20240101-120000-000.12345.1.1.log - * - * @param opt.dir - The directory to store the shard file - * @param opt.format - The WalFormat to use for the shard file - * @param opt.groupId - The human-readable group ID (yyyymmdd-hhmmss-ms format) - * @param opt.shardId - The human-readable shard ID (readable-timestamp.pid.threadId.count format) - * @returns The path to the shard file - */ -export function getShardedPath(opt: { - dir?: string; - format: WalFormat; - groupId: string; - shardId: string; -}): string { - const { dir = '', format, groupId, shardId } = opt; - const { baseName, walExtension } = format; - - return path.join(dir, groupId, `${baseName}.${shardId}${walExtension}`); -} - -export function getShardedFinalPath(opt: { - dir?: string; - format: WalFormat; - groupId: string; -}): string { - const { dir = '', format, groupId } = opt; - const { baseName, finalExtension } = format; - - return path.join(dir, groupId, `${baseName}.${groupId}${finalExtension}`); -} - -/** - * Sharded Write-Ahead Log manager for coordinating multiple WAL shards. - * Handles distributed logging across multiple processes/files with atomic finalization. - */ - -export class ShardedWal { - static instanceCount = 0; - readonly #id: string = getShardedWalId(); - readonly groupId = getShardedGroupId(); - readonly #format: WalFormat; - readonly #dir: string = process.cwd(); - readonly #isCoordinator: boolean; - - /** - * Create a sharded WAL manager. - * - * @param opt.dir - Base directory to store shard files (defaults to process.cwd()) - * @param opt.format - WAL format configuration - * @param opt.groupId - Group ID for sharding (defaults to generated group ID) - * @param opt.coordinatorIdEnvVar - Environment variable name for storing coordinator ID (defaults to CP_SHARDED_WAL_COORDINATOR_ID) - */ - constructor(opt: { - dir?: string; - format: Partial>; - groupId?: string; - coordinatorIdEnvVar: string; - }) { - const { dir, format, groupId, coordinatorIdEnvVar } = opt; - this.groupId = groupId ?? getShardedGroupId(); - if (dir) { - this.#dir = dir; - } - this.#format = parseWalFormat(format); - this.#isCoordinator = isCoordinatorProcess(coordinatorIdEnvVar, this.#id); - } - - /** - * Is this instance the coordinator? - * - * Coordinator status is determined from the coordinatorIdEnvVar environment variable. - * The coordinator handles finalization and cleanup of shard files. 
- * - * @returns true if this instance is the coordinator, false otherwise - */ - isCoordinator(): boolean { - return this.#isCoordinator; - } - - shard(shardId: string = getShardId()) { - return new WriteAheadLogFile({ - file: getShardedPath({ - dir: this.#dir, - format: this.#format, - groupId: this.groupId, - shardId, - }), - codec: this.#format.codec, - }); - } - - /** Get all shard file paths matching this WAL's base name */ - private shardFiles() { - if (!fs.existsSync(this.#dir)) { - return []; - } - - const groupIdDir = path.dirname( - getShardedFinalPath({ - dir: this.#dir, - format: this.#format, - groupId: this.groupId, - }), - ); - // create dir if not existing - ensureDirectoryExistsSync(groupIdDir); - - return fs - .readdirSync(groupIdDir) - .filter(entry => entry.endsWith(this.#format.walExtension)) - .filter(entry => entry.startsWith(`${this.#format.baseName}`)) - .map(entry => path.join(groupIdDir, entry)); - } - - /** - * Finalize all shards by merging them into a single output file. - * Recovers all records from all shards, validates no errors, and writes merged result. - * @throws Error if any shard contains decode errors - */ - finalize(opt?: Record) { - const fileRecoveries = this.shardFiles().map(f => ({ - file: f, - recovery: new WriteAheadLogFile({ - file: f, - codec: this.#format.codec, - }).recover(), - })); - - const records = fileRecoveries.flatMap(({ recovery }) => recovery.records); - - // Check if any records are invalid entries (from tolerant codec) - const hasInvalidEntries = records.some( - r => typeof r === 'object' && r != null && '__invalid' in r, - ); - - const recordsToFinalize = hasInvalidEntries - ? records - : filterValidRecords(records); - const out = getShardedFinalPath({ - dir: this.#dir, - format: this.#format, - groupId: this.groupId, - }); - ensureDirectoryExistsSync(path.dirname(out)); - fs.writeFileSync(out, this.#format.finalizer(recordsToFinalize, opt)); - } - - cleanup() { - this.shardFiles().forEach(f => { - // Remove the shard file - fs.unlinkSync(f); - // Remove the parent directory (shard group directory) - const shardDir = path.dirname(f); - try { - fs.rmdirSync(shardDir); - } catch { - // Directory might not be empty or already removed, ignore - } - }); - - // Also try to remove the root directory if it becomes empty - try { - fs.rmdirSync(this.#dir); - } catch { - // Directory might not be empty or already removed, ignore - } - } -} diff --git a/packages/utils/src/lib/wal.unit.test.ts b/packages/utils/src/lib/wal.unit.test.ts index 4221d4f0f..acd941cfa 100644 --- a/packages/utils/src/lib/wal.unit.test.ts +++ b/packages/utils/src/lib/wal.unit.test.ts @@ -1,27 +1,18 @@ import { vol } from 'memfs'; +import { beforeEach, describe, expect, it, vi } from 'vitest'; import { MEMFS_VOLUME } from '@code-pushup/test-utils'; -import { SHARDED_WAL_COORDINATOR_ID_ENV_VAR } from './profiler/constants.js'; import { type Codec, - type InvalidEntry, - ShardedWal, - WAL_ID_PATTERNS, WriteAheadLogFile, createTolerantCodec, filterValidRecords, - getShardId, - getShardedGroupId, - isCoordinatorProcess, parseWalFormat, recoverFromContent, - setCoordinatorProcess, stringCodec, } from './wal.js'; const read = (p: string) => vol.readFileSync(p, 'utf8') as string; - const write = (p: string, c: string) => vol.writeFileSync(p, c); - const wal = ( file: string, codec: Codec = stringCodec(), @@ -37,9 +28,8 @@ describe('createTolerantCodec', () => { throw new Error('decoding error'); }, }); - expect(() => c.encode(42)).toThrow('encoding error'); - const result 
= c.decode('42'); - expect(result).toEqual({ __invalid: true, raw: '42' }); + expect(() => c.encode(42)).toThrowError('encoding error'); + expect(c.decode('42')).toEqual({ __invalid: true, raw: '42' }); }); it('round-trips valid values and preserves invalid ones', () => { @@ -52,7 +42,6 @@ describe('createTolerantCodec', () => { }, }); expect(c.decode(c.encode(42))).toBe(42); - const invalid = c.decode('x'); expect(invalid).toStrictEqual({ __invalid: true, raw: 'x' }); expect(c.encode(invalid)).toBe('x'); @@ -66,8 +55,7 @@ describe('filterValidRecords', () => { { __invalid: true, raw: 'x' }, { id: 3, name: 'valid3' }, ]; - const result = filterValidRecords(records); - expect(result).toEqual([ + expect(filterValidRecords(records)).toEqual([ { id: 1, name: 'valid1' }, { id: 3, name: 'valid3' }, ]); @@ -76,8 +64,7 @@ describe('filterValidRecords', () => { describe('recoverFromContent', () => { it('recovers valid records', () => { - const content = 'a\nb\n'; - const result = recoverFromContent(content, stringCodec().decode); + const result = recoverFromContent('a\nb\n', stringCodec().decode); expect(result).toEqual({ records: ['a', 'b'], errors: [], @@ -86,9 +73,7 @@ describe('recoverFromContent', () => { }); it('handles empty content', () => { - const content = ''; - const result = recoverFromContent(content, stringCodec().decode); - expect(result).toEqual({ + expect(recoverFromContent('', stringCodec().decode)).toEqual({ records: [], errors: [], partialTail: null, @@ -96,18 +81,13 @@ describe('recoverFromContent', () => { }); it('handles content without trailing newline', () => { - const content = 'a\nb'; - const result = recoverFromContent(content, stringCodec().decode); - expect(result).toEqual({ - records: ['a'], - errors: [], - partialTail: 'b', - }); + const result = recoverFromContent('a\nb', stringCodec().decode); + expect(result.records).toEqual(['a']); + expect(result.partialTail).toBe('b'); }); it('skips empty lines', () => { - const content = 'a\n\nb\n'; - const result = recoverFromContent(content, stringCodec().decode); + const result = recoverFromContent('a\n\nb\n', stringCodec().decode); expect(result).toEqual({ records: ['a', 'b'], errors: [], @@ -124,9 +104,7 @@ describe('recoverFromContent', () => { }, }; - const content = 'good\nbad\ngood\n'; - const result = recoverFromContent(content, failingCodec.decode); - + const result = recoverFromContent('good\nbad\ngood\n', failingCodec.decode); expect(result.records).toEqual(['good', 'good']); expect(result.errors).toHaveLength(1); expect(result.errors[0]).toEqual({ @@ -134,7 +112,6 @@ describe('recoverFromContent', () => { line: 'bad', error: expect.any(Error), }); - expect(result.errors.at(0)?.error.message).toBe('Bad record'); expect(result.partialTail).toBeNull(); }); @@ -147,9 +124,10 @@ describe('recoverFromContent', () => { }, }; - const content = 'good\nbad\npartial'; - const result = recoverFromContent(content, failingCodec.decode); - + const result = recoverFromContent( + 'good\nbad\npartial', + failingCodec.decode, + ); expect(result.records).toEqual(['good']); expect(result.errors).toHaveLength(1); expect(result.errors.at(0)?.lineNo).toBe(2); @@ -163,416 +141,224 @@ describe('WriteAheadLogFile', () => { vol.fromJSON({}, MEMFS_VOLUME); }); - it('should act as WLA for any kind of data', () => { - const w = wal('/test/a.log', stringCodec()); - w.open(); - w.append({ id: 1, name: 'test' }); - w.close(); - expect(w.recover().records).toStrictEqual([{ id: 1, name: 'test' }]); - w.open(); - expect(() => - w.append('{ id: 1, 
name:...' as unknown as object), - ).not.toThrow(); - w.close(); - expect(w.recover().records).toStrictEqual([ - { id: 1, name: 'test' }, - '{ id: 1, name:...', - ]); - }); - - it('should create instance with file path and codecs without opening', () => { - const w = wal('/test/a.log'); - expect(w).toBeInstanceOf(WriteAheadLogFile); - expect(w.getPath()).toBe('/test/a.log'); - expect(w.isClosed()).toBeTrue(); - }); - - it('throws error when appending without opening', () => { - const w = wal('/test/a.log'); - expect(w.isClosed()).toBeTrue(); - expect(() => w.append('a')).toThrow('WAL not opened'); - }); - - it('opens and closes correctly', () => { - const w = wal('/test/a.log'); - expect(w.isClosed()).toBeTrue(); - w.open(); - expect(w.isClosed()).toBeFalse(); - w.close(); - expect(w.isClosed()).toBeTrue(); - }); - - it('multiple open calls are idempotent', () => { - const w = wal('/test/a.log'); - expect(w.isClosed()).toBeTrue(); - - w.open(); - expect(w.isClosed()).toBeFalse(); - - w.open(); - expect(w.isClosed()).toBeFalse(); - w.open(); - expect(w.isClosed()).toBeFalse(); - - w.close(); - expect(w.isClosed()).toBeTrue(); - }); - - it('append lines if opened', () => { - vol.mkdirSync('/test', { recursive: true }); - const w = wal('/test/a.log'); - w.open(); - w.append('a'); - w.append('b'); - - expect(read('/test/a.log')).toBe('a\nb\n'); - }); - - it('appends records with encode logic', () => { - const w = wal('/test/a.log'); - w.open(); - - w.append('any string'); - expect(read('/test/a.log')).toBe('any string\n'); + describe('initialization', () => { + it('should create instance with file path and codec without opening', () => { + const w = wal('/test/a.log'); + expect(w).toBeInstanceOf(WriteAheadLogFile); + expect(w.getPath()).toBe('/test/a.log'); + expect(w.isClosed()).toBeTrue(); + }); }); - it('returns empty result when file does not exist', () => { - const w = wal('/test/nonexistent.log'); - const result = w.recover(); + describe('lifecycle', () => { + it('opens and closes correctly', () => { + const w = wal('/test/a.log'); + expect(w.isClosed()).toBeTrue(); + w.open(); + expect(w.isClosed()).toBeFalse(); + w.close(); + expect(w.isClosed()).toBeTrue(); + }); - expect(result).toEqual({ - records: [], - errors: [], - partialTail: null, + it('multiple open calls are idempotent', () => { + const w = wal('/test/a.log'); + w.open(); + expect(w.isClosed()).toBeFalse(); + w.open(); + w.open(); + expect(w.isClosed()).toBeFalse(); + w.close(); + expect(w.isClosed()).toBeTrue(); }); }); - it('can recover without opening (reads file directly)', () => { - vol.mkdirSync('/test', { recursive: true }); - write('/test/a.log', 'line1\nline2\n'); - const w = wal('/test/a.log'); + describe('append operations', () => { + it('throws error when appending without opening', () => { + const w = wal('/test/a.log'); + expect(() => w.append('a')).toThrowError('WAL not opened'); + }); - const result = w.recover(); - expect(result.records).toStrictEqual(['line1', 'line2']); - expect(result.errors).toEqual([]); - }); + it('appends records with encoding', () => { + vol.mkdirSync('/test', { recursive: true }); + const w = wal('/test/a.log'); + w.open(); + w.append('a'); + w.append('b'); + expect(read('/test/a.log')).toBe('"a"\n"b"\n'); + }); - it('recovers valid records if opened', () => { - vol.mkdirSync('/test', { recursive: true }); - write('/test/a.log', 'line1\nline2\n'); - const w = wal('/test/a.log'); - w.open(); - expect(w.recover()).toStrictEqual({ - records: ['line1', 'line2'], - errors: [], - 
partialTail: null, + it('handles any kind of data', () => { + const w = wal('/test/a.log', stringCodec()); + w.open(); + w.append({ id: 1, name: 'test' }); + w.close(); + expect(w.recover().records).toStrictEqual([{ id: 1, name: 'test' }]); }); }); - it('recovers with decode errors and partial tail using tolerant codec', () => { - vol.mkdirSync('/test', { recursive: true }); - write('/test/a.log', 'ok\nbad\npartial'); - - const tolerantCodec = createTolerantCodec({ - encode: (s: string) => s, - decode: (s: string) => { - if (s === 'bad') throw new Error('Bad record'); - return s; - }, + describe('recovery operations', () => { + it('returns empty result when file does not exist', () => { + const result = wal('/test/nonexistent.log').recover(); + expect(result).toEqual({ + records: [], + errors: [], + partialTail: null, + }); }); - expect(wal('/test/a.log', tolerantCodec).recover()).toStrictEqual({ - records: ['ok', { __invalid: true, raw: 'bad' }], - errors: [], - partialTail: 'partial', + it('recovers valid records from file', () => { + vol.mkdirSync('/test', { recursive: true }); + write('/test/a.log', 'line1\nline2\n'); + const result = wal('/test/a.log').recover(); + expect(result.records).toStrictEqual(['line1', 'line2']); + expect(result.errors).toEqual([]); + expect(result.partialTail).toBeNull(); }); - }); - it('repacks clean file without errors', () => { - vol.mkdirSync('/test', { recursive: true }); - write('/test/a.log', 'a\nb\n'); - wal('/test/a.log').repack(); - expect(read('/test/a.log')).toBe('a\nb\n'); - }); + it('recovers with decode errors and partial tail using tolerant codec', () => { + vol.mkdirSync('/test', { recursive: true }); + write('/test/a.log', 'ok\nbad\npartial'); - it('repacks with decode errors using tolerant codec', () => { - vol.mkdirSync('/test', { recursive: true }); - write('/test/a.log', 'ok\nbad\n'); + const tolerantCodec = createTolerantCodec({ + encode: (s: string) => s, + decode: (s: string) => { + if (s === 'bad') throw new Error('Bad record'); + return s; + }, + }); - const tolerantCodec = createTolerantCodec({ - encode: (s: string) => s, - decode: (s: string) => { - if (s === 'bad') throw new Error('Bad record'); - return s; - }, + const result = wal('/test/a.log', tolerantCodec).recover(); + expect(result).toStrictEqual({ + records: ['ok', { __invalid: true, raw: 'bad' }], + errors: [], + partialTail: 'partial', + }); }); - - wal('/test/a.log', tolerantCodec).repack(); - expect(read('/test/a.log')).toBe('ok\nbad\n'); }); - it('logs decode errors during content recovery', () => { - const failingCodec: Codec = { - encode: (s: string) => s, - decode: (s: string) => { - if (s === 'bad') throw new Error('Bad record during recovery'); - return s; - }, - }; - - const content = 'good\nbad\ngood\n'; - const result = recoverFromContent(content, failingCodec.decode); + describe('repack operations', () => { + it('repacks clean file without errors', () => { + vol.mkdirSync('/test', { recursive: true }); + write('/test/a.log', '"a"\n"b"\n'); + wal('/test/a.log').repack(); + expect(read('/test/a.log')).toBe('"a"\n"b"\n'); + }); - expect(result.errors).toHaveLength(1); - expect(result.errors.at(0)?.error.message).toBe( - 'Bad record during recovery', - ); - expect(result.records).toEqual(['good', 'good']); - }); + it('repacks with decode errors using tolerant codec', () => { + const consoleLogSpy = vi + .spyOn(console, 'log') + .mockImplementation(() => {}); + vol.mkdirSync('/test', { recursive: true }); + write('/test/a.log', 'ok\nbad\n'); - it('repacks with 
invalid entries and logs warning', () => { - const consoleLogSpy = vi.spyOn(console, 'log').mockImplementation(() => {}); + const tolerantCodec = createTolerantCodec({ + encode: (s: string) => s, + decode: (s: string) => { + if (s === 'bad') throw new Error('Bad record'); + return s; + }, + }); - vol.mkdirSync('/test', { recursive: true }); - write('/test/a.log', 'ok\nbad\n'); + wal('/test/a.log', tolerantCodec).repack(); - const tolerantCodec = createTolerantCodec({ - encode: (s: string) => s, - decode: (s: string) => { - if (s === 'bad') throw new Error('Bad record'); - return s; - }, + expect(consoleLogSpy).toHaveBeenCalledWith( + 'Found invalid entries during WAL repack', + ); + // Repack filters out invalid entries, so only valid records remain + expect(read('/test/a.log')).toBe('ok\n'); + consoleLogSpy.mockRestore(); }); - wal('/test/a.log', tolerantCodec).repack(); - - expect(consoleLogSpy).toHaveBeenCalledWith( - 'Found invalid entries during WAL repack', - ); - expect(read('/test/a.log')).toBe('ok\nbad\n'); - - consoleLogSpy.mockRestore(); - }); + it('logs decode errors when recover returns errors', () => { + const consoleLogSpy = vi + .spyOn(console, 'log') + .mockImplementation(() => {}); + vol.mkdirSync('/test', { recursive: true }); + write('/test/a.log', 'content\n'); - it('recoverFromContent handles decode errors and returns them', () => { - const failingCodec: Codec = { - encode: (s: string) => s, - decode: (s: string) => { - if (s === 'bad') throw new Error('Bad record during recovery'); - return s; - }, - }; + const walInstance = wal('/test/a.log'); + const recoverSpy = vi.spyOn(walInstance, 'recover').mockReturnValue({ + records: ['content'], + errors: [ + { lineNo: 1, line: 'content', error: new Error('Mock decode error') }, + ], + partialTail: null, + }); - const content = 'good\nbad\ngood\n'; - const result = recoverFromContent(content, failingCodec.decode); + walInstance.repack(); - expect(result.records).toEqual(['good', 'good']); - expect(result.errors).toHaveLength(1); - expect(result).toHaveProperty( - 'errors', - expect.arrayContaining([ - { - lineNo: 2, - line: 'bad', - error: expect.any(Error), - }, - ]), - ); + expect(consoleLogSpy).toHaveBeenCalledWith( + 'WAL repack encountered decode errors', + ); + recoverSpy.mockRestore(); + consoleLogSpy.mockRestore(); + }); }); - it('repack logs decode errors when recover returns errors', () => { - const consoleLogSpy = vi.spyOn(console, 'log').mockImplementation(() => {}); - - vol.mkdirSync('/test', { recursive: true }); - write('/test/a.log', 'content\n'); - - const walInstance = wal('/test/a.log'); - - const recoverSpy = vi.spyOn(walInstance, 'recover').mockReturnValue({ - records: ['content'], - errors: [ - { lineNo: 1, line: 'content', error: new Error('Mock decode error') }, - ], - partialTail: null, + describe('statistics', () => { + it('getStats returns file information and recovery state', () => { + vol.mkdirSync('/test', { recursive: true }); + const w = wal('/test/a.log'); + const stats = w.getStats(); + expect(stats.filePath).toBe('/test/a.log'); + expect(stats.isClosed).toBeTrue(); + expect(stats.lastRecovery).toBeNull(); }); - - walInstance.repack(); - - expect(consoleLogSpy).toHaveBeenCalledWith( - 'WAL repack encountered decode errors', - ); - - recoverSpy.mockRestore(); - consoleLogSpy.mockRestore(); }); }); describe('stringCodec', () => { - it('should encode strings as-is', () => { - const codec = stringCodec(); - expect(codec.encode('hello')).toBe('hello'); - expect(codec.encode('')).toBe(''); - 
expect(codec.encode('with spaces')).toBe('with spaces'); - }); + it('encodes strings and objects as JSON', () => { + const codec = stringCodec(); + expect(codec.encode('hello')).toBe('"hello"'); + expect(codec.encode('')).toBe('""'); - it('should encode objects as JSON strings', () => { - const codec = stringCodec(); + const objCodec = stringCodec(); const obj = { name: 'test', value: 42 }; - expect(codec.encode(obj)).toBe('{"name":"test","value":42}'); - }); - - it('should encode mixed types correctly', () => { - const codec = stringCodec(); - expect(codec.encode('string value')).toBe('string value'); - expect(codec.encode({ key: 'value' })).toBe('{"key":"value"}'); - expect(codec.encode([1, 2, 3])).toBe('[1,2,3]'); + expect(objCodec.encode(obj)).toBe('{"name":"test","value":42}'); }); - it('should decode valid JSON strings', () => { - const codec = stringCodec(); - const jsonString = '{"name":"test","value":42}'; - const result = codec.decode(jsonString); - expect(result).toEqual({ name: 'test', value: 42 }); - }); - - it('should decode arrays from JSON strings', () => { - const codec = stringCodec(); - const jsonString = '[1,2,3]'; - const result = codec.decode(jsonString); - expect(result).toEqual([1, 2, 3]); + it('decodes valid JSON strings', () => { + const codec = stringCodec(); + expect(codec.decode('{"name":"test","value":42}')).toEqual({ + name: 'test', + value: 42, + }); + expect(codec.decode('[1,2,3]')).toEqual([1, 2, 3]); }); - it('should return strings as-is when JSON parsing fails', () => { - const codec = stringCodec(); + it('returns strings as-is when JSON parsing fails', () => { + const codec = stringCodec(); expect(codec.decode('not json')).toBe('not json'); - expect(codec.decode('hello world')).toBe('hello world'); - expect(codec.decode('')).toBe(''); - }); - - it('should handle malformed JSON gracefully', () => { - const codec = stringCodec(); expect(codec.decode('{invalid')).toBe('{invalid'); - expect(codec.decode('[1,2,')).toBe('[1,2,'); - expect(codec.decode('null')).toBeNull(); - }); - - it('should round-trip strings correctly', () => { - const codec = stringCodec(); - const original = 'hello world'; - const encoded = codec.encode(original); - const decoded = codec.decode(encoded); - expect(decoded).toBe(original); - }); - - it('should round-trip objects correctly', () => { - const codec = stringCodec(); - const original = { name: 'test', nested: { value: 123 } }; - const encoded = codec.encode(original); - const decoded = codec.decode(encoded); - expect(decoded).toEqual(original); - }); - - it('should round-trip arrays correctly', () => { - const codec = stringCodec(); - const original = [1, 'two', { three: 3 }]; - const encoded = codec.encode(original); - const decoded = codec.decode(encoded); - expect(decoded).toEqual(original); - }); - - it('should maintain type safety with generics', () => { - const stringCodecInstance = stringCodec(); - const str: string = stringCodecInstance.decode('test'); - expect(typeof str).toBe('string'); - - const objectCodecInstance = stringCodec<{ id: number; name: string }>(); - const obj = objectCodecInstance.decode('{"id":1,"name":"test"}'); - expect(obj).toEqual({ id: 1, name: 'test' }); - - const unionCodecInstance = stringCodec(); - expect(unionCodecInstance.decode('string')).toBe('string'); - expect(unionCodecInstance.decode('[1,2,3]')).toEqual([1, 2, 3]); }); - it('should handle special JSON values', () => { - const codec = stringCodec(); + it('handles special JSON values', () => { + const codec = stringCodec(); 
expect(codec.decode('null')).toBeNull(); expect(codec.decode('true')).toBeTrue(); expect(codec.decode('false')).toBeFalse(); - expect(codec.decode('"quoted string"')).toBe('quoted string'); expect(codec.decode('42')).toBe(42); }); -}); - -describe('getShardId', () => { - it('should generate shard ID with readable timestamp', () => { - const result = getShardId(); - - expect(result).toMatch(WAL_ID_PATTERNS.SHARD_ID); - expect(result).toStartWith('20231114-221320-000.'); - }); - - it('should generate different shard IDs for different calls', () => { - const result1 = getShardId(); - const result2 = getShardId(); - - expect(result1).not.toBe(result2); - expect(result1).toStartWith('20231114-221320-000.'); - expect(result2).toStartWith('20231114-221320-000.'); - }); - - it('should handle zero values', () => { - const result = getShardId(); - expect(result).toStartWith('20231114-221320-000.'); - }); - - it('should handle negative timestamps', () => { - const result = getShardId(); - - expect(result).toStartWith('20231114-221320-000.'); - }); - - it('should handle large timestamps', () => { - const result = getShardId(); - - expect(result).toStartWith('20231114-221320-000.'); - }); - - it('should generate incrementing counter', () => { - const result1 = getShardId(); - const result2 = getShardId(); - const parts1 = result1.split('.'); - const parts2 = result2.split('.'); - const counter1 = parts1.at(-1) as string; - const counter2 = parts2.at(-1) as string; + it('round-trips values correctly', () => { + const stringCodecInstance = stringCodec(); + const original = 'hello world'; + expect( + stringCodecInstance.decode(stringCodecInstance.encode(original)), + ).toBe(original); - expect(Number.parseInt(counter1, 10)).toBe( - Number.parseInt(counter2, 10) - 1, + const objectCodecInstance = stringCodec(); + const obj = { name: 'test', nested: { value: 123 } }; + expect(objectCodecInstance.decode(objectCodecInstance.encode(obj))).toEqual( + obj, ); }); }); -describe('getShardedGroupId', () => { - it('should work with mocked timeOrigin', () => { - const result = getShardedGroupId(); - - expect(result).toBe('20231114-221320-000'); - expect(result).toMatch(WAL_ID_PATTERNS.GROUP_ID); - }); - - it('should be idempotent within same process', () => { - const result1 = getShardedGroupId(); - const result2 = getShardedGroupId(); - - expect(result1).toBe(result2); - }); -}); - describe('parseWalFormat', () => { - it('should apply all defaults when given empty config', () => { + it('applies all defaults when given empty config', () => { const result = parseWalFormat({}); - expect(result.baseName).toBe('wal'); expect(result.walExtension).toBe('.log'); expect(result.finalExtension).toBe('.log'); @@ -580,441 +366,46 @@ describe('parseWalFormat', () => { expect(typeof result.finalizer).toBe('function'); }); - it('should use provided baseName and default others', () => { - const result = parseWalFormat({ baseName: 'test' }); - - expect(result.baseName).toBe('test'); - expect(result.walExtension).toBe('.log'); - expect(result.finalExtension).toBe('.log'); - }); - - it('should use provided walExtension and default finalExtension to match', () => { - const result = parseWalFormat({ walExtension: '.wal' }); - - expect(result.walExtension).toBe('.wal'); - expect(result.finalExtension).toBe('.wal'); - }); - - it('should use provided finalExtension independently', () => { + it('uses provided parameters and defaults others', () => { + const customCodec = stringCodec(); const result = parseWalFormat({ + baseName: 'test', 
walExtension: '.wal', finalExtension: '.json', + codec: customCodec, }); - + expect(result.baseName).toBe('test'); expect(result.walExtension).toBe('.wal'); expect(result.finalExtension).toBe('.json'); + expect(result.codec.encode('value')).toBe(customCodec.encode('value')); }); - it('should use provided codec', () => { - const customCodec = stringCodec(); - const result = parseWalFormat({ codec: customCodec }); - - expect(result.codec).toBe(customCodec); + it('defaults finalExtension to walExtension when not provided', () => { + const result = parseWalFormat({ walExtension: '.wal' }); + expect(result.walExtension).toBe('.wal'); + expect(result.finalExtension).toBe('.wal'); }); - it('should use custom finalizer function', () => { + it('uses custom finalizer function', () => { const customFinalizer = (records: any[]) => `custom: ${records.length}`; const result = parseWalFormat({ finalizer: customFinalizer }); - expect(result.finalizer(['a', 'b'])).toBe('custom: 2'); }); - it('should work with all custom parameters', () => { - const config = { - baseName: 'my-wal', - walExtension: '.wal', - finalExtension: '.json', - codec: stringCodec(), - finalizer: (records: any[]) => JSON.stringify(records), - }; - - const result = parseWalFormat(config); - - expect(result.baseName).toBe('my-wal'); - expect(result.walExtension).toBe('.wal'); - expect(result.finalExtension).toBe('.json'); - expect(result.codec).toBe(config.codec); - expect(result.finalizer(['test'])).toBe('["test"]'); - }); - - it('should use default finalizer when none provided', () => { - const result = parseWalFormat({ baseName: 'test' }); - expect(result.finalizer(['line1', 'line2'])).toBe('line1\nline2\n'); + it('uses default finalizer when none provided', () => { + const result = parseWalFormat({ baseName: 'test' }); + expect(result.finalizer(['line1', 'line2'])).toBe('"line1"\n"line2"\n'); expect(result.finalizer([])).toBe('\n'); }); - it('should encode objects to JSON strings in default finalizer', () => { - const result = parseWalFormat({ baseName: 'test' }); + it('encodes objects to JSON strings in default finalizer', () => { + const result = parseWalFormat({ baseName: 'test' }); const records = [ { id: 1, name: 'test' }, { id: 2, name: 'test2' }, ]; - const output = result.finalizer(records); - expect(output).toBe('{"id":1,"name":"test"}\n{"id":2,"name":"test2"}\n'); - }); - - it('should handle InvalidEntry in default finalizer', () => { - const result = parseWalFormat({ baseName: 'test' }); - const records: (string | InvalidEntry)[] = [ - 'valid', - { __invalid: true, raw: 'invalid-raw' }, - 'also-valid', - ]; - const output = result.finalizer(records); - expect(output).toBe('valid\ninvalid-raw\nalso-valid\n'); - }); - - it('should encode objects correctly when using default type parameter', () => { - // Test parseWalFormat({}) with default type parameter (object) - const result = parseWalFormat({}); - const records = [ - { id: 1, name: 'test1' }, - { id: 2, name: 'test2' }, - ]; - const output = result.finalizer(records); - // Should be JSON strings, not [object Object] - expect(output).toBe('{"id":1,"name":"test1"}\n{"id":2,"name":"test2"}\n'); - expect(output).not.toContain('[object Object]'); - }); -}); - -describe('isCoordinatorProcess', () => { - it('should return true when env var matches current pid', () => { - const profilerId = `${Math.round(performance.timeOrigin)}${process.pid}.1.0`; - vi.stubEnv('TEST_LEADER_PID', profilerId); - - const result = isCoordinatorProcess('TEST_LEADER_PID', profilerId); - 
expect(result).toBeTrue(); - }); - - it('should return false when env var does not match current profilerId', () => { - const wrongProfilerId = `${Math.round(performance.timeOrigin)}${process.pid}.2.0`; - vi.stubEnv('TEST_LEADER_PID', wrongProfilerId); - - const currentProfilerId = `${Math.round(performance.timeOrigin)}${process.pid}.1.0`; - const result = isCoordinatorProcess('TEST_LEADER_PID', currentProfilerId); - expect(result).toBeFalse(); - }); - - it('should return false when env var is not set', () => { - vi.stubEnv('NON_EXISTENT_VAR', undefined as any); - - const profilerId = `${Math.round(performance.timeOrigin)}${process.pid}.1.0`; - const result = isCoordinatorProcess('NON_EXISTENT_VAR', profilerId); - expect(result).toBeFalse(); - }); - - it('should return false when env var is empty string', () => { - vi.stubEnv('TEST_LEADER_PID', ''); - - const profilerId = `${Math.round(performance.timeOrigin)}${process.pid}.1.0`; - const result = isCoordinatorProcess('TEST_LEADER_PID', profilerId); - expect(result).toBeFalse(); - }); -}); - -describe('setCoordinatorProcess', () => { - beforeEach(() => { - // Clean up any existing TEST_ORIGIN_PID - // eslint-disable-next-line functional/immutable-data - delete process.env['TEST_ORIGIN_PID']; - }); - - it('should set env var when not already set', () => { - expect(process.env['TEST_ORIGIN_PID']).toBeUndefined(); - - const profilerId = `${Math.round(performance.timeOrigin)}${process.pid}.1.0`; - setCoordinatorProcess('TEST_ORIGIN_PID', profilerId); - - expect(process.env['TEST_ORIGIN_PID']).toBe(profilerId); - }); - - it('should not overwrite existing env var', () => { - const existingProfilerId = `${Math.round(performance.timeOrigin)}${process.pid}.1.0`; - const newProfilerId = `${Math.round(performance.timeOrigin)}${process.pid}.2.0`; - - vi.stubEnv('TEST_ORIGIN_PID', existingProfilerId); - setCoordinatorProcess('TEST_ORIGIN_PID', newProfilerId); - - expect(process.env['TEST_ORIGIN_PID']).toBe(existingProfilerId); - }); - - it('should set env var to profiler id', () => { - const profilerId = `${Math.round(performance.timeOrigin)}${process.pid}.1.0`; - setCoordinatorProcess('TEST_ORIGIN_PID', profilerId); - - expect(process.env['TEST_ORIGIN_PID']).toBe(profilerId); - }); -}); - -describe('ShardedWal', () => { - beforeEach(() => { - vol.reset(); - vol.fromJSON({}, MEMFS_VOLUME); - }); - - it('should create instance with directory and format', () => { - const sw = new ShardedWal({ - dir: '/test/shards', - format: { - baseName: 'test-wal', - }, - coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, - }); - - expect(sw).toBeInstanceOf(ShardedWal); - }); - - it('should create shard with correct file path', () => { - const sw = new ShardedWal({ - dir: '/test/shards', - format: { - baseName: 'trace', - walExtension: '.log', - }, - coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, - }); - - const shard = sw.shard('20231114-221320-000.1.2.3'); - expect(shard).toBeInstanceOf(WriteAheadLogFile); - expect(shard.getPath()).toMatchPath( - '/test/shards/20231114-221320-000/trace.20231114-221320-000.1.2.3.log', - ); - }); - - it('should create shard with default shardId when no argument provided', () => { - const sw = new ShardedWal({ - dir: '/test/shards', - format: { - baseName: 'trace', - walExtension: '.log', - }, - coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, - }); - - const shard = sw.shard(); - expect(shard.getPath()).toStartWithPath( - '/test/shards/20231114-221320-000/trace.20231114-221320-000.10001', - ); - 
expect(shard.getPath()).toEndWithPath('.log'); - }); - - it('should list no shard files when directory does not exist', () => { - const sw = new ShardedWal({ - dir: '/nonexistent', - format: { - baseName: 'test-wal', - }, - coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, - }); - const files = (sw as any).shardFiles(); - expect(files).toEqual([]); - }); - - it('should list no shard files when directory is empty', () => { - const sw = new ShardedWal({ - dir: '/empty', - format: { - baseName: 'test-wal', - }, - coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, - }); - // Create the group directory (matches actual getShardedGroupId() output) - vol.mkdirSync('/empty/20231114-221320-000', { recursive: true }); - const files = (sw as any).shardFiles(); - expect(files).toEqual([]); - }); - - it('should list shard files matching extension', () => { - // Note: Real shard IDs look like "1704067200000.12345.1.1" (timestamp.pid.threadId.count) - // These test IDs use simplified format "001.1", "002.2" for predictability - vol.fromJSON({ - '/shards/20231114-221320-000/trace.19700101-000820-001.1.log': 'content1', - '/shards/20231114-221320-000/trace.19700101-000820-002.2.log': 'content2', - '/shards/other.txt': 'not a shard', - }); - - const sw = new ShardedWal({ - dir: '/shards', - format: { - baseName: 'trace', - walExtension: '.log', - }, - coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, - }); - const files = (sw as any).shardFiles(); - - expect(files).toHaveLength(2); - expect(files).toEqual( - expect.arrayContaining([ - expect.pathToMatch( - '/shards/20231114-221320-000/trace.19700101-000820-001.1.log', - ), - expect.pathToMatch( - '/shards/20231114-221320-000/trace.19700101-000820-002.2.log', - ), - ]), - ); - }); - - it('should finalize empty shards to empty result', () => { - const sw = new ShardedWal({ - dir: '/shards', - format: { - baseName: 'final', - finalExtension: '.json', - finalizer: records => `${JSON.stringify(records)}\n`, - }, - coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, - }); - - // Create the group directory - vol.mkdirSync('/shards/20231114-221320-000', { recursive: true }); - sw.finalize(); - - expect( - read('/shards/20231114-221320-000/final.20231114-221320-000.json'), - ).toBe('[]\n'); - }); - - it('should finalize multiple shards into single file', () => { - vol.fromJSON({ - '/shards/20231114-221320-000/merged.20240101-120000-001.1.log': - 'record1\n', - '/shards/20231114-221320-000/merged.20240101-120000-002.2.log': - 'record2\n', - }); - - const sw = new ShardedWal({ - dir: '/shards', - format: { - baseName: 'merged', - walExtension: '.log', - finalExtension: '.json', - finalizer: records => `${JSON.stringify(records)}\n`, - }, - coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, - }); - - sw.finalize(); - - const result = JSON.parse( - read( - '/shards/20231114-221320-000/merged.20231114-221320-000.json', - ).trim(), - ); - expect(result).toEqual(['record1', 'record2']); - }); - - it('should handle invalid entries during finalize', () => { - vol.fromJSON({ - '/shards/20231114-221320-000/final.20240101-120000-001.1.log': 'valid\n', - '/shards/20231114-221320-000/final.20240101-120000-002.2.log': - 'invalid\n', - }); - const tolerantCodec = createTolerantCodec({ - encode: (s: string) => s, - decode: (s: string) => { - if (s === 'invalid') throw new Error('Bad record'); - return s; - }, - }); - - const sw = new ShardedWal({ - dir: '/shards', - format: { - baseName: 'final', - walExtension: '.log', - finalExtension: '.json', - 
codec: tolerantCodec, - finalizer: records => `${JSON.stringify(records)}\n`, - }, - coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, - }); - - sw.finalize(); - - const result = JSON.parse( - read('/shards/20231114-221320-000/final.20231114-221320-000.json').trim(), - ); - expect(result).toHaveLength(2); - expect(result[0]).toBe('valid'); - expect(result[1]).toEqual({ __invalid: true, raw: 'invalid' }); - }); - - it('should cleanup shard files', () => { - vol.fromJSON({ - '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log': - 'content1', - '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.2.log': - 'content2', - }); - const sw = new ShardedWal({ - dir: '/shards', - format: { - baseName: 'test', - walExtension: '.log', - }, - coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, - }); - - expect(vol.toJSON()).toStrictEqual({ - '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log': - 'content1', - '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.2.log': - 'content2', - }); - - sw.cleanup(); - - expect(vol.toJSON()).toStrictEqual({}); - }); - - it('should handle cleanup when some shard files do not exist', () => { - vol.fromJSON({ - '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log': - 'content1', - }); - - const sw = new ShardedWal({ - dir: '/shards', - format: { - baseName: 'test', - walExtension: '.log', - }, - coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, - }); - - vol.unlinkSync( - '/shards/20231114-221320-000/test.20231114-221320-000.10001.2.1.log', - ); - expect(() => sw.cleanup()).not.toThrow(); - }); - - it('should use custom options in finalizer', () => { - vol.fromJSON({ - '/shards/20231114-221320-000/final.20231114-221320-000.10001.2.1.log': - 'record1\n', - }); - - const sw = new ShardedWal({ - dir: '/shards', - format: { - baseName: 'final', - walExtension: '.log', - finalExtension: '.json', - finalizer: (records, opt) => - `${JSON.stringify({ records, meta: opt })}\n`, - }, - coordinatorIdEnvVar: SHARDED_WAL_COORDINATOR_ID_ENV_VAR, - }); - - sw.finalize({ version: '1.0', compressed: true }); - - const result = JSON.parse( - read('/shards/20231114-221320-000/final.20231114-221320-000.json'), + expect(result.finalizer(records)).toBe( + '{"id":1,"name":"test"}\n{"id":2,"name":"test2"}\n', ); - expect(result.records).toEqual(['record1']); - expect(result.meta).toEqual({ version: '1.0', compressed: true }); }); }); diff --git a/testing/test-setup-config/src/lib/vitest-config-factory.ts b/testing/test-setup-config/src/lib/vitest-config-factory.ts index 0723f72b3..4f00b0031 100644 --- a/testing/test-setup-config/src/lib/vitest-config-factory.ts +++ b/testing/test-setup-config/src/lib/vitest-config-factory.ts @@ -13,11 +13,15 @@ function getIncludePatterns(kind: TestKind): string[] { switch (kind) { case 'unit': return [ + 'mocks/**/*.unit.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', 'src/**/*.unit.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', 'src/**/*.type.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', ]; case 'int': - return ['src/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}']; + return [ + 'mocks/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', + 'src/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}', + ]; case 'e2e': return ['tests/**/*.e2e.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}']; } diff --git a/testing/test-setup-config/src/lib/vitest-config-factory.unit.test.ts b/testing/test-setup-config/src/lib/vitest-config-factory.unit.test.ts index df845928a..1484fd048 100644 --- 
a/testing/test-setup-config/src/lib/vitest-config-factory.unit.test.ts
+++ b/testing/test-setup-config/src/lib/vitest-config-factory.unit.test.ts
@@ -26,6 +26,7 @@ describe('createVitestConfig', () => {
       poolOptions: { threads: { singleThread: true } },
       environment: 'node',
       include: [
+        'mocks/**/*.unit.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}',
         'src/**/*.unit.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}',
         'src/**/*.type.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}',
       ],
@@ -126,7 +127,10 @@ describe('createVitestConfig', () => {
       test: expect.objectContaining({
         reporters: ['basic'],
         globals: true,
-        include: ['src/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}'],
+        include: [
+          'mocks/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}',
+          'src/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}',
+        ],
         globalSetup: ['../../global-setup.ts'],
         coverage: expect.objectContaining({
           reportsDirectory: '../../coverage/test-package/int-tests',
@@ -243,10 +247,14 @@ describe('createVitestConfig', () => {
     const expectedIncludes = {
       unit: [
+        'mocks/**/*.unit.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}',
         'src/**/*.unit.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}',
         'src/**/*.type.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}',
       ],
-      int: ['src/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}'],
+      int: [
+        'mocks/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}',
+        'src/**/*.int.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}',
+      ],
       e2e: ['tests/**/*.e2e.test.{js,mjs,cjs,ts,mts,cts,jsx,tsx}'],
     };
diff --git a/testing/test-utils/src/index.ts b/testing/test-utils/src/index.ts
index 38ce50155..f46cff971 100644
--- a/testing/test-utils/src/index.ts
+++ b/testing/test-utils/src/index.ts
@@ -10,5 +10,4 @@ export * from './lib/utils/file-system.js';
 export * from './lib/utils/create-npm-workshpace.js';
 export * from './lib/utils/project-graph.js';
 export * from './lib/utils/test-folder-setup.js';
-export * from './lib/utils/omit-trace-json.js';
 export * from './lib/utils/profiler.mock.js';
diff --git a/testing/test-utils/src/lib/utils/omit-trace-json.ts b/testing/test-utils/src/lib/utils/omit-trace-json.ts
deleted file mode 100644
index e45a72a51..000000000
--- a/testing/test-utils/src/lib/utils/omit-trace-json.ts
+++ /dev/null
@@ -1,207 +0,0 @@
-/**
- * Normalizes trace JSONL files for deterministic snapshot testing.
- *
- * Replaces variable values (pid, tid, ts) with deterministic incremental values
- * while preserving the original order of events.
- *
- * - Assigns incremental IDs to pid fields starting from 10001, 10002, etc.
- * - Assigns incremental IDs to tid fields starting from 1, 2, etc.
- * - Normalizes timestamps by sorting them first to determine incremental order,
- *   then mapping to incremental values starting from mocked epoch clock base,
- *   while preserving the original order of events in the output.
- *
- * @param jsonlContent - JSONL string content (one JSON object per line) or parsed JSON object/array
- * @param baseTimestampUs - Base timestamp in microseconds to start incrementing from (default: 1_700_000_005_000_000)
- * @returns Normalized JSONL string with deterministic pid, tid, and ts values
- */
-export function omitTraceJson(
-  jsonlContent: string | object,
-  baseTimestampUs = 1_700_000_005_000_000,
-): string {
-  if (typeof jsonlContent !== 'string') {
-    const eventsArray = Array.isArray(jsonlContent)
-      ? jsonlContent
-      : [jsonlContent];
-    if (eventsArray.length === 0) {
-      return '';
-    }
-    const events = eventsArray as TraceEvent[];
-    return normalizeAndFormatEvents(events, baseTimestampUs);
-  }
-
-  // Handle string input (JSONL format)
-  const trimmedContent = jsonlContent.trim();
-  if (!trimmedContent) {
-    return jsonlContent;
-  }
-
-  // Parse all events from JSONL
-  const events = trimmedContent
-    .split('\n')
-    .filter(Boolean)
-    .map(line => JSON.parse(line) as TraceEvent);
-
-  if (events.length === 0) {
-    return jsonlContent;
-  }
-
-  return normalizeAndFormatEvents(events, baseTimestampUs);
-}
-
-/**
- * Normalizes trace events and formats them as JSONL.
- */
-function normalizeAndFormatEvents(
-  events: TraceEvent[],
-  baseTimestampUs: number,
-): string {
-  if (events.length === 0) {
-    return '';
-  }
-
-  // Collect unique pid and tid values
-  type Accumulator = {
-    uniquePids: Set<number>;
-    uniqueTids: Set<number>;
-    timestamps: number[];
-    uniqueLocalIds: Set<string>;
-  };
-
-  const { uniquePids, uniqueTids, timestamps, uniqueLocalIds } =
-    events.reduce<Accumulator>(
-      (acc, event) => {
-        const newUniquePids = new Set(acc.uniquePids);
-        const newUniqueTids = new Set(acc.uniqueTids);
-        const newUniqueLocalIds = new Set(acc.uniqueLocalIds);
-
-        if (typeof event.pid === 'number') {
-          newUniquePids.add(event.pid);
-        }
-        if (typeof event.tid === 'number') {
-          newUniqueTids.add(event.tid);
-        }
-
-        const newTimestamps =
-          typeof event.ts === 'number'
-            ? [...acc.timestamps, event.ts]
-            : acc.timestamps;
-
-        // Collect id2.local values
-        if (
-          event.id2 &&
-          typeof event.id2 === 'object' &&
-          'local' in event.id2 &&
-          typeof event.id2.local === 'string'
-        ) {
-          newUniqueLocalIds.add(event.id2.local);
-        }
-
-        return {
-          uniquePids: newUniquePids,
-          uniqueTids: newUniqueTids,
-          timestamps: newTimestamps,
-          uniqueLocalIds: newUniqueLocalIds,
-        };
-      },
-      {
-        uniquePids: new Set(),
-        uniqueTids: new Set(),
-        timestamps: [] as number[],
-        uniqueLocalIds: new Set(),
-      },
-    );
-
-  // Create mappings: original value -> normalized incremental value
-  const pidMap = new Map<number, number>();
-  const tidMap = new Map<number, number>();
-  const localIdMap = new Map<string, string>();
-
-  // Sort unique values to ensure consistent mapping order
-  const sortedPids = [...uniquePids].sort((a, b) => a - b);
-  const sortedTids = [...uniqueTids].sort((a, b) => a - b);
-  const sortedLocalIds = [...uniqueLocalIds].sort();
-
-  // Map pids starting from 10001
-  sortedPids.forEach((pid, index) => {
-    pidMap.set(pid, 10_001 + index);
-  });
-
-  // Map tids starting from 1
-  sortedTids.forEach((tid, index) => {
-    tidMap.set(tid, 1 + index);
-  });
-
-  // Map local IDs starting from "0x1"
-  sortedLocalIds.forEach((localId, index) => {
-    localIdMap.set(localId, `0x${(index + 1).toString(16)}`);
-  });
-
-  // Sort timestamps to determine incremental order
-  const sortedTimestamps = [...timestamps].sort((a, b) => a - b);
-
-  // Map timestamps incrementally starting from baseTimestampUs
-  const tsMap = sortedTimestamps.reduce((map, ts, index) => {
-    if (!map.has(ts)) {
-      return new Map(map).set(ts, baseTimestampUs + index);
-    }
-    return map;
-  }, new Map<number, number>());
-
-  // Normalize events while preserving original order
-  const normalizedEvents = events.map(event => {
-    const pidUpdate =
-      typeof event.pid === 'number' && pidMap.has(event.pid)
-        ? { pid: pidMap.get(event.pid)! }
-        : {};
-
-    const tidUpdate =
-      typeof event.tid === 'number' && tidMap.has(event.tid)
-        ? { tid: tidMap.get(event.tid)! }
-        : {};
-
-    const tsUpdate =
-      typeof event.ts === 'number' && tsMap.has(event.ts)
-        ? { ts: tsMap.get(event.ts)! }
-        : {};
-
-    // Normalize id2.local if present
-    const id2Update =
-      event.id2 &&
-      typeof event.id2 === 'object' &&
-      'local' in event.id2 &&
-      typeof event.id2.local === 'string' &&
-      localIdMap.has(event.id2.local)
-        ? {
-            id2: {
-              ...event.id2,
-              local: localIdMap.get(event.id2.local)!,
-            },
-          }
-        : {};
-
-    return {
-      ...event,
-      ...pidUpdate,
-      ...tidUpdate,
-      ...tsUpdate,
-      ...id2Update,
-    };
-  });
-
-  // Convert back to JSONL format
-  return `${normalizedEvents.map(event => JSON.stringify(event)).join('\n')}\n`;
-}
-
-/**
- * Trace event structure with pid, tid, ts, and id2.local fields.
- */
-type TraceEvent = {
-  pid?: number;
-  tid?: number;
-  ts?: number;
-  id2?: {
-    local?: string;
-    [key: string]: unknown;
-  };
-  [key: string]: unknown;
-};
diff --git a/testing/test-utils/src/lib/utils/omit-trace-json.unit.test.ts b/testing/test-utils/src/lib/utils/omit-trace-json.unit.test.ts
deleted file mode 100644
index dbf5a079a..000000000
--- a/testing/test-utils/src/lib/utils/omit-trace-json.unit.test.ts
+++ /dev/null
@@ -1,235 +0,0 @@
-import { omitTraceJson } from './omit-trace-json.js';
-
-describe('omitTraceJson', () => {
-  it('should return empty string unchanged', () => {
-    expect(omitTraceJson('')).toBe('');
-  });
-
-  it('should return whitespace-only string unchanged', () => {
-    expect(omitTraceJson(' \n\t ')).toBe(' \n\t ');
-  });
-
-  it('should return empty JSONL unchanged', () => {
-    expect(omitTraceJson('\n\n')).toBe('\n\n');
-  });
-
-  it('should return minimal event unchanged', () => {
-    const input = '{"name":"test"}\n';
-    expect(omitTraceJson(input)).toBe(input);
-  });
-
-  it('should normalize pid field starting from 10001', () => {
-    const result = omitTraceJson('{"pid":12345}\n');
-    const parsed = JSON.parse(result.trim());
-    expect(parsed.pid).toBe(10_001);
-  });
-
-  it('should normalize tid field starting from 1', () => {
-    const result = omitTraceJson('{"tid":999}\n');
-    const parsed = JSON.parse(result.trim());
-    expect(parsed.tid).toBe(1);
-  });
-
-  it('should normalize ts field with default baseTimestampUs', () => {
-    const result = omitTraceJson('{"ts":1234567890}\n');
-    const parsed = JSON.parse(result.trim());
-    expect(parsed.ts).toBe(1_700_000_005_000_000);
-  });
-
-  it('should normalize ts field with custom baseTimestampUs', () => {
-    const customBase = 2_000_000_000_000_000;
-    const result = omitTraceJson('{"ts":1234567890}\n', customBase);
-    const parsed = JSON.parse(result.trim());
-    expect(parsed.ts).toBe(customBase);
-  });
-
-  it('should normalize id2.local field starting from 0x1', () => {
-    const result = omitTraceJson('{"id2":{"local":"0xabc123"}}\n');
-    const parsed = JSON.parse(result.trim());
-    expect(parsed.id2.local).toBe('0x1');
-  });
-
-  it('should preserve event order when timestamps are out of order', () => {
-    const input =
-      '{"ts":300,"name":"third"}\n{"ts":100,"name":"first"}\n{"ts":200,"name":"second"}\n';
-    const result = omitTraceJson(input);
-    const events = result
-      .trim()
-      .split('\n')
-      .map(line => JSON.parse(line));
-    expect(events[0].name).toBe('third');
-    expect(events[1].name).toBe('first');
-    expect(events[2].name).toBe('second');
-    expect(events[0].ts).toBe(1_700_000_005_000_002);
-    expect(events[1].ts).toBe(1_700_000_005_000_000);
-    expect(events[2].ts).toBe(1_700_000_005_000_001);
-  });
-
-  it('should preserve event order when PIDs are out of order', () => {
-    const input =
-      '{"pid":300,"name":"third"}\n{"pid":100,"name":"first"}\n{"pid":200,"name":"second"}\n';
-    const result = omitTraceJson(input);
-    const events = result
-      .trim()
-      .split('\n')
-      .map(line => JSON.parse(line));
-    expect(events[0].name).toBe('third');
-    expect(events[1].name).toBe('first');
-    expect(events[2].name).toBe('second');
-    expect(events[0].pid).toBe(10_003);
-    expect(events[1].pid).toBe(10_001);
-    expect(events[2].pid).toBe(10_002);
-  });
-
-  it('should preserve event order when TIDs are out of order', () => {
-    const input =
-      '{"tid":30,"name":"third"}\n{"tid":10,"name":"first"}\n{"tid":20,"name":"second"}\n';
-    const result = omitTraceJson(input);
-    const events = result
-      .trim()
-      .split('\n')
-      .map(line => JSON.parse(line));
-    expect(events[0].name).toBe('third');
-    expect(events[1].name).toBe('first');
-    expect(events[2].name).toBe('second');
-    expect(events[0].tid).toBe(3);
-    expect(events[1].tid).toBe(1);
-    expect(events[2].tid).toBe(2);
-  });
-
-  it('should preserve event order with mixed out-of-order fields', () => {
-    const input =
-      '{"pid":500,"tid":5,"ts":5000,"name":"e"}\n{"pid":100,"tid":1,"ts":1000,"name":"a"}\n{"pid":300,"tid":3,"ts":3000,"name":"c"}\n';
-    const result = omitTraceJson(input);
-    const events = result
-      .trim()
-      .split('\n')
-      .map(line => JSON.parse(line));
-    expect(events.map(e => e.name)).toEqual(['e', 'a', 'c']);
-    expect(events[0].pid).toBe(10_003);
-    expect(events[1].pid).toBe(10_001);
-    expect(events[2].pid).toBe(10_002);
-  });
-
-  it('should not normalize non-number pid values', () => {
-    const input = '{"pid":"string"}\n{"pid":null}\n';
-    const result = omitTraceJson(input);
-    const events = result
-      .trim()
-      .split('\n')
-      .map(line => JSON.parse(line));
-    expect(events[0].pid).toBe('string');
-    expect(events[1].pid).toBeNull();
-  });
-
-  it('should not normalize non-number tid values', () => {
-    const input = '{"tid":"string"}\n{"tid":null}\n';
-    const result = omitTraceJson(input);
-    const events = result
-      .trim()
-      .split('\n')
-      .map(line => JSON.parse(line));
-    expect(events[0].tid).toBe('string');
-    expect(events[1].tid).toBeNull();
-  });
-
-  it('should not normalize non-number ts values', () => {
-    const input = '{"ts":"string"}\n{"ts":null}\n';
-    const result = omitTraceJson(input);
-    const events = result
-      .trim()
-      .split('\n')
-      .map(line => JSON.parse(line));
-    expect(events[0].ts).toBe('string');
-    expect(events[1].ts).toBeNull();
-  });
-
-  it('should not normalize id2.local when id2 is missing', () => {
-    const input = '{"name":"test"}\n';
-    const result = omitTraceJson(input);
-    const parsed = JSON.parse(result.trim());
-    expect(parsed.id2).toBeUndefined();
-  });
-
-  it('should not normalize id2.local when id2 is not an object', () => {
-    const input = '{"id2":"string"}\n{"id2":null}\n';
-    const result = omitTraceJson(input);
-    const events = result
-      .trim()
-      .split('\n')
-      .map(line => JSON.parse(line));
-    expect(events[0].id2).toBe('string');
-    expect(events[1].id2).toBeNull();
-  });
-
-  it('should not normalize id2.local when local is missing', () => {
-    const input = '{"id2":{"other":"value"}}\n';
-    const result = omitTraceJson(input);
-    const parsed = JSON.parse(result.trim());
-    expect(parsed.id2.local).toBeUndefined();
-    expect(parsed.id2.other).toBe('value');
-  });
-
-  it('should not normalize id2.local when local is not a string', () => {
-    const input = '{"id2":{"local":123}}\n{"id2":{"local":null}}\n';
-    const result = omitTraceJson(input);
-    const events = result
-      .trim()
-      .split('\n')
-      .map(line => JSON.parse(line));
-    expect(events[0].id2.local).toBe(123);
-    expect(events[1].id2.local).toBeNull();
-  });
-
-  it('should map duplicate values to same normalized value', () => {
-    const input = '{"pid":100}\n{"pid":200}\n{"pid":100}\n';
-    const result = omitTraceJson(input);
-    const events = result
-      .trim()
-      .split('\n')
-      .map(line => JSON.parse(line));
-    expect(events[0].pid).toBe(10_001);
-    expect(events[1].pid).toBe(10_002);
-    expect(events[2].pid).toBe(10_001);
-  });
-
-  it('should handle duplicate timestamps correctly', () => {
-    const input = '{"ts":1000}\n{"ts":2000}\n{"ts":1000}\n';
-    const result = omitTraceJson(input);
-    const events = result
-      .trim()
-      .split('\n')
-      .map(line => JSON.parse(line));
-    expect(events[0].ts).toBe(1_700_000_005_000_000);
-    expect(events[1].ts).toBe(1_700_000_005_000_002);
-    expect(events[2].ts).toBe(1_700_000_005_000_000);
-  });
-
-  it('should preserve other id2 properties when normalizing local', () => {
-    const input =
-      '{"id2":{"local":"0xabc","other":"value","nested":{"key":123}}}\n';
-    const result = omitTraceJson(input);
-    const parsed = JSON.parse(result.trim());
-    expect(parsed.id2.local).toBe('0x1');
-    expect(parsed.id2.other).toBe('value');
-    expect(parsed.id2.nested).toEqual({ key: 123 });
-  });
-
-  it('should map multiple id2.local values to incremental hex', () => {
-    const input =
-      '{"id2":{"local":"0xabc"}}\n{"id2":{"local":"0xdef"}}\n{"id2":{"local":"0x123"}}\n';
-    const result = omitTraceJson(input);
-    const events = result
-      .trim()
-      .split('\n')
-      .map(line => JSON.parse(line));
-    const locals = events.map(e => e.id2.local).sort();
-    expect(locals).toEqual(['0x1', '0x2', '0x3']);
-  });
-
-  it('should output valid JSONL with trailing newline', () => {
-    const result = omitTraceJson('{"pid":123}\n');
-    expect(result).toMatch(/\n$/);
-    expect(() => JSON.parse(result.trim())).not.toThrow();
-  });
-});
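For anyone replacing the snapshot normalization that this commit removes: the core of the deleted `omitTraceJson` helper was a deterministic renumbering scheme — sort the unique raw values to fix a stable mapping order, then assign incremental replacements from a fixed base, leaving event order untouched. A minimal standalone sketch of that idea (the helper name `buildIncrementalMap` is illustrative, not part of any remaining API):

```ts
// Sketch of the deterministic renumbering used by the removed omitTraceJson:
// sort the unique raw values so the mapping order is stable, then assign
// incremental replacement values starting from a fixed base.
function buildIncrementalMap(
  rawValues: number[],
  base: number,
): Map<number, number> {
  const map = new Map<number, number>();
  [...new Set(rawValues)]
    .sort((a, b) => a - b)
    .forEach((value, index) => map.set(value, base + index));
  return map;
}

// Raw pids 300, 100, 200 always map to 10_003, 10_001, 10_002,
// regardless of the order in which the trace events were emitted.
const pidMap = buildIncrementalMap([300, 100, 200], 10_001);
console.log(pidMap.get(100)); // 10001
console.log(pidMap.get(300)); // 10003
```

Because the mapping is derived from the sorted set of values rather than from encounter order, the output is stable across runs, which is what made the trace snapshots deterministic.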