Update npm package effect to v3.20.0 [SECURITY]#8565

Open
hash-worker[bot] wants to merge 1 commit into main from deps/js/npm-effect-vulnerability

Conversation


@hash-worker hash-worker bot commented Mar 20, 2026

This PR contains the following updates:

Package | Change | Age | Confidence
effect (source) | 3.18.4 -> 3.20.0 | age | confidence

Warning

Some dependencies could not be looked up. Check the Dependency Dashboard for more information.

GitHub Vulnerability Alerts

CVE-2026-32887

Versions

  • effect: 3.19.15
  • @effect/rpc: 0.72.1
  • @effect/platform: 0.94.2
  • Node.js: v22.20.0
  • Vercel runtime with Fluid compute
  • Next.js: 16 (App Router)
  • @clerk/nextjs: 6.x

Root cause

Effect's MixedScheduler batches fiber continuations and drains them inside a single microtask or timer callback. The AsyncLocalStorage context active during that callback belongs to whichever request first triggered the scheduler's drain cycle — not the request that owns the fiber being resumed.

Detailed mechanism

1. Scheduler batching (effect/src/Scheduler.ts, MixedScheduler)

// MixedScheduler.starve() — called once when first task is scheduled
private starve(depth = 0) {
  if (depth >= this.maxNextTickBeforeTimer) {
    setTimeout(() => this.starveInternal(0), 0)       // timer queue
  } else {
    Promise.resolve(void 0).then(() => this.starveInternal(depth + 1)) // microtask queue
  }
}

// MixedScheduler.starveInternal() — drains ALL accumulated tasks in one call
private starveInternal(depth: number) {
  const tasks = this.tasks.buckets
  this.tasks.buckets = []
  for (const [_, toRun] of tasks) {
    for (let i = 0; i < toRun.length; i++) {
      toRun[i]()  // ← Every fiber continuation runs in the SAME ALS context
    }
  }
  // ...
}

scheduleTask only calls starve() when running is false. Subsequent tasks accumulate in this.tasks until starveInternal drains them all. The Promise.then() (or setTimeout) callback inherits the ALS context from whichever call site created it — i.e., whichever request's fiber first set running = true.

Result: Under concurrent load, fiber continuations from Request A and Request B execute inside the same starveInternal call, sharing a single ALS context. If Request A triggered starve(), then Request B's fiber reads Request A's ALS context.
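The sharing can be demonstrated without Effect at all. The sketch below is a hypothetical stand-in for the batching scheduler (all names are invented for illustration): tasks queued while a drain is pending all run inside the single microtask created by the first scheduleTask call, so every continuation inherits the first caller's ALS context.

```typescript
import { AsyncLocalStorage } from "node:async_hooks"

const als = new AsyncLocalStorage<string>()

// Minimal batching scheduler in the spirit of MixedScheduler: the first
// scheduleTask call creates ONE microtask that later drains ALL queued tasks.
const tasks: Array<() => void> = []
let draining = false

const scheduleTask = (task: () => void) => {
  tasks.push(task)
  if (!draining) {
    draining = true
    // This .then() callback inherits the ALS context active here, i.e. the
    // context of whichever "request" scheduled the first task.
    Promise.resolve().then(() => {
      draining = false
      for (const t of tasks.splice(0)) t()
    })
  }
}

const seen: Array<string | undefined> = []

// Two concurrent "requests", each with its own ALS context:
als.run("request-A", () => scheduleTask(() => seen.push(als.getStore())))
als.run("request-B", () => scheduleTask(() => seen.push(als.getStore())))

setTimeout(() => {
  console.log(seen) // ["request-A", "request-A"]: B's task sees A's context
}, 10)
```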

2. toWebHandlerRuntime does not propagate ALS (@effect/platform/src/HttpApp.ts:211-240)

export const toWebHandlerRuntime = <R>(runtime: Runtime.Runtime<R>) => {
  const httpRuntime: Types.Mutable<Runtime.Runtime<R>> = Runtime.make(runtime)
  const run = Runtime.runFork(httpRuntime)
  return <E>(self: Default<E, R | Scope.Scope>, middleware?) => {
    return (request: Request, context?): Promise<Response> =>
      new Promise((resolve) => {
        // Per-request Effect context is correctly set via contextMap:
        const contextMap = new Map<string, any>(runtime.context.unsafeMap)
        const httpServerRequest = ServerRequest.fromWeb(request)
        contextMap.set(ServerRequest.HttpServerRequest.key, httpServerRequest)
        httpRuntime.context = Context.unsafeMake(contextMap)

        // But the fiber is forked without any ALS propagation:
        const fiber = run(httpApp as any)  // ← ALS context is NOT captured or restored
      })
  }
}

Effect's own Context (containing HttpServerRequest) is correctly set per-request. But the Node.js ALS context — which frameworks like Next.js, Clerk, and OpenTelemetry rely on — is not captured at fork time or restored when the fiber's continuations execute.

3. The dangerous pattern this enables

// RPC handler — runs inside an Effect fiber
const handler = Effect.gen(function*() {
  // This calls auth() from @clerk/nextjs/server, which reads from ALS
  const { userId } = yield* Effect.tryPromise({
    try: async () => auth(),  // ← may read WRONG user's session
    catch: () => new UnauthorizedError({ message: "Auth failed" })
  })
  return yield* repository.getUser(userId)
})

The async () => auth() thunk executes when the fiber continuation is scheduled by MixedScheduler. At that point, the ALS context belongs to an arbitrary concurrent request.

Reproduction scenario

Timeline (two concurrent requests to the same toWebHandler endpoint):

T0: Request A arrives → POST handler → webHandler(requestA)
    → Promise executor runs synchronously
    → httpRuntime.context set to A's context
    → fiber A forked, runs first ops synchronously
    → fiber A yields (e.g., at Effect.tryPromise boundary)
    → scheduler.scheduleTask(fiberA_continuation)
    → running=false → starve() called → Promise.resolve().then(drain)
       ↑ ALS context captured = Request A's context

T1: Request B arrives → POST handler → webHandler(requestB)
    → Promise executor runs synchronously
    → httpRuntime.context set to B's context
    → fiber B forked, runs first ops synchronously
    → fiber B yields
    → scheduler.scheduleTask(fiberB_continuation)
    → running=true → task queued, no new starve()

T2: Microtask fires → starveInternal() runs
    → Drains fiberA_continuation → auth() reads ALS → gets A's context ✓
    → Drains fiberB_continuation → auth() reads ALS → gets A's context ✗ ← WRONG USER

Minimal reproduction

import { AsyncLocalStorage } from "node:async_hooks"
import { Effect, Layer } from "effect"
import { RpcServer, RpcSerialization, Rpc, RpcGroup } from "@effect/rpc"
import { HttpServer } from "@effect/platform"
import * as S from "effect/Schema"

// Simulate a framework's ALS (like Next.js / Clerk)
const requestStore = new AsyncLocalStorage<{ userId: string }>()

class GetUser extends Rpc.make("GetUser", {
  success: S.Struct({ userId: S.String, alsUserId: S.String }),
  failure: S.Never,
  payload: {}
}) {}

const MyRpc = RpcGroup.make("MyRpc").add(GetUser)

const MyRpcLive = MyRpc.toLayer(
  RpcGroup.toHandlers(MyRpc, {
    GetUser: () =>
      Effect.gen(function*() {
        // Simulate calling an ALS-dependent API inside an Effect fiber
        const alsResult = yield* Effect.tryPromise({
          try: async () => {
            const store = requestStore.getStore()
            return store?.userId ?? "NONE"
          },
          catch: () => { throw new Error("impossible") }
        })
        return { userId: "from-effect-context", alsUserId: alsResult }
      })
  })
)

const RpcLayer = MyRpcLive.pipe(
  Layer.provideMerge(RpcSerialization.layerJson),
  Layer.provideMerge(HttpServer.layerContext)
)

const { handler } = RpcServer.toWebHandler(MyRpc, { layer: RpcLayer })

// Simulate two concurrent requests with different ALS contexts
async function main() {
  const results = await Promise.all([
    requestStore.run({ userId: "user-A" }, () => handler(makeRpcRequest("GetUser"))),
    requestStore.run({ userId: "user-B" }, () => handler(makeRpcRequest("GetUser"))),
  ])

  // Parse responses and check if alsUserId matches the expected user
  // Under the bug: both responses may show "user-A" (or one shows the other's)
  for (const res of results) {
    console.log(await res.json())
  }
}

Impact

Symptom | Severity
auth() returns wrong user's session | Critical — authentication bypass
cookies() / headers() from Next.js read wrong request | High — data leakage
OpenTelemetry trace context crosses requests | Medium — incorrect traces
Works locally, fails in production | Hard to diagnose — only manifests under concurrent load

Workaround

Capture ALS-dependent values before entering the Effect runtime and pass them via Effect's own context system:

// In the route handler — OUTSIDE the Effect fiber (ALS is correct here)
export const POST = async (request: Request) => {
  const { userId } = await auth()  // ← Safe: still in Next.js ALS context

  // Inject into request headers or use the `context` parameter
  const headers = new Headers(request.headers)
  headers.set("x-clerk-auth-user-id", userId ?? "")
  const enrichedRequest = new Request(request.url, {
    method: request.method,
    headers,
    body: request.body,
    duplex: "half" as any,
  })

  return webHandler(enrichedRequest)
}

// In Effect handlers — read from HttpServerRequest headers instead of calling auth()
const getAuthenticatedUserId = Effect.gen(function*() {
  const req = yield* HttpServerRequest.HttpServerRequest
  const userId = req.headers["x-clerk-auth-user-id"]
  if (!userId) return yield* Effect.fail(new UnauthorizedError({ message: "Auth required" }))
  return userId
})
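A quick sanity check of the header-injection step, using only the Request and Headers globals available in Node 18+ (the URL and user id below are placeholders):

```typescript
// Build an enriched copy of an incoming request that carries the resolved
// user id; the original request is left untouched.
const original = new Request("https://example.com/api/rpc/demo", { method: "POST" })

const headers = new Headers(original.headers)
headers.set("x-clerk-auth-user-id", "user_123")

const enriched = new Request(original.url, {
  method: original.method,
  headers
})

console.log(enriched.headers.get("x-clerk-auth-user-id")) // "user_123"
console.log(original.headers.get("x-clerk-auth-user-id")) // null
```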

Suggested fix (for Effect maintainers)

Option A: Propagate ALS context through the scheduler

Capture the AsyncLocalStorage snapshot when a fiber continuation is scheduled, and restore it when the continuation executes:

// In MixedScheduler or the fiber runtime
import { AsyncLocalStorage } from "node:async_hooks"

scheduleTask(task: Task, priority: number) {
  // Capture current ALS context
  const snapshot = AsyncLocalStorage.snapshot()
  this.tasks.scheduleTask(() => snapshot(task), priority)
  // ...
}

AsyncLocalStorage.snapshot() (Node.js 20.5+) returns a function that, when called, restores the ALS context from the point of capture. This ensures each fiber continuation runs with its originating request's ALS context.

Trade-off: Adds one closure allocation per scheduled task. Could be opt-in via a FiberRef or scheduler option.
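A minimal, Effect-free illustration of the snapshot semantics:

```typescript
import { AsyncLocalStorage } from "node:async_hooks"

const als = new AsyncLocalStorage<string>()

// snapshot() captures the execution context active at the call site and
// returns a wrapper that restores it around any later invocation.
const restoreA = als.run("request-A", () => AsyncLocalStorage.snapshot())

// Even when invoked from inside another request's context, the wrapped
// function observes the captured context:
const seenInsideB = als.run("request-B", () => restoreA(() => als.getStore()))
const seenOutside = restoreA(() => als.getStore())

console.log(seenInsideB) // "request-A"
console.log(seenOutside) // "request-A"
```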

Option B: Capture ALS at runFork and restore per fiber step

When Runtime.runFork is called, capture the ALS snapshot and associate it with the fiber. Before each fiber step (in the fiber runtime's evaluateEffect loop), restore the snapshot.

Trade-off: More invasive but provides correct ALS propagation for the fiber's entire lifetime, including across flatMap chains and Effect.tryPromise thunks.

Option C: Document the limitation and provide a context injection API

If ALS propagation is intentionally not supported, document this prominently and provide a first-class API for toWebHandler to accept per-request context. The existing context?: Context.Context<never> parameter on the handler function partially addresses this, but it requires callers to know about the issue and manually extract values before entering Effect.

Related

POC replica of my setup

// Create web handler from Effect RPC
// sharedMemoMap ensures all RPC routes share the same connection pool
const { handler: webHandler, dispose } = RpcServer.toWebHandler(DemoRpc, {
  layer: RpcLayer,
  memoMap: sharedMemoMap,
});

/**
 * POST /api/rpc/demo
 */
export const POST = async (request: Request) => {
  return webHandler(request);
};

registerDispose(dispose);

Used util functions


/**
 * Creates a dispose registry that collects dispose callbacks and runs them
 * when `runAll` is invoked. Handles both sync and async dispose functions,
 * catching errors to prevent one failing dispose from breaking others.
 *
 * @internal Exported for testing — use `registerDispose` in application code.
 */
import { globalValue } from "effect/GlobalValue"

export const makeDisposeRegistry = () => {
  const disposeFns: Array<() => void | Promise<void>> = []

  const runAll = () => {
    for (const fn of disposeFns) {
      try {
        const result = fn()
        if (result && typeof result.then === "function") {
          result.then(undefined, (err: unknown) => console.error("Dispose error:", err))
        }
      } catch (err) {
        console.error("Dispose error:", err)
      }
    }
  }

  const register = (dispose: () => void | Promise<void>) => {
    disposeFns.push(dispose)
  }

  return { register, runAll }
}

export const registerDispose: (dispose: () => void | Promise<void>) => void = globalValue(
  Symbol.for("@global/RegisterDispose"),
  () => {
    const registry = makeDisposeRegistry()

    if (typeof process !== "undefined") {
      process.once("beforeExit", registry.runAll)
    }

    return registry.register
  }
)
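For example, one failing dispose callback does not stop the others from running. A condensed copy of makeDisposeRegistry is inlined below so the sketch runs standalone; the Promise check is typed slightly differently but behaves the same.

```typescript
const makeDisposeRegistry = () => {
  const disposeFns: Array<() => void | Promise<void>> = []

  const runAll = () => {
    for (const fn of disposeFns) {
      try {
        const result: unknown = fn()
        if (result instanceof Promise) {
          // Async dispose errors are logged, never rethrown.
          result.catch((err: unknown) => console.error("Dispose error:", err))
        }
      } catch (err) {
        console.error("Dispose error:", err)
      }
    }
  }

  const register = (dispose: () => void | Promise<void>) => {
    disposeFns.push(dispose)
  }

  return { register, runAll }
}

const calls: Array<string> = []
const { register, runAll } = makeDisposeRegistry()

register(() => { calls.push("first") })
register(() => { throw new Error("boom") })   // caught and logged, not rethrown
register(async () => { calls.push("third") }) // async dispose is also accepted

runAll()
console.log(calls) // ["first", "third"]
```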

The actual effect that was run within the RPC context in which the bug was found

export const getAuthenticatedUserId: Effect.Effect<string, UnauthorizedError> =
  Effect.gen(function*() {
    const authResult = yield* Effect.tryPromise({
      try: async () => auth(),
      catch: () =>
        new UnauthorizedError({
          message: "Failed to get auth session"
        })
    })

    if (!authResult.userId) {
      return yield* Effect.fail(
        new UnauthorizedError({
          message: "Authentication required"
        })
      )
    }

    return authResult.userId
  })

Release Notes

Effect-TS/effect (effect)

v3.20.0

Compare Source

Minor Changes
Patch Changes
  • #​6107 fc82e81 Thanks @​gcanti! - Backport Types.VoidIfEmpty to 3.x

  • #​6088 82996bc Thanks @​taylorOntologize! - Schema: fix Schema.omit producing wrong result on Struct with optionalWith({ default }) and index signatures

    getIndexSignatures now handles Transformation AST nodes by delegating to ast.to, matching the existing behavior of getPropertyKeys and getPropertyKeyIndexedAccess. Previously, Schema.omit on a struct combining Schema.optionalWith (with { default }, { as: "Option" }, etc.) and Schema.Record would silently take the wrong code path, returning a Transformation with property signatures instead of a TypeLiteral with index signatures.

  • #​6086 4d97a61 Thanks @​taylorOntologize! - Schema: fix getPropertySignatures crash on Struct with optionalWith({ default }) and other Transformation-producing variants

    SchemaAST.getPropertyKeyIndexedAccess now handles Transformation AST nodes by delegating to ast.to, matching the existing behavior of getPropertyKeys. Previously, calling getPropertySignatures on a Schema.Struct containing Schema.optionalWith with { default }, { as: "Option" }, { nullable: true }, or similar options would throw "Unsupported schema (Transformation)".

  • #​6097 f6b0960 Thanks @​gcanti! - Fix TupleWithRest post-rest validation to check each tail index sequentially.

v3.19.19

Compare Source

Patch Changes

v3.19.18

Compare Source

Patch Changes

v3.19.17

Compare Source

Patch Changes

v3.19.16

Compare Source

Patch Changes
  • #​6018 e71889f Thanks @​codewithkenzo! - fix(Match): handle null/undefined in Match.tag and Match.tagStartsWith

    Added null checks to discriminator and discriminatorStartsWith predicates to prevent crashes when matching nullable union types.

    Fixes #​6017

v3.19.15

Compare Source

Patch Changes
  • #​5981 7e925ea Thanks @​bxff! - Fix type inference loss in Array.flatten for complex nested structures like unions of Effects with contravariant requirements. Uses distributive indexed access (T[number][number]) in the Flatten type utility and adds const to the flatten generic parameter.

  • #​5970 d7e75d6 Thanks @​KhraksMamtsov! - fix Config.orElseIf signature

  • #​5996 4860d1e Thanks @​parischap! - fix Equal.equals plain object comparisons in structural mode

v3.19.14

Compare Source

Patch Changes

v3.19.13

Compare Source

Patch Changes

v3.19.12

Compare Source

Patch Changes

v3.19.11

Compare Source

Patch Changes
  • #​5888 38abd67 Thanks @​gcanti! - filter non-JSON values from schema examples and defaults, closes #​5884

    Introduce JsonValue type and update JsonSchemaAnnotations to use it for
    type safety. Add validation to filter invalid values (BigInt, cyclic refs)
    from examples and defaults, preventing infinite recursion on cycles.

  • #​5885 44e0b04 Thanks @​gcanti! - feat(JSONSchema): add missing options for target JSON Schema version in make function, closes #​5883

v3.19.10

Compare Source

Patch Changes

v3.19.9

Compare Source

Patch Changes

v3.19.8

Compare Source

Patch Changes
  • #​5815 f03b8e5 Thanks @​lokhmakov! - Prevent multiple iterations over the same Iterable in Array.intersectionWith and Array.differenceWith

v3.19.7

Compare Source

Patch Changes

v3.19.6

Compare Source

Patch Changes

v3.19.5

Compare Source

Patch Changes

v3.19.4

Compare Source

Patch Changes
  • #​5752 f445b87 Thanks @​janglad! - Fix Types.DeepMutable mapping over functions

  • #​5757 d2b68ac Thanks @​tim-smart! - add experimental PartitionedSemaphore module

    A PartitionedSemaphore is a concurrency primitive that can be used to
    control concurrent access to a resource across multiple partitions identified
    by keys.

    The total number of permits is shared across all partitions, with waiting
    permits equally distributed among partitions using a round-robin strategy.

    This is useful when you want to limit the total number of concurrent accesses
    to a resource, while still allowing for fair distribution of access across
    different partitions.

    import { Effect, PartitionedSemaphore } from "effect"
    
    Effect.gen(function* () {
      const semaphore = yield* PartitionedSemaphore.make<string>({ permits: 5 })
    
      // Take the first 5 permits with key "A", then the following permits will be
      // equally distributed between all the keys using a round-robin strategy
      yield* Effect.log("A").pipe(
        Effect.delay(1000),
        semaphore.withPermits("A", 1),
        Effect.replicateEffect(15, { concurrency: "unbounded" }),
        Effect.fork
      )
      yield* Effect.log("B").pipe(
        Effect.delay(1000),
        semaphore.withPermits("B", 1),
        Effect.replicateEffect(10, { concurrency: "unbounded" }),
        Effect.fork
      )
      yield* Effect.log("C").pipe(
        Effect.delay(1000),
        semaphore.withPermits("C", 1),
        Effect.replicateEffect(10, { concurrency: "unbounded" }),
        Effect.fork
      )
    
      return yield* Effect.never
    }).pipe(Effect.runFork)

v3.19.3

Compare Source

Patch Changes

v3.19.2

Compare Source

Patch Changes

v3.19.1

Compare Source

Patch Changes

v3.19.0

Compare Source

Minor Changes
Patch Changes

v3.18.5

Compare Source

Patch Changes
  • #​5669 a537469 Thanks @​fubhy! - Fix Graph.neighbors() returning self-loops in undirected graphs.

    Graph.neighbors() now correctly returns the other endpoint for undirected graphs instead of always returning edge.target, which caused nodes to appear as their own neighbors when queried from the target side of an edge.

  • #​5628 52d5963 Thanks @​mikearnaldi! - Make sure AsEffect is computed

  • #​5671 463345d Thanks @​gcanti! - JSON Schema generation: add jsonSchema2020-12 target and fix tuple output for:

    • JSON Schema 2019-09
    • OpenAPI 3.1

Configuration

📅 Schedule: Branch creation - "" (UTC), Automerge - "before 4am every weekday,every weekend" (UTC).

🚦 Automerge: Enabled.

Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.

🔕 Ignore: Close this PR and you won't be reminded about this update again.


  • If you want to rebase/retry this PR, check this box

This PR has been generated by Renovate Bot.


vercel bot commented Mar 20, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project | Deployment | Actions | Updated (UTC)
hash | Ready | Preview, Comment | Mar 20, 2026 10:05pm
hashdotdesign | Ready | Preview, Comment | Mar 20, 2026 10:05pm
hashdotdesign-tokens | Ready | Preview, Comment | Mar 20, 2026 10:05pm
petrinaut | Ready | Preview, Comment | Mar 20, 2026 10:05pm


cursor bot commented Mar 20, 2026

PR Summary

Medium Risk
Upgrading a core runtime library (effect) can subtly change scheduling/async behavior across services that depend on it. Risk is mitigated by being a targeted version bump with lockfile updates only.

Overview
Updates the effect dependency from 3.18.4 to 3.20.0 across the API app and shared TypeScript packages (@local/eslint, @local/hash-graph-sdk, @local/harpc-client).

Regenerates yarn.lock to pull in the new effect@3.20.0 resolution/checksum.

Written by Cursor Bugbot for commit 437b27c. This will update automatically on new commits. Configure here.

@github-actions github-actions bot added area/deps Relates to third-party dependencies (area) area/apps > hash* Affects HASH (a `hash-*` app) area/apps > hash-api Affects the HASH API (app) area/libs Relates to first-party libraries/crates/packages (area) type/eng > backend Owned by the @backend team area/apps labels Mar 20, 2026

augmentcode bot commented Mar 20, 2026

🤖 Augment PR Summary

Summary: Updates the effect dependency to v3.20.0 to incorporate the upstream fix for AsyncLocalStorage isolation (CVE-2026-32887 / GHSA-38f7-945m-qr2g).
Changes: Bumps effect from 3.18.4 to 3.20.0 in the affected workspace package.json files (with corresponding lockfile update).

🤖 Was this summary useful? React with 👍 or 👎


@augmentcode augmentcode bot left a comment


Review completed. No suggestions at this time.

Comment augment review to trigger a new review at any time.


codspeed-hq bot commented Mar 20, 2026

Merging this PR will not alter performance

✅ 80 untouched benchmarks


Comparing deps/js/npm-effect-vulnerability (437b27c) with main (1512579)¹

Open in CodSpeed

Footnotes

  1. No successful run was found on main (618964d) during the generation of this report, so 1512579 was used instead as the comparison base. There might be some changes unrelated to this pull request in this report.


codecov bot commented Mar 20, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 62.49%. Comparing base (618964d) to head (437b27c).

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #8565      +/-   ##
==========================================
- Coverage   62.60%   62.49%   -0.11%     
==========================================
  Files        1317     1318       +1     
  Lines      133975   134209     +234     
  Branches     5517     5517              
==========================================
+ Hits        83877    83878       +1     
- Misses      49183    49416     +233     
  Partials      915      915              
Flag Coverage Δ
apps.hash-ai-worker-ts 1.40% <ø> (ø)
apps.hash-api 0.00% <ø> (ø)
blockprotocol.type-system 40.84% <ø> (ø)
local.claude-hooks 0.00% <ø> (ø)
local.harpc-client 51.24% <ø> (ø)
local.hash-graph-sdk 9.63% <ø> (ø)
local.hash-isomorphic-utils 0.00% <ø> (ø)
rust.antsi 0.00% <ø> (ø)
rust.error-stack 90.88% <ø> (ø)
rust.harpc-codec 84.70% <ø> (ø)
rust.harpc-net 96.18% <ø> (+0.01%) ⬆️
rust.harpc-tower 66.80% <ø> (ø)
rust.harpc-types 0.00% <ø> (ø)
rust.harpc-wire-protocol 92.23% <ø> (ø)
rust.hash-codec 72.76% <ø> (ø)
rust.hash-graph-api 2.52% <ø> (ø)
rust.hash-graph-authorization 62.34% <ø> (ø)
rust.hash-graph-postgres-store 26.39% <ø> (-0.34%) ⬇️
rust.hash-graph-store 37.76% <ø> (-0.13%) ⬇️
rust.hash-graph-temporal-versioning 47.95% <ø> (ø)
rust.hash-graph-types 0.00% <ø> (ø)
rust.hash-graph-validation 83.45% <ø> (ø)
rust.hashql-ast 87.23% <ø> (ø)
rust.hashql-compiletest 29.69% <ø> (ø)
rust.hashql-core 82.29% <ø> (ø)
rust.hashql-diagnostics 72.43% <ø> (ø)
rust.hashql-eval 69.13% <ø> (ø)
rust.hashql-hir 89.06% <ø> (ø)
rust.hashql-mir 92.64% <ø> (ø)
rust.hashql-syntax-jexpr 94.05% <ø> (ø)

Flags with carried forward coverage won't be shown. Click here to find out more.

☔ View full report in Codecov by Sentry.
📢 Have feedback on the report? Share it here.


@graphite-app graphite-app bot requested a review from a team March 20, 2026 22:17
@github-actions
Contributor

Benchmark results

@rust/hash-graph-benches – Integrations

policy_resolution_large

Function Value Mean Flame graphs
resolve_policies_for_actor user: empty, selectivity: high, policies: 2002 $$28.5 \mathrm{ms} \pm 169 \mathrm{μs}\left({\color{gray}-0.354 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: empty, selectivity: low, policies: 1 $$3.60 \mathrm{ms} \pm 21.4 \mathrm{μs}\left({\color{gray}-1.874 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: empty, selectivity: medium, policies: 1001 $$13.9 \mathrm{ms} \pm 91.1 \mathrm{μs}\left({\color{gray}-1.208 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: seeded, selectivity: high, policies: 3314 $$45.2 \mathrm{ms} \pm 307 \mathrm{μs}\left({\color{gray}0.957 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: seeded, selectivity: low, policies: 1 $$16.0 \mathrm{ms} \pm 121 \mathrm{μs}\left({\color{gray}-0.342 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: seeded, selectivity: medium, policies: 1526 $$26.4 \mathrm{ms} \pm 260 \mathrm{μs}\left({\color{gray}1.53 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: system, selectivity: high, policies: 2078 $$29.4 \mathrm{ms} \pm 163 \mathrm{μs}\left({\color{gray}-1.132 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: system, selectivity: low, policies: 1 $$3.92 \mathrm{ms} \pm 21.8 \mathrm{μs}\left({\color{gray}-1.518 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: system, selectivity: medium, policies: 1033 $$15.2 \mathrm{ms} \pm 112 \mathrm{μs}\left({\color{gray}-0.060 \mathrm{\%}}\right) $$ Flame Graph

policy_resolution_medium

Function Value Mean Flame graphs
resolve_policies_for_actor user: empty, selectivity: high, policies: 102 $$3.99 \mathrm{ms} \pm 21.6 \mathrm{μs}\left({\color{gray}0.087 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: empty, selectivity: low, policies: 1 $$3.24 \mathrm{ms} \pm 24.4 \mathrm{μs}\left({\color{gray}1.68 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: empty, selectivity: medium, policies: 51 $$3.55 \mathrm{ms} \pm 19.1 \mathrm{μs}\left({\color{gray}0.390 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: seeded, selectivity: high, policies: 269 $$5.49 \mathrm{ms} \pm 30.7 \mathrm{μs}\left({\color{gray}0.804 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: seeded, selectivity: low, policies: 1 $$3.83 \mathrm{ms} \pm 21.0 \mathrm{μs}\left({\color{gray}1.27 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: seeded, selectivity: medium, policies: 107 $$4.39 \mathrm{ms} \pm 22.4 \mathrm{μs}\left({\color{gray}0.242 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: system, selectivity: high, policies: 133 $$4.71 \mathrm{ms} \pm 26.2 \mathrm{μs}\left({\color{gray}1.08 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: system, selectivity: low, policies: 1 $$3.70 \mathrm{ms} \pm 22.3 \mathrm{μs}\left({\color{gray}2.57 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: system, selectivity: medium, policies: 63 $$4.38 \mathrm{ms} \pm 26.7 \mathrm{μs}\left({\color{gray}2.57 \mathrm{\%}}\right) $$ Flame Graph

policy_resolution_none

Function Value Mean Flame graphs
resolve_policies_for_actor user: empty, selectivity: high, policies: 2 $$2.89 \mathrm{ms} \pm 15.3 \mathrm{μs}\left({\color{gray}-1.793 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: empty, selectivity: low, policies: 1 $$2.85 \mathrm{ms} \pm 14.6 \mathrm{μs}\left({\color{gray}-1.233 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: empty, selectivity: medium, policies: 1 $$2.97 \mathrm{ms} \pm 22.7 \mathrm{μs}\left({\color{gray}-0.551 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: system, selectivity: high, policies: 8 $$3.25 \mathrm{ms} \pm 18.0 \mathrm{μs}\left({\color{gray}-0.168 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: system, selectivity: low, policies: 1 $$3.06 \mathrm{ms} \pm 16.3 \mathrm{μs}\left({\color{gray}-0.561 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: system, selectivity: medium, policies: 3 $$3.37 \mathrm{ms} \pm 21.5 \mathrm{μs}\left({\color{gray}0.605 \mathrm{\%}}\right) $$ Flame Graph

policy_resolution_small

Function Value Mean Flame graphs
resolve_policies_for_actor user: empty, selectivity: high, policies: 52 $$3.28 \mathrm{ms} \pm 19.4 \mathrm{μs}\left({\color{gray}0.351 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: empty, selectivity: low, policies: 1 $$3.01 \mathrm{ms} \pm 17.4 \mathrm{μs}\left({\color{gray}0.180 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: empty, selectivity: medium, policies: 25 $$3.15 \mathrm{ms} \pm 17.1 \mathrm{μs}\left({\color{gray}-0.439 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: seeded, selectivity: high, policies: 94 $$3.73 \mathrm{ms} \pm 28.4 \mathrm{μs}\left({\color{gray}-0.354 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: seeded, selectivity: low, policies: 1 $$3.28 \mathrm{ms} \pm 15.6 \mathrm{μs}\left({\color{gray}0.283 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: seeded, selectivity: medium, policies: 26 $$3.52 \mathrm{ms} \pm 17.7 \mathrm{μs}\left({\color{gray}0.109 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: system, selectivity: high, policies: 66 $$3.69 \mathrm{ms} \pm 25.7 \mathrm{μs}\left({\color{gray}0.272 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: system, selectivity: low, policies: 1 $$3.27 \mathrm{ms} \pm 18.0 \mathrm{μs}\left({\color{gray}0.086 \mathrm{\%}}\right) $$ Flame Graph
resolve_policies_for_actor user: system, selectivity: medium, policies: 29 $$3.56 \mathrm{ms} \pm 23.2 \mathrm{μs}\left({\color{gray}0.336 \mathrm{\%}}\right) $$ Flame Graph

read_scaling_complete

Function Value Mean Flame graphs
entity_by_id;one_depth 1 entities $$47.8 \mathrm{ms} \pm 221 \mathrm{μs}\left({\color{gray}-1.142 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;one_depth 10 entities $$87.5 \mathrm{ms} \pm 492 \mathrm{μs}\left({\color{gray}0.926 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;one_depth 25 entities $$52.6 \mathrm{ms} \pm 291 \mathrm{μs}\left({\color{gray}-2.973 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;one_depth 5 entities $$57.2 \mathrm{ms} \pm 406 \mathrm{μs}\left({\color{gray}4.78 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;one_depth 50 entities $$66.9 \mathrm{ms} \pm 384 \mathrm{μs}\left({\color{gray}2.73 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;two_depth 1 entities $$49.9 \mathrm{ms} \pm 264 \mathrm{μs}\left({\color{gray}-0.085 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;two_depth 10 entities $$435 \mathrm{ms} \pm 1.11 \mathrm{ms}\left({\color{gray}0.681 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;two_depth 25 entities $$107 \mathrm{ms} \pm 638 \mathrm{μs}\left({\color{gray}1.12 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;two_depth 5 entities $$95.1 \mathrm{ms} \pm 349 \mathrm{μs}\left({\color{gray}0.747 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;two_depth 50 entities $$301 \mathrm{ms} \pm 1.00 \mathrm{ms}\left({\color{gray}-0.379 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;zero_depth 1 entities $$21.1 \mathrm{ms} \pm 121 \mathrm{μs}\left({\color{gray}0.058 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;zero_depth 10 entities $$21.5 \mathrm{ms} \pm 131 \mathrm{μs}\left({\color{gray}0.780 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;zero_depth 25 entities $$21.6 \mathrm{ms} \pm 122 \mathrm{μs}\left({\color{gray}-0.196 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;zero_depth 5 entities $$21.6 \mathrm{ms} \pm 124 \mathrm{μs}\left({\color{gray}0.003 \mathrm{\%}}\right) $$ Flame Graph
entity_by_id;zero_depth 50 entities $$26.4 \mathrm{ms} \pm 119 \mathrm{μs}\left({\color{gray}2.55 \mathrm{\%}}\right) $$ Flame Graph

read_scaling_linkless

| Function | Value | Mean | Flame graphs |
|---|---|---|---|
| entity_by_id | 1 entities | $$20.7 \mathrm{ms} \pm 104 \mathrm{μs}\left({\color{gray}-2.025 \mathrm{\%}}\right)$$ | Flame Graph |
| entity_by_id | 10 entities | $$20.8 \mathrm{ms} \pm 96.9 \mathrm{μs}\left({\color{gray}0.117 \mathrm{\%}}\right)$$ | Flame Graph |
| entity_by_id | 100 entities | $$20.9 \mathrm{ms} \pm 115 \mathrm{μs}\left({\color{gray}1.08 \mathrm{\%}}\right)$$ | Flame Graph |
| entity_by_id | 1000 entities | $$21.2 \mathrm{ms} \pm 108 \mathrm{μs}\left({\color{gray}-1.079 \mathrm{\%}}\right)$$ | Flame Graph |
| entity_by_id | 10000 entities | $$29.0 \mathrm{ms} \pm 227 \mathrm{μs}\left({\color{gray}4.69 \mathrm{\%}}\right)$$ | Flame Graph |

representative_read_entity

| Function | Value | Mean | Flame graphs |
|---|---|---|---|
| entity_by_id | entity type ID: https://blockprotocol.org/@alice/types/entity-type/block/v/1 | $$39.0 \mathrm{ms} \pm 313 \mathrm{μs}\left({\color{gray}2.04 \mathrm{\%}}\right)$$ | Flame Graph |
| entity_by_id | entity type ID: https://blockprotocol.org/@alice/types/entity-type/book/v/1 | $$38.5 \mathrm{ms} \pm 321 \mathrm{μs}\left({\color{gray}4.78 \mathrm{\%}}\right)$$ | Flame Graph |
| entity_by_id | entity type ID: https://blockprotocol.org/@alice/types/entity-type/building/v/1 | $$37.0 \mathrm{ms} \pm 272 \mathrm{μs}\left({\color{gray}3.92 \mathrm{\%}}\right)$$ | Flame Graph |
| entity_by_id | entity type ID: https://blockprotocol.org/@alice/types/entity-type/organization/v/1 | $$38.2 \mathrm{ms} \pm 299 \mathrm{μs}\left({\color{red}5.46 \mathrm{\%}}\right)$$ | Flame Graph |
| entity_by_id | entity type ID: https://blockprotocol.org/@alice/types/entity-type/page/v/2 | $$38.3 \mathrm{ms} \pm 313 \mathrm{μs}\left({\color{red}6.70 \mathrm{\%}}\right)$$ | Flame Graph |
| entity_by_id | entity type ID: https://blockprotocol.org/@alice/types/entity-type/person/v/1 | $$37.1 \mathrm{ms} \pm 357 \mathrm{μs}\left({\color{gray}1.88 \mathrm{\%}}\right)$$ | Flame Graph |
| entity_by_id | entity type ID: https://blockprotocol.org/@alice/types/entity-type/playlist/v/1 | $$36.9 \mathrm{ms} \pm 413 \mathrm{μs}\left({\color{gray}4.29 \mathrm{\%}}\right)$$ | Flame Graph |
| entity_by_id | entity type ID: https://blockprotocol.org/@alice/types/entity-type/song/v/1 | $$37.2 \mathrm{ms} \pm 372 \mathrm{μs}\left({\color{gray}4.37 \mathrm{\%}}\right)$$ | Flame Graph |
| entity_by_id | entity type ID: https://blockprotocol.org/@alice/types/entity-type/uk-address/v/1 | $$37.1 \mathrm{ms} \pm 353 \mathrm{μs}\left({\color{gray}0.010 \mathrm{\%}}\right)$$ | Flame Graph |

representative_read_entity_type

| Function | Value | Mean | Flame graphs |
|---|---|---|---|
| get_entity_type_by_id | Account ID: bf5a9ef5-dc3b-43cf-a291-6210c0321eba | $$9.39 \mathrm{ms} \pm 70.6 \mathrm{μs}\left({\color{gray}3.03 \mathrm{\%}}\right)$$ | Flame Graph |

representative_read_multiple_entities

| Function | Value | Mean | Flame graphs |
|---|---|---|---|
| entity_by_property | traversal_paths=0 0 | $$101 \mathrm{ms} \pm 569 \mathrm{μs}\left({\color{gray}1.82 \mathrm{\%}}\right)$$ | |
| entity_by_property | traversal_paths=255 1,resolve_depths=inherit:1;values:255;properties:255;links:127;link_dests:126;type:true | $$156 \mathrm{ms} \pm 745 \mathrm{μs}\left({\color{gray}0.838 \mathrm{\%}}\right)$$ | |
| entity_by_property | traversal_paths=2 1,resolve_depths=inherit:0;values:0;properties:0;links:0;link_dests:0;type:false | $$107 \mathrm{ms} \pm 681 \mathrm{μs}\left({\color{gray}0.193 \mathrm{\%}}\right)$$ | |
| entity_by_property | traversal_paths=2 1,resolve_depths=inherit:0;values:0;properties:0;links:1;link_dests:0;type:true | $$116 \mathrm{ms} \pm 579 \mathrm{μs}\left({\color{gray}0.055 \mathrm{\%}}\right)$$ | |
| entity_by_property | traversal_paths=2 1,resolve_depths=inherit:0;values:0;properties:2;links:1;link_dests:0;type:true | $$129 \mathrm{ms} \pm 596 \mathrm{μs}\left({\color{gray}3.46 \mathrm{\%}}\right)$$ | |
| entity_by_property | traversal_paths=2 1,resolve_depths=inherit:0;values:2;properties:2;links:1;link_dests:0;type:true | $$133 \mathrm{ms} \pm 691 \mathrm{μs}\left({\color{gray}0.044 \mathrm{\%}}\right)$$ | |
| link_by_source_by_property | traversal_paths=0 0 | $$107 \mathrm{ms} \pm 662 \mathrm{μs}\left({\color{gray}1.48 \mathrm{\%}}\right)$$ | |
| link_by_source_by_property | traversal_paths=255 1,resolve_depths=inherit:1;values:255;properties:255;links:127;link_dests:126;type:true | $$137 \mathrm{ms} \pm 670 \mathrm{μs}\left({\color{gray}1.61 \mathrm{\%}}\right)$$ | |
| link_by_source_by_property | traversal_paths=2 1,resolve_depths=inherit:0;values:0;properties:0;links:0;link_dests:0;type:false | $$113 \mathrm{ms} \pm 526 \mathrm{μs}\left({\color{gray}0.455 \mathrm{\%}}\right)$$ | |
| link_by_source_by_property | traversal_paths=2 1,resolve_depths=inherit:0;values:0;properties:0;links:1;link_dests:0;type:true | $$123 \mathrm{ms} \pm 692 \mathrm{μs}\left({\color{gray}1.16 \mathrm{\%}}\right)$$ | |
| link_by_source_by_property | traversal_paths=2 1,resolve_depths=inherit:0;values:0;properties:2;links:1;link_dests:0;type:true | $$127 \mathrm{ms} \pm 697 \mathrm{μs}\left({\color{gray}2.31 \mathrm{\%}}\right)$$ | |
| link_by_source_by_property | traversal_paths=2 1,resolve_depths=inherit:0;values:2;properties:2;links:1;link_dests:0;type:true | $$125 \mathrm{ms} \pm 562 \mathrm{μs}\left({\color{gray}1.15 \mathrm{\%}}\right)$$ | |

scenarios

| Function | Value | Mean | Flame graphs |
|---|---|---|---|
| full_test | query-limited | $$136 \mathrm{ms} \pm 587 \mathrm{μs}\left({\color{gray}-0.493 \mathrm{\%}}\right)$$ | Flame Graph |
| full_test | query-unlimited | $$148 \mathrm{ms} \pm 479 \mathrm{μs}\left({\color{gray}0.669 \mathrm{\%}}\right)$$ | Flame Graph |
| linked_queries | query-limited | $$108 \mathrm{ms} \pm 482 \mathrm{μs}\left({\color{gray}-0.590 \mathrm{\%}}\right)$$ | Flame Graph |
| linked_queries | query-unlimited | $$545 \mathrm{ms} \pm 970 \mathrm{μs}\left({\color{lightgreen}-6.017 \mathrm{\%}}\right)$$ | Flame Graph |
Labels

- `area/apps > hash*`: Affects HASH (a `hash-*` app)
- `area/apps > hash-api`: Affects the HASH API (app)
- `area/apps`
- `area/deps`: Relates to third-party dependencies (area)
- `area/libs`: Relates to first-party libraries/crates/packages (area)
- `type/eng > backend`: Owned by the @backend team
