
fix(logs): optimize LogRecord memory by removing redundant context #4977

Open

ajuijas wants to merge 2 commits into open-telemetry:main from ajuijas:issue-4957-logrecord-memory

Conversation


@ajuijas ajuijas commented Mar 13, 2026

Removed direct storage of the Context object in LogRecord to prevent memory inflation when logs are buffered. Correlation IDs (TraceId, SpanId, TraceFlags) are still preserved.

Resolves #4957

Description

This PR optimizes LogRecord memory usage by removing the redundant storage of the Context object in the API.

Motivation:
Currently, each LogRecord stores a reference to the full Context it was created with. In high-throughput scenarios where logs are emitted within unique and large contexts (e.g., each request having unique baggage), these objects are pinned in memory as long as the logs are buffered in a processor (like BatchLogRecordProcessor). This leads to significant memory inflation that scales with both the number of buffered logs and the size of the contexts.
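The pinning effect described above can be demonstrated with a small, hypothetical sketch (the `Context` and `RecordHoldingContext` classes below are simplified stand-ins, not the actual OpenTelemetry types): as long as a buffered record holds a reference to its context, the context cannot be garbage collected.

```python
import gc
import weakref

class Context(dict):
    """Stand-in for opentelemetry.context.Context (a mapping)."""

class RecordHoldingContext:
    def __init__(self, context):
        self.context = context  # the pre-fix behavior

buffer = []  # simulates the queue of a BatchLogRecordProcessor
ctx = Context(baggage="x" * 1024)
ref = weakref.ref(ctx)
buffer.append(RecordHoldingContext(ctx))
del ctx
gc.collect()
# The context survives for as long as the record sits in the buffer:
assert ref() is not None
buffer.clear()
gc.collect()
assert ref() is None  # released once the buffer is flushed
```

With thousands of buffered records, each pinning a unique multi-kilobyte context, this retention is what drives the memory growth.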

Changes:

  • Removed self.context = context from LogRecord.__init__ in opentelemetry-api.
  • Correlation IDs (trace_id, span_id, trace_flags) are still extracted from the context during initialization, ensuring that trace correlation remains fully functional.
  • Updated SDK tests to remove assertions that check for the presence of the .context attribute.
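A minimal sketch of the shape of the change (the class and `SpanContext` tuple here are illustrative stand-ins; the real API resolves the span via `get_current_span(context).get_span_context()`): the constructor copies out only the fixed-size correlation IDs and no longer keeps a `context` attribute.

```python
from collections import namedtuple

# Hypothetical stand-in for the API's SpanContext.
SpanContext = namedtuple("SpanContext", "trace_id span_id trace_flags")

class LogRecordSketch:
    def __init__(self, context=None, span_context=None):
        # Pre-fix:  self.context = context  (kept the whole Context alive)
        # Post-fix: only fixed-size correlation IDs are copied out.
        sc = span_context if span_context is not None else SpanContext(0, 0, 0)
        self.trace_id = sc.trace_id
        self.span_id = sc.span_id
        self.trace_flags = sc.trace_flags
        # Deliberately no self.context attribute.
```

Because the three IDs are small integers, the per-record footprint is constant regardless of how much baggage the originating context carried.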

Fixes #4957

Type of change

  • Breaking change (fix or feature that would cause existing functionality to not work as expected)

Note

While this removes an attribute from LogRecord, the Logging SDK is currently marked as experimental/unstable. Addressing this architectural memory issue now prevents a major performance hurdle for future stable releases.

How Has This Been Tested?

I performed local benchmarking and ran the full unit test suite to ensure correctness and quantify the memory savings.

  1. Memory Benchmark: Emitted 2,000 logs, each within a unique context containing ~50 unique items (~50KB each).
    • Before Fix: Memory usage increased by ~154.4 MB.
    • After Fix: Memory usage increased by ~1.73 MB.
    • Result: Reduced memory overhead by ~99% in this scenario.
  2. Unit Tests: Executed the full suite of log tests across opentelemetry-api and opentelemetry-sdk.
    • opentelemetry-api/tests/logs/test_log_record.py: PASSED
    • opentelemetry-sdk/tests/logs/test_log_record.py: PASSED
    • opentelemetry-sdk/tests/logs/test_handler.py: PASSED
    • opentelemetry-sdk/tests/logs/test_logs.py: PASSED
    • Total: 43 PASSED, 0 FAILED.
  3. Correlation Verification: Verified that TraceId is still correctly assigned to LogRecord even without the explicit context attribute storage.
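The benchmark above can be approximated with a small `tracemalloc` harness (this is a hypothetical reconstruction at reduced scale, not the script used for the reported numbers; `WithContext` and `WithoutContext` are stand-in record classes):

```python
import tracemalloc

class WithContext:
    def __init__(self, context):
        self.context = context      # pre-fix: pins the whole context dict

class WithoutContext:
    def __init__(self, context):
        self.trace_id = 0x1234      # post-fix: fixed-size fields only
        self.span_id = 0x42
        self.trace_flags = 1

def buffered_memory(record_cls, n=200, items=10):
    """Bytes still held after buffering n records, each constructed
    from a unique context dict of `items` ~100-byte entries."""
    tracemalloc.start()
    buffer = [
        record_cls({f"key-{i}-{k}": "x" * 100 for k in range(items)})
        for i in range(n)
    ]
    held, _peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    del buffer
    return held
```

Comparing `buffered_memory(WithContext)` against `buffered_memory(WithoutContext)` shows the same effect as the PR's benchmark: the per-context dicts are retained only in the pre-fix case, so held memory scales with context size rather than staying roughly constant.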

Does This PR Require a Contrib Repo Change?

  • No.

Checklist:

  • Followed the style guidelines of this project
  • Changelogs have been updated
  • Unit tests have been added/updated
  • Documentation has been updated (N/A - no public documentation references .context on LogRecord)

@ajuijas ajuijas requested a review from a team as a code owner March 13, 2026 10:16

linux-foundation-easycla bot commented Mar 13, 2026

CLA Signed

The committers listed above are authorized under a signed CLA.

  • ✅ login: lzchen / name: Leighton Chen (2a3c956)

@ajuijas ajuijas force-pushed the issue-4957-logrecord-memory branch from 45c7255 to c77ecd0 Compare March 13, 2026 10:18


Development

Successfully merging this pull request may close these issues:

  • why does LogRecord store the entire context it was called with (#4957)
