
Add --log-for human|machine flag for LLM/agent-friendly output #1507

@waldekmastykarz

Description

Summary

Add a new CLI flag --log-for human|machine to make Dev Proxy output consumable by coding agents and LLMs.

Problem

The current console output is optimized for human readability but creates problems when consumed by LLMs/coding agents:

| Problem | Description |
| --- | --- |
| ANSI escape codes | Color codes like \x1B[31m pollute text and waste tokens |
| Unicode box characters | ╭ ╰ │ are visual, not semantic — hard to parse programmatically |
| No structured output | Console is human-readable prose; LLMs need JSON to reliably extract fields |
| Buffered output | Messages are held until the request completes; agents cannot react in real time |
| Terse/informal labels | oops, skip, proc are cute but ambiguous for machines |
| Missing context | Logs like "URL not matched" lack the actual URL |
| No correlation IDs | Hard to trace request→response chains |
| Timestamps hidden | Filtered out by default; agents cannot measure timing |

Example of current output

 req   ╭ GET https://api.example.com/users
 oops  │ 429 TooManyRequests
 warn  │ Exceeded resource limit when calling https://api.example.com/users
       ╰ 

What an agent needs

{"type":"request","method":"GET","url":"https://api.example.com/users","requestId":"1","timestamp":"2026-01-19T10:30:00.000Z"}
{"type":"chaos","plugin":"RateLimitingPlugin","status":429,"message":"TooManyRequests","requestId":"1","timestamp":"2026-01-19T10:30:00.050Z"}
{"type":"warning","plugin":"RateLimitingPlugin","message":"Exceeded resource limit","url":"https://api.example.com/users","requestId":"1","timestamp":"2026-01-19T10:30:00.051Z"}
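Once every entry is a self-describing JSON object on its own line, an agent needs only a few lines of code to consume the stream. A minimal sketch in Python, assuming the field names proposed above (type, requestId, and so on):

```python
import json

def group_by_request(lines):
    """Group JSON Lines log entries by requestId so an agent can
    trace each request -> response/warning chain."""
    by_request = {}
    for line in lines:
        entry = json.loads(line)
        by_request.setdefault(entry["requestId"], []).append(entry)
    return by_request

# The sample entries from above
sample = [
    '{"type":"request","method":"GET","url":"https://api.example.com/users","requestId":"1","timestamp":"2026-01-19T10:30:00.000Z"}',
    '{"type":"chaos","plugin":"RateLimitingPlugin","status":429,"message":"TooManyRequests","requestId":"1","timestamp":"2026-01-19T10:30:00.050Z"}',
]
chains = group_by_request(sample)
```

In practice the agent would read stdin line by line and react to each entry as it arrives, rather than collecting lines first.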

Proposed solution

Add a new flag:

devproxy --log-for human   # Current behavior (default)
devproxy --log-for machine # Structured JSON Lines output

Why --log-for instead of --output json or --format json?

We evaluated common CLI conventions:

| Tool | Flag |
| --- | --- |
| kubectl | -o, --output json\|yaml\|wide |
| Azure CLI | -o, --output json\|table\|tsv |
| GitHub CLI | --json |

However, for LLM consumption, audience-centric naming is more intuitive than format-centric naming:

| Aspect | --log-format json | --log-for machine |
| --- | --- | --- |
| Semantics | Describes the how | Describes the who |
| LLM intuition | "Output in JSON format" | "This output is meant for me" |
| Discoverability | Need to know JSON is the right format | Obvious: "I am a machine, I use machine" |
| Future-proofing | Locked to a specific format | Can evolve what "machine" means internally |

An LLM reading --help naturally self-identifies with --log-for machine.

Machine output requirements

When --log-for machine is set:

  1. JSON Lines format — one JSON object per line (streamable)
  2. No ANSI codes — strip all color/formatting escape sequences
  3. No Unicode decorations — no box-drawing characters
  4. Include correlation IDs — requestId in every log entry
  5. Include timestamps — ISO 8601 format
  6. Full context — URL, method, plugin name in each entry
  7. Semantic message types — request, response, warning, error, mock, chaos (not oops, skip)
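Taken together, these requirements amount to a small logging surface. A hypothetical sketch in Python — the function name and fields are illustrative, not Dev Proxy's actual implementation:

```python
import json
import re
from datetime import datetime, timezone

# Matches ANSI color/formatting escape sequences such as \x1B[31m
ANSI_RE = re.compile(r"\x1B\[[0-9;]*m")

def machine_log_entry(entry_type, message, request_id, **context):
    """Build one JSON Lines entry: no ANSI codes, an ISO 8601 timestamp,
    a requestId for correlation, and full context (url, method, plugin)."""
    entry = {
        "type": entry_type,                                   # semantic type, not "oops"
        "message": ANSI_RE.sub("", message),                  # strip color codes
        "requestId": request_id,                              # correlation ID
        "timestamp": datetime.now(timezone.utc).isoformat(),  # ISO 8601
        **context,                                            # url, method, plugin, ...
    }
    return json.dumps(entry, separators=(",", ":"))           # one object per line
```

Emitting each entry with a plain print() keeps the stream unbuffered and immediately parseable.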

Configuration

Also support in devproxyrc.json:

{
  "logFor": "machine"
}
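A natural precedence rule — sketched below as an assumption, since the proposal does not spell it out — is that an explicit CLI flag overrides devproxyrc.json, which overrides the human default:

```python
import json

def resolve_log_for(cli_value=None, rc_path="devproxyrc.json"):
    """Hypothetical resolution order: --log-for flag > devproxyrc.json > default."""
    if cli_value is not None:
        return cli_value  # explicit flag always wins
    try:
        with open(rc_path) as f:
            return json.load(f).get("logFor", "human")
    except (FileNotFoundError, json.JSONDecodeError):
        return "human"  # fall back to current behavior
```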

Related

  • Reporters (JsonReporter, PlainTextReporter) handle file output after recording
  • This flag handles real-time console/stdout output
