Add --max-rows flag to limit input size #82

@vmvarela

Description

sql-pipe loads the entire CSV input into an in-memory SQLite database, so a malformed input or an accidental pipe of a 10M-row file can exhaust RAM with no warning. A --max-rows flag gives users a safe, fast-fail mechanism.

Example

$ cat huge.csv | sql-pipe --max-rows 100000 'SELECT COUNT(*) FROM t'
error: input exceeds --max-rows limit (100000 rows)

Acceptance Criteria

  • Add --max-rows <n> flag (integer, must be > 0)
  • After inserting n rows, if more input remains, stop reading stdin, print a clear error message to stderr (error: input exceeds --max-rows limit (<n> rows)), and exit with code 1
  • --max-rows 0 is an error (print usage error)
  • Flag is documented in --help output, README.md Flags table, and docs/sql-pipe.1.scd
  • Tests cover: normal operation (under limit), limit hit (correct error + exit code)

Notes

  • Counter is incremented in the CSV insert loop in src/main.zig
  • Exit code 1 (usage/input error) is appropriate; consider a dedicated code if it helps scripting
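
The guard described in the notes above can be sketched as follows. This is an illustrative Python model of the counter check, not the actual Zig code in src/main.zig; the names load_rows, RowLimitExceeded, and the reader argument are hypothetical.

```python
class RowLimitExceeded(Exception):
    """Raised as soon as the input holds more rows than --max-rows allows."""


def load_rows(reader, max_rows):
    """Buffer rows from a CSV row iterator, failing fast past max_rows.

    max_rows must be a positive integer, mirroring the flag's validation
    (--max-rows 0 is a usage error).
    """
    if max_rows <= 0:
        raise ValueError("--max-rows must be > 0")
    rows = []
    for row in reader:
        if len(rows) >= max_rows:
            # Stop reading immediately instead of buffering the rest of stdin.
            raise RowLimitExceeded(
                f"error: input exceeds --max-rows limit ({max_rows} rows)"
            )
        rows.append(row)
    return rows
```

In the CLI itself, the equivalent error would be caught at the top level, the message printed to stderr, and the process exited with code 1, per the acceptance criteria.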

Metadata

    Labels

  • priority:high (Must be in the next sprint)
  • size:s (Small, 1 to 4 hours)
  • status:ready (Refined and ready for sprint selection)
  • type:feature (New functionality)
