
Copilot Instructions for Module-Data-Flows

Use these repo-specific guidelines so AI coding agents are productive and non-disruptive across this multi-exercise teaching repo.

Scope & Structure

Workflows

  • Formatting: run npm run format at repo root (see package.json). Keep edits Prettier-friendly.
  • Tests (root): npm test runs Jest across the repo using a broad match (see package.json). Prefer running tests in the relevant subfolder when possible to keep scope tight.
  • Tests (subfolders): some folders have their own config/scripts; prefer those over the root runner when they exist.
  • Script-free exercises: execute files directly with Node to check console output (e.g., node Sprint-3/dead-code/exercise-1.js).
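For script-free exercises, correctness is defined by the console output, so a file in this style can be checked by running it directly with Node (the file name and contents here are hypothetical — each real exercise has its own):

```javascript
// Hypothetical script-free exercise file, e.g. Sprint-3/dead-code/exercise-0.js.
// Run with: node Sprint-3/dead-code/exercise-0.js
// Any refactor must leave the printed lines unchanged.

function double(n) {
  return n * 2;
}

console.log(double(4)); // 8
console.log(double(21)); // 42
```

Compare the output before and after your edit; if the printed lines differ, the refactor changed behaviour.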

Module Systems

Testing Patterns

Exercise-Specific Conventions

  • Dead code refactors: maintain exact final console output while removing unused/duplicated logic (see Sprint-3/dead-code). Prefer pure functions and minimal variables.
  • Destructuring tasks: follow the README’s output formatting precisely; use object/array destructuring where asked (see Sprint-1/destructuring).
  • Writing tests: when a test is a stub, derive behaviour from the README then write tests first, followed by the simplest implementation.
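As a sketch of the dead-code convention (all names hypothetical — the real exercises live under Sprint-3/dead-code): unused variables and duplicated logic go, but the final console output stays byte-for-byte identical.

```javascript
// Before (hypothetical): an unused variable and a duplicated computation.
// const unused = "never read";
// const a = price * quantity;
// const total = price * quantity;

// After: a pure function with minimal variables.
function orderTotal(price, quantity) {
  return price * quantity;
}

// This line must print exactly what the original file printed.
console.log(`Total: ${orderTotal(3, 4)}`);
```

Diffing the console output of the original file against the refactored one is the quickest check that nothing observable changed.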

Do / Don’t

  • Do: keep changes local to a single exercise; preserve file names and locations; match the folder’s module system; run only relevant tests.
  • Don’t: introduce new npm deps, change top-level configs, or modify unrelated exercises to “fix” failing global runs.

Start Here (per task)

  • Open the exercise README and files in its folder.
  • If tests exist, run them in the subfolder and implement the missing logic. If tests are stubs, write them first based on the README.
  • Verify with a local run (node ... or npm test), then format (npm run format).
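When the tests are stubs, the flow above might look like this (the function name and expected behaviour are hypothetical — derive the real ones from the exercise README):

```javascript
// greet.test.js — written first, from the README's description:
// const greet = require("./greet");
// test("greets by name", () => {
//   expect(greet("Ada")).toBe("Hello, Ada!");
// });

// greet.js — the simplest implementation that makes the test pass.
function greet(name) {
  return `Hello, ${name}!`;
}

module.exports = greet;
```

Run the test in the exercise's own folder so the broad root-level Jest match doesn't pull in unrelated suites.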