diff --git a/.agents/skills/ci-prep/SKILL.md b/.agents/skills/ci-prep/SKILL.md new file mode 100644 index 00000000..040221ff --- /dev/null +++ b/.agents/skills/ci-prep/SKILL.md @@ -0,0 +1,102 @@ +--- +name: ci-prep +description: Prepares the current branch for CI by running the exact same steps locally and fixing issues. If CI is already failing, fetches the GH Actions logs first to diagnose. Use before pushing, when CI is red, or when the user says "fix ci". +argument-hint: "[--failing] [optional job name to focus on]" +--- + +# CI Prep + +Prepare the current state for CI. If CI is already failing, fetch and analyze the logs first. + +## Arguments + +- `--failing` — Indicates a GitHub Actions run is already failing. When present, you MUST execute **Step 1** before doing anything else. +- Any other argument is treated as a job name to focus on (but all failures are still reported). + +If `--failing` is NOT passed, skip directly to **Step 2**. + +## Step 1 — Fetch failed CI logs (only when `--failing`) + +You MUST do this before any other work. + +```bash +BRANCH=$(git branch --show-current) +PR_JSON=$(gh pr list --head "$BRANCH" --state open --json number,title,url --limit 1) +``` + +If the JSON array is empty, **stop immediately**: +> No open PR found for branch `$BRANCH`. Create a PR first. + +Otherwise fetch the logs: + +```bash +PR_NUMBER=$(echo "$PR_JSON" | jq -r '.[0].number') +gh pr checks "$PR_NUMBER" +RUN_ID=$(gh run list --branch "$BRANCH" --limit 1 --json databaseId --jq '.[0].databaseId') +gh run view "$RUN_ID" +gh run view "$RUN_ID" --log-failed +``` + +Read **every line** of `--log-failed` output. For each failure note the exact file, line, and error message. If a job name argument was provided, prioritize that job but still report all failures. + +## Step 2 — Analyze the CI workflow + +1. Read `.github/workflows/ci.yml` completely. Parse every job and every step. +2. Extract the ordered list of commands the CI actually runs. +3. 
Note environment variables, matrix strategies, conditional steps, and service containers. + +**Do NOT assume the steps are `make lint`, `make test`, `make build`.** Extract what the CI *actually does*. + +## Step 3 — Run each CI step locally, in order + +Work through failures in this priority order: + +1. **Formatting** — run auto-formatters first to clear noise +2. **Compilation errors** — must compile before lint/test +3. **Lint violations** — fix the code pattern +4. **Runtime / test failures** — fix source code to satisfy the test + +For each command extracted from the CI workflow: + +1. Run the command exactly as CI would run it. +2. If the step fails, **stop and fix the issues** before continuing to the next step. +3. After fixing, re-run the same step to confirm it passes. +4. Move to the next step only after the current one succeeds. + +### Hard constraints + +- **NEVER modify test files** — fix the source code, not the tests +- **NEVER add suppressions** (`#pragma warning disable`, `#[allow(...)]`, `// eslint-disable`) +- **NEVER delete or ignore failing tests** +- **NEVER remove assertions** + +If stuck on the same failure after 5 attempts, ask the user for help. + +## Step 4 — Loop + +- Go back to the first step and repeat until all steps pass locally. If `--failing`, you should see the exact same errors in your terminal that CI shows in the logs. Fix those errors until they are resolved. + +## Step 5 — Commit/Push (only when `--failing`) + +Once all CI steps pass locally: + +1. Commit, but DO NOT MARK THE COMMIT WITH YOURSELF AS AN AUTHOR! Only the user authors the commit! +2. Push +3. Monitor until completion or failure +4. Upon failure, go back to Step 1 + +## Rules + +- *You are not allowed to commit/push until all tests pass*. Do not waste GitHub Actions minutes! The local CI must prove that everything is working. +- **Always read the CI workflow first.** Never assume what commands CI runs. 
+- Do not push if any step fails (unless `--failing` and all steps now pass) +- Fix issues found in each step before moving to the next +- Never skip steps or suppress errors +- If the CI workflow has multiple jobs, run all of them (respecting dependency order) +- Skip steps that are CI-infrastructure-only (checkout, setup actions, cache steps, artifact uploads) — focus on the actual build/test/lint commands + +## Success criteria + +- Every command that CI runs has been executed locally and passed +- All fixes are applied to the working tree +- The CI passes successfully (if you are correcting an existing failure) diff --git a/.agents/skills/code-dedup/SKILL.md b/.agents/skills/code-dedup/SKILL.md new file mode 100644 index 00000000..6f49707c --- /dev/null +++ b/.agents/skills/code-dedup/SKILL.md @@ -0,0 +1,106 @@ +--- +name: code-dedup +description: Searches for duplicate code, duplicate tests, and dead code, then safely merges or removes them. Use when the user says "deduplicate", "find duplicates", "remove dead code", "DRY up", or "code dedup". Requires test coverage — refuses to touch untested code. +--- + +# Code Dedup + +Carefully search for duplicate code, duplicate tests, and dead code across the repo. Merge duplicates and delete dead code — but only when test coverage proves the change is safe. + +## Prerequisites — hard gate + +Before touching ANY code, verify these conditions. If any fail, stop and report why. + +1. Run `make test` — all tests must pass. If tests fail, stop. Do not dedup a broken codebase. +2. Run `make coverage-check` — coverage must meet the repo's threshold. If it doesn't, stop. +3. This repo uses **C#, F#, Rust, and TypeScript** — all statically typed. Proceed. 
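The hard gate above can be sketched as a short script. This is a minimal sketch, assuming the repo's `make test` and `make coverage-check` targets behave as described; the `coverage_meets_floor` helper and its percentage inputs are illustrative, not part of the repo:

```shell
# Hard-gate sketch: refuse to start dedup work unless the baseline is green.
# `make test` / `make coverage-check` are assumed Makefile targets in this repo.

# Pure helper: succeed when actual coverage >= the recorded floor.
coverage_meets_floor() {
  # awk handles the floating-point comparison portably
  awk -v a="$1" -v f="$2" 'BEGIN { exit (a + 0 >= f + 0) ? 0 : 1 }'
}

dedup_gate() {
  make test || { echo "Tests failing: do not dedup a broken codebase."; return 1; }
  make coverage-check || { echo "Coverage below threshold: stopping."; return 1; }
  echo "Gate passed. Record the current coverage percentage as the floor."
}
```

The pure helper is what makes the coverage-floor comparison checkable in isolation; `dedup_gate` itself just chains the two Makefile targets and stops at the first failure.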
+ +## Steps + +Copy this checklist and track progress: + +``` +Dedup Progress: +- [ ] Step 1: Prerequisites passed (tests green, coverage met, typed) +- [ ] Step 2: Dead code scan complete +- [ ] Step 3: Duplicate code scan complete +- [ ] Step 4: Duplicate test scan complete +- [ ] Step 5: Changes applied +- [ ] Step 6: Verification passed (tests green, coverage stable) +``` + +### Step 1 — Inventory test coverage + +Before deciding what to touch, understand what is tested. + +1. Run `make test` and `make coverage-check` to confirm green baseline +2. Note the current coverage percentage — this is the floor. It must not drop. +3. Identify which files/modules have coverage and which do not. Only files WITH coverage are candidates for dedup. + +### Step 2 — Scan for dead code + +Search for code that is never called, never imported, never referenced. + +1. Look for unused exports, unused functions, unused records, unused variables +2. Use language-appropriate tools: + - **C#/F#:** Analyzer warnings for unused members (build with `-warnaserror` catches these) + - **Rust:** The compiler already warns on dead code — check `make lint` output + - **TypeScript:** Check for unexported functions with zero references in `Lql/LqlExtension/` +3. For each candidate: **grep the entire codebase** for references (including tests, scripts, configs). Only mark as dead if truly zero references. +4. List all dead code found with file paths and line numbers. Do NOT delete yet. + +### Step 3 — Scan for duplicate code + +Search for code blocks that do the same thing in multiple places. + +1. Look for functions/methods with identical or near-identical logic +2. Look for copy-pasted blocks (same structure, maybe different variable names) +3. Look for multiple implementations of the same algorithm or pattern +4. Check across module boundaries — duplicates often hide in different projects (DataProvider, Lql, Sync, Gatekeeper, Samples) +5. 
For each duplicate pair: note both locations, what they do, and how they differ (if at all) +6. List all duplicates found. Do NOT merge yet. + +### Step 4 — Scan for duplicate tests + +Search for tests that verify the same behavior. + +1. Look for test functions with identical assertions against the same code paths +2. Look for test fixtures/helpers that are duplicated across test files +3. Look for integration tests that fully cover what a unit test also covers (keep the integration test, mark the unit test as redundant per CLAUDE.md rules) +4. List all duplicate tests found. Do NOT delete yet. + +### Step 5 — Apply changes (one at a time) + +For each change, follow this cycle: **change -> test -> verify coverage -> continue or revert**. + +#### 5a. Remove dead code +- Delete dead code identified in Step 2 +- After each deletion: run `make test` and `make coverage-check` +- If tests fail or coverage drops: **revert immediately** and investigate + +#### 5b. Merge duplicate code +- For each duplicate pair: extract the shared logic into a single function/module +- Update all call sites to use the shared version +- After each merge: run `make test` and `make coverage-check` +- If tests fail: **revert immediately**. The duplicates may have subtle differences you missed. + +#### 5c. Remove duplicate tests +- Delete the redundant test (keep the more thorough one) +- After each deletion: run `make coverage-check` +- If coverage drops: **revert immediately**. The "duplicate" test was covering something the other wasn't. + +### Step 6 — Final verification + +1. Run `make test` — all tests must still pass +2. Run `make coverage-check` — coverage must be >= the baseline from Step 1 +3. Run `make lint` and `make fmt-check` — code must be clean +4. Report: what was removed, what was merged, final coverage vs baseline + +## Rules + +- **No test coverage = do not touch.** If a file has no tests covering it, leave it alone entirely. 
+- **Coverage must not drop.** The coverage floor from Step 1 is sacred. +- **One change at a time.** Make one dedup change, run tests, verify coverage. Never batch multiple dedup changes before testing. +- **When in doubt, leave it.** If two code blocks look similar but you're not 100% sure they're functionally identical, leave both. +- **Preserve public API surface.** Do not change function signatures, record names, or module exports that external code depends on. Internal refactoring only. +- **Three similar lines is fine.** Only dedup when the shared logic is substantial (>10 lines) or when there are 3+ copies. diff --git a/.agents/skills/spec-check/SKILL.md b/.agents/skills/spec-check/SKILL.md new file mode 100644 index 00000000..b9e41376 --- /dev/null +++ b/.agents/skills/spec-check/SKILL.md @@ -0,0 +1,69 @@ +--- +name: spec-check +description: Audits spec/plan documents against the codebase to ensure every spec section has implementing code and tests. Use when the user says "check specs", "audit specs", "spec coverage", or "validate specs". +--- + + +# Spec Check + +Audit spec and plan documents against the codebase. + +## Steps + +### Step 1 — Validate spec ID structure + +For every markdown file in `docs/specs/`: +1. Find all headings that contain a spec ID (pattern: `[GROUP-TOPIC-DETAIL]`) +2. Validate each ID: + - MUST be uppercase, hyphen-separated + - MUST NOT contain sequential numbers (e.g., `[SPEC-001]` is ILLEGAL) + - First word is the **group** — all sections sharing the same group MUST be adjacent +3. Check for duplicate IDs across all spec files +4. Report any violations + +### Step 2 — Find spec documents + +Scan `docs/specs/` and `docs/plans/` for all markdown files. For each file: +1. Extract all spec section IDs +2. Build a map: `spec ID → file path + heading` + +### Step 3 — Check code references + +For each spec ID found in Step 2: +1. Search the entire codebase (C#, Rust, TypeScript, F# files) for references to the ID +2. 
A reference is any comment containing the spec ID (e.g., `// Implements [AUTH-TOKEN-VERIFY]`) +3. Record which files reference each spec ID + +### Step 4 — Check test references + +For each spec ID: +1. Search test files for references to the ID +2. A test reference is a comment like `// Tests [AUTH-TOKEN-VERIFY]` in a test file + +### Step 5 — Verify code logic matches spec + +For spec IDs that DO have code references: +1. Read the spec section +2. Read the implementing code +3. Check that the code actually does what the spec describes +4. Flag any discrepancies + +### Step 6 — Report + +Output a table: + +| Spec ID | Spec File | Code References | Test References | Status | +|---------|-----------|-----------------|-----------------|--------| + +Status values: +- **COVERED** — has both code and test references +- **UNTESTED** — has code references but no test references +- **UNIMPLEMENTED** — has no code references at all +- **ORPHANED** — spec ID found in code but not in any spec document + +## Rules + +- Never modify spec documents — only report findings +- Never modify code — only report findings +- Every spec section MUST have at least one code reference and one test reference +- Orphaned references (code mentioning a spec ID that doesn't exist) are errors diff --git a/.agents/skills/submit-pr/SKILL.md b/.agents/skills/submit-pr/SKILL.md new file mode 100644 index 00000000..30656c3e --- /dev/null +++ b/.agents/skills/submit-pr/SKILL.md @@ -0,0 +1,36 @@ +--- +name: submit-pr +description: Creates a pull request with a well-structured description after verifying CI passes. Use when the user asks to submit, create, or open a pull request. +disable-model-invocation: true +--- + +# Submit PR + +Create a pull request for the current branch with a well-structured description. + +## Steps + +1. Run `make ci` — must pass completely before creating PR +2. 
**Generate the diff against main.** Run `git diff main...HEAD > /tmp/pr-diff.txt` to capture everything the current branch changes relative to its merge base with main (the three-dot form diffs from the merge base, so unrelated new commits on main are excluded). This is the ONLY source of truth for what the PR contains. **Warning:** the diff can be very large. If the diff file exceeds context limits, process it in chunks (e.g., read sections with `head`/`tail` or split by file) rather than trying to load it all at once. +3. **Derive the PR title and description SOLELY from the diff.** Read the diff output and summarize what changed. Ignore commit messages, branch names, and any other metadata — only the actual code/content diff matters. +4. Write PR body using the template in `.github/pull_request_template.md` +5. Fill in (based on the diff analysis from step 3): + - TLDR: one sentence + - What Was Added: new files, features, deps + - What Was Changed/Deleted: modified behaviour + - How Tests Prove It Works: specific test names or output + - Spec/Doc Changes: if any + - Breaking Changes: yes/no + description +6. Use `gh pr create` with the filled template + +## Rules + +- Never create a PR if `make ci` fails +- PR description must be specific and tight — no vague placeholders +- Link to the relevant GitHub issue if one exists + +## Success criteria + +- `make ci` passed +- PR created with `gh pr create` +- PR URL returned to user diff --git a/.agents/skills/upgrade-packages/SKILL.md b/.agents/skills/upgrade-packages/SKILL.md new file mode 100644 index 00000000..f6dd2ff0 --- /dev/null +++ b/.agents/skills/upgrade-packages/SKILL.md @@ -0,0 +1,57 @@ +--- +name: upgrade-packages +description: Upgrades all dependencies to latest versions across C#, Rust, and TypeScript. Use when the user says "upgrade packages", "update dependencies", "bump versions", or "upgrade deps". +argument-hint: "[language: dotnet|rust|typescript|all]" +--- + + +# Upgrade Packages + +Upgrade all dependencies to their latest versions. 
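Step 1 below branches on the language argument. A minimal sketch of that dispatch, assuming the skill receives the optional filter via `$ARGUMENTS`; the `ecosystems` function name is illustrative, not an existing helper:

```shell
# Map the skill's optional language argument to the package ecosystems
# that should be scanned. Defaults to all three when no filter is given.
ecosystems() {
  case "${1:-all}" in
    dotnet)     echo "dotnet" ;;
    rust)       echo "rust" ;;
    typescript) echo "typescript" ;;
    all)        echo "dotnet rust typescript" ;;
    *)          echo "unknown language filter: $1" >&2; return 1 ;;
  esac
}
```

Rejecting unknown filters up front (rather than silently falling back to "all") keeps a typo like `dotent` from triggering a full three-ecosystem upgrade.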
+ +## Steps + +### Step 1 — Detect packages to upgrade + +Based on `$ARGUMENTS` (default: all): + +**C# (.NET):** +- Check `Directory.Build.props` for centrally managed package versions +- Check individual `.csproj` files for project-specific packages +- Run `dotnet list package --outdated` on `DataProvider.sln` + +**Rust:** +- Check `Lql/lql-lsp-rust/Cargo.toml` workspace dependencies +- Run `cd Lql/lql-lsp-rust && cargo outdated` (install with `cargo install cargo-outdated` if needed) + +**TypeScript:** +- Check `Lql/LqlExtension/package.json` +- Run `cd Lql/LqlExtension && npm outdated` + +### Step 2 — Upgrade + +**C# (.NET):** +- Update version numbers in `Directory.Build.props` for central packages +- For project-specific packages: `dotnet add package <PACKAGE_NAME>` +- Run `dotnet restore` + +**Rust:** +- Update versions in `Cargo.toml` +- Run `cargo update` + +**TypeScript:** +- Run `npm update` or manually update `package.json` for major versions +- Run `npm install` + +### Step 3 — Verify + +1. Run `make ci` — must pass completely +2. If any tests fail, investigate whether the failure is from the upgrade +3. Report which packages were upgraded and from/to versions + +## Rules + +- Never downgrade a package +- If a major version upgrade breaks tests, report it and revert that specific upgrade +- Always run the full test suite after upgrading +- Update lock files (`Cargo.lock`, `package-lock.json`) as part of the upgrade diff --git a/.agents/skills/website-audit/SKILL.md b/.agents/skills/website-audit/SKILL.md new file mode 100644 index 00000000..9c1b57fc --- /dev/null +++ b/.agents/skills/website-audit/SKILL.md @@ -0,0 +1,179 @@ +--- +name: website-audit +description: Audits a website for SEO, AI search performance, structured data, mobile usability, broken links, and social media cards. Fixes issues found. Use when the user mentions "audit website", "SEO", "fix search ranking", "AI search", "structured data", "social media cards", or "website performance". 
+--- + +# Website Audit + +Performs a comprehensive website audit and fixes issues affecting search visibility and AI discoverability. + +Copy this checklist and track your progress: + +``` +Audit Progress: +- [ ] Step 1: Read guidelines +- [ ] Step 2: Audit AI search readiness +- [ ] Step 3: Audit SEO and keywords +- [ ] Step 4: Audit crawling and indexing +- [ ] Step 5: Audit broken links and canonicalization +- [ ] Step 6: Audit mobile usability +- [ ] Step 7: Audit structured data +- [ ] Step 8: Audit social media cards +- [ ] Step 9: Audit For Unsubstantiated Claims +- [ ] Step 10: Audit Design Compliance +- [ ] Step 11: Test with Playwright +- [ ] Step 12: Report findings +``` + +- Check the generated HTML/CSS/JavaScript AFTER the static site generator has run. +- Don't just check the static source content before the website is generated. +- Fix issues at the core, where the static content templates are stored — not in the generated output (e.g. `_site`) +- Never manually edit the generated website content directly + +## Step 1 — Read guidelines + +Fetch and read each of these before auditing. These are the authoritative references for every step that follows. + +- [Google's guidance on using generative AI content](https://developers.google.com/search/docs/fundamentals/using-gen-ai-content) + [Top ways to ensure content performs well in Google's AI experiences](https://developers.google.com/search/blog/2025/05/succeeding-in-ai-search) +- [SEO Starter Guide](https://developers.google.com/search/docs/fundamentals/seo-starter-guide) + +Take the business plan into account: +[business plan](../../../business_plan/business_plan.md) + +Identify the website source files in the repo. Determine the framework (static site generator, Next.js, Hugo, etc.) so you know where to find templates, metadata, and content. + +## Step 2 — Audit AI search readiness + +Apply the guidance from the AI search article. Check: + +1. 
**Content quality** — Is content original, expert-level, and comprehensive? Flag thin or duplicated pages. +2. **Clear structure** — Do pages use descriptive headings, lists, and concise answers to likely questions? +3. **Entity clarity** — Are key terms, products, and concepts defined clearly so AI can extract them? +4. **Freshness signals** — Are dates, update timestamps, and authorship present? + +Fix issues directly in the source files. For each fix, note what changed and why. + +## Step 3 — Audit SEO and keywords + +1. Search [Google Trends](https://trends.google.com/home) for trending keywords related to the website's content. +2. Review each page's `<title>`, `<meta name="description">`, and `<h1>` tags. +3. Check for keyword opportunities — can trending terms be naturally inserted into headings, descriptions, or body content? +4. Verify each page has a unique, descriptive title (50-60 chars) and meta description (150-160 chars). +5. Check image `alt` attributes describe the image content and include relevant keywords where natural. + +Apply the [SEO Starter Guide](https://developers.google.com/search/docs/fundamentals/seo-starter-guide) principles. Fix issues directly. + +## Step 4 — Audit crawling and indexing + +Reference: [Overview of crawling and indexing topics](https://developers.google.com/search/docs/crawling-indexing) + +1. **robots.txt** — Locate and review it. Verify it doesn't block important pages. Reference: [robots.txt spec](https://developers.google.com/search/docs/crawling-indexing/robots-txt) +2. **Sitemap** — Locate the sitemap (or sitemap index). Verify all important pages are listed and no dead URLs are included. Reference: [Sitemap guidelines](https://developers.google.com/search/docs/crawling-indexing/sitemaps/large-sitemaps) +3. **Meta robots tags** — Check for unintended `noindex` or `nofollow` directives on pages that should be indexed. + +Note: robots.txt and sitemaps are often auto-generated. 
If so, check the generator config rather than the output file. + +## Step 5 — Audit broken links and canonicalization + +Reference: [What is canonicalization](https://developers.google.com/search/docs/crawling-indexing/canonicalization) + +1. Check all internal links resolve to valid pages (no 404s). +2. Verify `<link rel="canonical">` tags are present and point to the correct URL. +3. Check for duplicate content accessible via multiple URLs (with/without trailing slash, www vs non-www). +4. Verify redirects use 301 (permanent) not 302 (temporary) where appropriate. + +## Step 6 — Audit mobile usability + +Reference: [Mobile-first indexing best practices](https://developers.google.com/search/docs/crawling-indexing/mobile/mobile-sites-mobile-first-indexing) + +1. Verify the `<meta name="viewport">` tag is present and correctly configured. +2. Check that content is identical between mobile and desktop (mobile-first indexing requires this). +3. Verify touch targets are adequately sized (min 48x48px). +4. Check font sizes are readable without zooming (min 16px body text). + +## Step 7 — Audit structured data + +Reference: [Structured data guidelines](https://developers.google.com/search/docs/appearance/structured-data/sd-policies) + +1. Check for existing JSON-LD `<script type="application/ld+json">` blocks. +2. Verify the structured data matches the page content (no misleading markup). +3. Add missing structured data where appropriate: + - **Organization/Person** on the homepage + - **Article/BlogPosting** on blog posts (with author, datePublished, dateModified) + - **BreadcrumbList** for navigation + - **FAQ** for pages with question/answer content +4. Validate JSON-LD syntax is correct. 
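To make Step 3's additions concrete, this is the general shape of a JSON-LD block for a blog post. A hedged sketch only: every field value here is a placeholder, not content from this project.

```html
<!-- Example BlogPosting structured data; all field values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Example Post Title",
  "author": { "@type": "Person", "name": "Example Author" },
  "datePublished": "2024-01-15",
  "dateModified": "2024-03-02",
  "image": "https://example.com/images/example-post.png",
  "mainEntityOfPage": "https://example.com/blog/example-post/"
}
</script>
```

In a static site generator this block normally lives in the post template, with each field filled from front matter rather than hard-coded.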
+ +## Step 8 — Audit social media cards + +Reference: [Implementing Social Media Preview Cards](https://documentation.platformos.com/use-cases/implementing-social-media-preview-cards) + +Check every page template includes: + +**Open Graph (Facebook/LinkedIn):** +- `og:title`, `og:description`, `og:image`, `og:url`, `og:type` + +**Twitter Card:** +- `twitter:card`, `twitter:title`, `twitter:description`, `twitter:image` + +Verify `og:image` dimensions are at least 1200x630px. Fix missing or incomplete tags. + +## Step 9 — Audit For Unsubstantiated Claims + +Ensure that every claim is backed by a link to a reputable source. For example, the following claim is not valid content unless it links to an authority that established it through research: + +> Research shows teams with strong DevEx perform 4-5x better across speed, quality, and engagement + +Search for the authoritative URL and add a link to it. If none is available, change the claim to something that can be substantiated. + +## Step 10 — Audit Design Compliance + +Read the design system docs and view the design screens in the designsystem folder. + +## Step 11 — Test with Playwright + +Build and run the website locally using `make website-run` (or the project's equivalent dev server command). + +**Desktop tests (1280x720):** + +1. Navigate to the homepage — take a screenshot. +2. Navigate to each major section — verify pages load without errors. +3. Check the browser console for JavaScript errors. +4. Verify all navigation links work. + +**Mobile tests (375x667, iPhone SE):** + +1. Resize the browser to mobile dimensions. +2. Navigate to the homepage — take a screenshot. +3. Verify the layout is responsive (no horizontal overflow, readable text). +4. Test navigation menu (hamburger menu if applicable). + +If any page fails to load or has console errors, fix the issue and retest. 
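The screenshot passes above can be driven from the Playwright CLI. A dry-run sketch that only prints the commands it would run; the base URL, port, and page list are placeholders, and the `--viewport-size` flag should be double-checked against the installed Playwright version:

```shell
# Print the Playwright CLI screenshot commands for the desktop and mobile
# passes. BASE and the page list are placeholders for this sketch.
BASE="${BASE:-http://localhost:8080}"

shot() {  # shot <WxH> <path> <outfile> -> print the command (dry run)
  printf 'npx playwright screenshot --viewport-size=%s %s%s %s\n' \
    "$1" "$BASE" "$2" "$3"
}

for page in "/" "/blog/"; do
  shot 1280,720 "$page" "desktop$(echo "$page" | tr '/' '_').png"
  shot 375,667  "$page" "mobile$(echo "$page" | tr '/' '_').png"
done
```

To actually execute, replace the `printf` with the `npx playwright screenshot ...` invocation itself once the base URL matches the real dev server.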
+ +## Step 12 — Report findings + +Summarize the audit results: + +``` +## Website Audit Report + +### Fixed +- [List each issue fixed with file and line reference] + +### Warnings (manual review needed) +- [Issues that need human judgment] + +### Passed +- [Areas that passed audit with no issues] + +### Screenshots +- [Reference Playwright screenshots taken] +``` + +## Rules + +- **Fix issues directly** — don't just report them. Only flag issues as warnings when they require human judgment (e.g., content tone, keyword selection). +- **One step at a time** — complete each step before moving to the next. +- **Preserve existing content** — improve structure and metadata without rewriting the author's voice. +- **No keyword stuffing** — keywords must read naturally in context. +- **Respect the framework** — edit templates/configs, not generated output files. diff --git a/.claude/skills/ci-prep/SKILL.md b/.claude/skills/ci-prep/SKILL.md index 040221ff..8524d5e9 100644 --- a/.claude/skills/ci-prep/SKILL.md +++ b/.claude/skills/ci-prep/SKILL.md @@ -1,102 +1,6 @@ --- name: ci-prep description: Prepares the current branch for CI by running the exact same steps locally and fixing issues. If CI is already failing, fetches the GH Actions logs first to diagnose. Use before pushing, when CI is red, or when the user says "fix ci". -argument-hint: "[--failing] [optional job name to focus on]" --- -# CI Prep - -Prepare the current state for CI. If CI is already failing, fetch and analyze the logs first. - -## Arguments - -- `--failing` — Indicates a GitHub Actions run is already failing. When present, you MUST execute **Step 1** before doing anything else. -- Any other argument is treated as a job name to focus on (but all failures are still reported). - -If `--failing` is NOT passed, skip directly to **Step 2**. - -## Step 1 �� Fetch failed CI logs (only when `--failing`) - -You MUST do this before any other work. 
- -```bash -BRANCH=$(git branch --show-current) -PR_JSON=$(gh pr list --head "$BRANCH" --state open --json number,title,url --limit 1) -``` - -If the JSON array is empty, **stop immediately**: -> No open PR found for branch `$BRANCH`. Create a PR first. - -Otherwise fetch the logs: - -```bash -PR_NUMBER=$(echo "$PR_JSON" | jq -r '.[0].number') -gh pr checks "$PR_NUMBER" -RUN_ID=$(gh run list --branch "$BRANCH" --limit 1 --json databaseId --jq '.[0].databaseId') -gh run view "$RUN_ID" -gh run view "$RUN_ID" --log-failed -``` - -Read **every line** of `--log-failed` output. For each failure note the exact file, line, and error message. If a job name argument was provided, prioritize that job but still report all failures. - -## Step 2 — Analyze the CI workflow - -1. Read `.github/workflows/ci.yml` completely. Parse every job and every step. -2. Extract the ordered list of commands the CI actually runs. -3. Note environment variables, matrix strategies, conditional steps, and service containers. - -**Do NOT assume the steps are `make lint`, `make test`, `make build`.** Extract what the CI *actually does*. - -## Step 3 — Run each CI step locally, in order - -Work through failures in this priority order: - -1. **Formatting** — run auto-formatters first to clear noise -2. **Compilation errors** — must compile before lint/test -3. **Lint violations** — fix the code pattern -4. **Runtime / test failures** — fix source code to satisfy the test - -For each command extracted from the CI workflow: - -1. Run the command exactly as CI would run it. -2. If the step fails, **stop and fix the issues** before continuing to the next step. -3. After fixing, re-run the same step to confirm it passes. -4. Move to the next step only after the current one succeeds. 
- -### Hard constraints - -- **NEVER modify test files** — fix the source code, not the tests -- **NEVER add suppressions** (`#pragma warning disable`, `#[allow(...)]`, `// eslint-disable`) -- **NEVER delete or ignore failing tests** -- **NEVER remove assertions** - -If stuck on the same failure after 5 attempts, ask the user for help. - -## Step 4 — Loop - -- Go back to the first step and repeat until all steps pass locally. If `--failing`, you should see the exact same errors in your terminal that CI shows in the logs. Fix those errors until they are resolved. - -## Step 5 — Commit/Push (only when `--failing`) - -Once all CI steps pass locally: - -1. Commit, but DO NOT MARK THE COMMIT WITH YOU AS AN AUTHOR!!! Only the user authors the commit! -2. Push -3. Monitor until completion or failure -4. Upon failure, go back to Step 1 - -## Rules - -- *You are not allowed to commi/push until all tests pass*. Do not waste GitHub action minutes! The local CI must prove that everything is working. -- **Always read the CI workflow first.** Never assume what commands CI runs. 
-- Do not push if any step fails (unless `--failing` and all steps now pass) -- Fix issues found in each step before moving to the next -- Never skip steps or suppress errors -- If the CI workflow has multiple jobs, run all of them (respecting dependency order) -- Skip steps that are CI-infrastructure-only (checkout, setup actions, cache steps, artifact uploads) — focus on the actual build/test/lint commands - -## Success criteria - -- Every command that CI runs has been executed locally and passed -- All fixes are applied to the working tree -- The CI passes successfully (if you are correcting an existing failure) +@../../../.agents/skills/ci-prep/SKILL.md diff --git a/.claude/skills/code-dedup/SKILL.md b/.claude/skills/code-dedup/SKILL.md index 6f49707c..5167d144 100644 --- a/.claude/skills/code-dedup/SKILL.md +++ b/.claude/skills/code-dedup/SKILL.md @@ -3,104 +3,4 @@ name: code-dedup description: Searches for duplicate code, duplicate tests, and dead code, then safely merges or removes them. Use when the user says "deduplicate", "find duplicates", "remove dead code", "DRY up", or "code dedup". Requires test coverage — refuses to touch untested code. --- -# Code Dedup - -Carefully search for duplicate code, duplicate tests, and dead code across the repo. Merge duplicates and delete dead code — but only when test coverage proves the change is safe. - -## Prerequisites — hard gate - -Before touching ANY code, verify these conditions. If any fail, stop and report why. - -1. Run `make test` — all tests must pass. If tests fail, stop. Do not dedup a broken codebase. -2. Run `make coverage-check` — coverage must meet the repo's threshold. If it doesn't, stop. -3. This repo uses **C#, F#, Rust, and TypeScript** — all statically typed. Proceed. 
- -## Steps - -Copy this checklist and track progress: - -``` -Dedup Progress: -- [ ] Step 1: Prerequisites passed (tests green, coverage met, typed) -- [ ] Step 2: Dead code scan complete -- [ ] Step 3: Duplicate code scan complete -- [ ] Step 4: Duplicate test scan complete -- [ ] Step 5: Changes applied -- [ ] Step 6: Verification passed (tests green, coverage stable) -``` - -### Step 1 — Inventory test coverage - -Before deciding what to touch, understand what is tested. - -1. Run `make test` and `make coverage-check` to confirm green baseline -2. Note the current coverage percentage — this is the floor. It must not drop. -3. Identify which files/modules have coverage and which do not. Only files WITH coverage are candidates for dedup. - -### Step 2 — Scan for dead code - -Search for code that is never called, never imported, never referenced. - -1. Look for unused exports, unused functions, unused records, unused variables -2. Use language-appropriate tools: - - **C#/F#:** Analyzer warnings for unused members (build with `-warnaserror` catches these) - - **Rust:** The compiler already warns on dead code — check `make lint` output - - **TypeScript:** Check for unexported functions with zero references in `Lql/LqlExtension/` -3. For each candidate: **grep the entire codebase** for references (including tests, scripts, configs). Only mark as dead if truly zero references. -4. List all dead code found with file paths and line numbers. Do NOT delete yet. - -### Step 3 — Scan for duplicate code - -Search for code blocks that do the same thing in multiple places. - -1. Look for functions/methods with identical or near-identical logic -2. Look for copy-pasted blocks (same structure, maybe different variable names) -3. Look for multiple implementations of the same algorithm or pattern -4. Check across module boundaries — duplicates often hide in different projects (DataProvider, Lql, Sync, Gatekeeper, Samples) -5. 
For each duplicate pair: note both locations, what they do, and how they differ (if at all) -6. List all duplicates found. Do NOT merge yet. - -### Step 4 — Scan for duplicate tests - -Search for tests that verify the same behavior. - -1. Look for test functions with identical assertions against the same code paths -2. Look for test fixtures/helpers that are duplicated across test files -3. Look for integration tests that fully cover what a unit test also covers (keep the integration test, mark the unit test as redundant per CLAUDE.md rules) -4. List all duplicate tests found. Do NOT delete yet. - -### Step 5 — Apply changes (one at a time) - -For each change, follow this cycle: **change -> test -> verify coverage -> continue or revert**. - -#### 5a. Remove dead code -- Delete dead code identified in Step 2 -- After each deletion: run `make test` and `make coverage-check` -- If tests fail or coverage drops: **revert immediately** and investigate - -#### 5b. Merge duplicate code -- For each duplicate pair: extract the shared logic into a single function/module -- Update all call sites to use the shared version -- After each merge: run `make test` and `make coverage-check` -- If tests fail: **revert immediately**. The duplicates may have subtle differences you missed. - -#### 5c. Remove duplicate tests -- Delete the redundant test (keep the more thorough one) -- After each deletion: run `make coverage-check` -- If coverage drops: **revert immediately**. The "duplicate" test was covering something the other wasn't. - -### Step 6 — Final verification - -1. Run `make test` — all tests must still pass -2. Run `make coverage-check` — coverage must be >= the baseline from Step 1 -3. Run `make lint` and `make fmt-check` — code must be clean -4. Report: what was removed, what was merged, final coverage vs baseline - -## Rules - -- **No test coverage = do not touch.** If a file has no tests covering it, leave it alone entirely. 
-- **Coverage must not drop.** The coverage floor from Step 1 is sacred. -- **One change at a time.** Make one dedup change, run tests, verify coverage. Never batch multiple dedup changes before testing. -- **When in doubt, leave it.** If two code blocks look similar but you're not 100% sure they're functionally identical, leave both. -- **Preserve public API surface.** Do not change function signatures, record names, or module exports that external code depends on. Internal refactoring only. -- **Three similar lines is fine.** Only dedup when the shared logic is substantial (>10 lines) or when there are 3+ copies. +@../../../.agents/skills/code-dedup/SKILL.md diff --git a/.claude/skills/spec-check/SKILL.md b/.claude/skills/spec-check/SKILL.md index b9e41376..944484ab 100644 --- a/.claude/skills/spec-check/SKILL.md +++ b/.claude/skills/spec-check/SKILL.md @@ -2,68 +2,5 @@ name: spec-check description: Audits spec/plan documents against the codebase to ensure every spec section has implementing code and tests. Use when the user says "check specs", "audit specs", "spec coverage", or "validate specs". --- -<!-- agent-pmo:d75d5c8 --> -# Spec Check - -Audit spec and plan documents against the codebase. - -## Steps - -### Step 1 — Validate spec ID structure - -For every markdown file in `docs/specs/`: -1. Find all headings that contain a spec ID (pattern: `[GROUP-TOPIC-DETAIL]`) -2. Validate each ID: - - MUST be uppercase, hyphen-separated - - MUST NOT contain sequential numbers (e.g., `[SPEC-001]` is ILLEGAL) - - First word is the **group** — all sections sharing the same group MUST be adjacent -3. Check for duplicate IDs across all spec files -4. Report any violations - -### Step 2 — Find spec documents - -Scan `docs/specs/` and `docs/plans/` for all markdown files. For each file: -1. Extract all spec section IDs -2. Build a map: `spec ID → file path + heading` - -### Step 3 — Check code references - -For each spec ID found in Step 2: -1. 
Search the entire codebase (C#, Rust, TypeScript, F# files) for references to the ID -2. A reference is any comment containing the spec ID (e.g., `// Implements [AUTH-TOKEN-VERIFY]`) -3. Record which files reference each spec ID - -### Step 4 — Check test references - -For each spec ID: -1. Search test files for references to the ID -2. A test reference is a comment like `// Tests [AUTH-TOKEN-VERIFY]` in a test file - -### Step 5 — Verify code logic matches spec - -For spec IDs that DO have code references: -1. Read the spec section -2. Read the implementing code -3. Check that the code actually does what the spec describes -4. Flag any discrepancies - -### Step 6 — Report - -Output a table: - -| Spec ID | Spec File | Code References | Test References | Status | -|---------|-----------|-----------------|-----------------|--------| - -Status values: -- **COVERED** — has both code and test references -- **UNTESTED** — has code references but no test references -- **UNIMPLEMENTED** — has no code references at all -- **ORPHANED** — spec ID found in code but not in any spec document - -## Rules - -- Never modify spec documents — only report findings -- Never modify code — only report findings -- Every spec section MUST have at least one code reference and one test reference -- Orphaned references (code mentioning a spec ID that doesn't exist) are errors +@../../../.agents/skills/spec-check/SKILL.md diff --git a/.claude/skills/submit-pr/SKILL.md b/.claude/skills/submit-pr/SKILL.md index 30656c3e..786e837a 100644 --- a/.claude/skills/submit-pr/SKILL.md +++ b/.claude/skills/submit-pr/SKILL.md @@ -1,36 +1,6 @@ --- name: submit-pr description: Creates a pull request with a well-structured description after verifying CI passes. Use when the user asks to submit, create, or open a pull request. -disable-model-invocation: true --- -# Submit PR - -Create a pull request for the current branch with a well-structured description. - -## Steps - -1. 
Run `make ci` — must pass completely before creating PR -2. **Generate the diff against main.** Run `git diff main...HEAD > /tmp/pr-diff.txt` to capture the diff of the current branch against its merge base with main — exactly what the PR will show. This is the ONLY source of truth for what the PR contains. **Warning:** the diff can be very large. If the diff file exceeds context limits, process it in chunks (e.g., read sections with `head`/`tail` or split by file) rather than trying to load it all at once. -3. **Derive the PR title and description SOLELY from the diff.** Read the diff output and summarize what changed. Ignore commit messages, branch names, and any other metadata — only the actual code/content diff matters. -4. Write the PR body using the template in `.github/pull_request_template.md` -5. Fill in (based on the diff analysis from step 3): - - TLDR: one sentence - - What Was Added: new files, features, deps - - What Was Changed/Deleted: modified behaviour - - How Tests Prove It Works: specific test names or output - - Spec/Doc Changes: if any - - Breaking Changes: yes/no + description -6. Use `gh pr create` with the filled template - -## Rules - -- Never create a PR if `make ci` fails -- PR description must be specific and tight — no vague placeholders -- Link to the relevant GitHub issue if one exists - -## Success criteria - -- `make ci` passed -- PR created with `gh pr create` -- PR URL returned to user +@../../../.agents/skills/submit-pr/SKILL.md diff --git a/.claude/skills/upgrade-packages/SKILL.md b/.claude/skills/upgrade-packages/SKILL.md index f6dd2ff0..efbdc35b 100644 --- a/.claude/skills/upgrade-packages/SKILL.md +++ b/.claude/skills/upgrade-packages/SKILL.md @@ -1,57 +1,6 @@ --- name: upgrade-packages description: Upgrades all dependencies to latest versions across C#, Rust, and TypeScript. Use when the user says "upgrade packages", "update dependencies", "bump versions", or "upgrade deps". 
-argument-hint: "[language: dotnet|rust|typescript|all]" --- -<!-- agent-pmo:d75d5c8 --> -# Upgrade Packages - -Upgrade all dependencies to their latest versions. - -## Steps - -### Step 1 — Detect packages to upgrade - -Based on `$ARGUMENTS` (default: all): - -**C# (.NET):** -- Check `Directory.Build.props` for centrally managed package versions -- Check individual `.csproj` files for project-specific packages -- Run `dotnet list package --outdated` on `DataProvider.sln` - -**Rust:** -- Check `Lql/lql-lsp-rust/Cargo.toml` workspace dependencies -- Run `cd Lql/lql-lsp-rust && cargo outdated` (install with `cargo install cargo-outdated` if needed) - -**TypeScript:** -- Check `Lql/LqlExtension/package.json` -- Run `cd Lql/LqlExtension && npm outdated` - -### Step 2 — Upgrade - -**C# (.NET):** -- Update version numbers in `Directory.Build.props` for central packages -- For project-specific packages: `dotnet add <project> package <name>` -- Run `dotnet restore` - -**Rust:** -- Update versions in `Cargo.toml` -- Run `cargo update` - -**TypeScript:** -- Run `npm update` or manually update `package.json` for major versions -- Run `npm install` - -### Step 3 — Verify - -1. Run `make ci` — must pass completely -2. If any tests fail, investigate whether the failure is from the upgrade -3. 
Report which packages were upgraded and from/to versions - -## Rules - -- Never downgrade a package -- If a major version upgrade breaks tests, report it and revert that specific upgrade -- Always run the full test suite after upgrading -- Update lock files (`Cargo.lock`, `package-lock.json`) as part of the upgrade +@../../../.agents/skills/upgrade-packages/SKILL.md diff --git a/.claude/skills/website-audit/SKILL.md b/.claude/skills/website-audit/SKILL.md index 9c1b57fc..eae7cfbd 100644 --- a/.claude/skills/website-audit/SKILL.md +++ b/.claude/skills/website-audit/SKILL.md @@ -3,177 +3,4 @@ name: website-audit description: Audits a website for SEO, AI search performance, structured data, mobile usability, broken links, and social media cards. Fixes issues found. Use when the user mentions "audit website", "SEO", "fix search ranking", "AI search", "structured data", "social media cards", or "website performance". --- -# Website Audit - -Performs a comprehensive website audit and fixes issues affecting search visibility and AI discoverability. - -Copy this checklist and track your progress: - -``` -Audit Progress: -- [ ] Step 1: Read guidelines -- [ ] Step 2: Audit AI search readiness -- [ ] Step 3: Audit SEO and keywords -- [ ] Step 4: Audit crawling and indexing -- [ ] Step 5: Audit broken links and canonicalization -- [ ] Step 6: Audit mobile usability -- [ ] Step 7: Audit structured data -- [ ] Step 8: Audit social media cards -- [ ] Step 9: Audit For Unsubstantiated Claims -- [ ] Step 10: Audit Design Compliance -- [ ] Step 11: Test with Playwright -- [ ] Step 12: Report findings -``` - -- Check the outputted HTML/CSS/JavaScript AFTER the website is generated by the static content generator. - Don't just check the static content before the website is generated. -- Fix issues at the core where the static content templates are stored - not in the outputted HTML (e.g. 
_site) -Never manually edit the generated website content directly - -## Step 1 — Read guidelines - -Fetch and read each of these before auditing. These are the authoritative references for every step that follows. - -- [Google's guidance on using generative AI content](https://developers.google.com/search/docs/fundamentals/using-gen-ai-content) -- [Top ways to ensure content performs well in Google's AI experiences](https://developers.google.com/search/blog/2025/05/succeeding-in-ai-search) -- [SEO Starter Guide](https://developers.google.com/search/docs/fundamentals/seo-starter-guide) - -Take the business plan into account: -[Business plan](../../../business_plan/business_plan.md) - -Identify the website source files in the repo. Determine the framework (static site generator, Next.js, Hugo, etc.) so you know where to find templates, metadata, and content. - -## Step 2 — Audit AI search readiness - -Apply the guidance from the AI search article. Check: - -1. **Content quality** — Is content original, expert-level, and comprehensive? Flag thin or duplicated pages. -2. **Clear structure** — Do pages use descriptive headings, lists, and concise answers to likely questions? -3. **Entity clarity** — Are key terms, products, and concepts defined clearly so AI can extract them? -4. **Freshness signals** — Are dates, update timestamps, and authorship present? - -Fix issues directly in the source files. For each fix, note what changed and why. - -## Step 3 — Audit SEO and keywords - -1. Search [Google Trends](https://trends.google.com/home) for trending keywords related to the website's content. -2. Review each page's `<title>`, `<meta name="description">`, and `<h1>` tags. -3. Check for keyword opportunities — can trending terms be naturally inserted into headings, descriptions, or body content? -4. Verify each page has a unique, descriptive title (50-60 chars) and meta description (150-160 chars). -5. 
Check image `alt` attributes describe the image content and include relevant keywords where natural. - -Apply the [SEO Starter Guide](https://developers.google.com/search/docs/fundamentals/seo-starter-guide) principles. Fix issues directly. - -## Step 4 — Audit crawling and indexing - -Reference: [Overview of crawling and indexing topics](https://developers.google.com/search/docs/crawling-indexing) - -1. **robots.txt** — Locate and review it. Verify it doesn't block important pages. Reference: [robots.txt spec](https://developers.google.com/search/docs/crawling-indexing/robots-txt) -2. **Sitemap** — Locate the sitemap (or sitemap index). Verify all important pages are listed and no dead URLs are included. Reference: [Sitemap guidelines](https://developers.google.com/search/docs/crawling-indexing/sitemaps/large-sitemaps) -3. **Meta robots tags** — Check for unintended `noindex` or `nofollow` directives on pages that should be indexed. - -Note: robots.txt and sitemaps are often auto-generated. If so, check the generator config rather than the output file. - -## Step 5 — Audit broken links and canonicalization - -Reference: [What is canonicalization](https://developers.google.com/search/docs/crawling-indexing/canonicalization) - -1. Check all internal links resolve to valid pages (no 404s). -2. Verify `<link rel="canonical">` tags are present and point to the correct URL. -3. Check for duplicate content accessible via multiple URLs (with/without trailing slash, www vs non-www). -4. Verify redirects use 301 (permanent) not 302 (temporary) where appropriate. - -## Step 6 — Audit mobile usability - -Reference: [Mobile-first indexing best practices](https://developers.google.com/search/docs/crawling-indexing/mobile/mobile-sites-mobile-first-indexing) - -1. Verify the `<meta name="viewport">` tag is present and correctly configured. -2. Check that content is identical between mobile and desktop (mobile-first indexing requires this). -3. 
Verify touch targets are adequately sized (min 48x48px). -4. Check font sizes are readable without zooming (min 16px body text). - -## Step 7 — Audit structured data - -Reference: [Structured data guidelines](https://developers.google.com/search/docs/appearance/structured-data/sd-policies) - -1. Check for existing JSON-LD `<script type="application/ld+json">` blocks. -2. Verify the structured data matches the page content (no misleading markup). -3. Add missing structured data where appropriate: - - **Organization/Person** on the homepage - - **Article/BlogPosting** on blog posts (with author, datePublished, dateModified) - - **BreadcrumbList** for navigation - - **FAQ** for pages with question/answer content -4. Validate JSON-LD syntax is correct. - -## Step 8 — Audit social media cards - -Reference: [Implementing Social Media Preview Cards](https://documentation.platformos.com/use-cases/implementing-social-media-preview-cards) - -Check every page template includes: - -**Open Graph (Facebook/LinkedIn):** -- `og:title`, `og:description`, `og:image`, `og:url`, `og:type` - -**Twitter Card:** -- `twitter:card`, `twitter:title`, `twitter:description`, `twitter:image` - -Verify `og:image` dimensions are at least 1200x630px. Fix missing or incomplete tags. - -## Step 9 — Audit For Unsubstantiated Claims - -Ensure that every claim is backed by a link to a reputable source. For example, the following claim is not valid content unless it links to an authority that established it through research: - -> Research shows teams with strong DevEx perform 4-5x better across speed, quality, and engagement - -Search for the authoritative URL and link to it. If none is available, change the claim to something that can be substantiated. - -## Step 10 — Audit Design Compliance - -Read the design system docs and view the design screens in the designsystem folder. 
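Before the Playwright pass, the Step 8 tag checklist can be smoke-tested with a small grep loop. A sketch: the inline `page` value stands in for a rendered HTML file from the generator's output.

```shell
# Check a rendered page for the required Open Graph / Twitter Card tags.
page='<meta property="og:title" content="LQL"><meta property="og:image" content="/img/card.png"><meta name="twitter:card" content="summary_large_image">'

for tag in og:title og:description og:image og:url og:type \
           twitter:card twitter:title twitter:description twitter:image; do
  if printf '%s' "$page" | grep -q "$tag"; then
    echo "found:   $tag"
  else
    echo "MISSING: $tag"
  fi
done
```

In the real audit, feed each generated HTML page into `page` and fix any MISSING tag in the source template, never in the generated output.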
- -## Step 11 — Test with Playwright - -Build and run the website locally using `make website-run` (or the project's equivalent dev server command). - -**Desktop tests (1280x720):** - -1. Navigate to the homepage — take a screenshot. -2. Navigate to each major section — verify pages load without errors. -3. Check the browser console for JavaScript errors. -4. Verify all navigation links work. - -**Mobile tests (375x667, iPhone SE):** - -1. Resize the browser to mobile dimensions. -2. Navigate to the homepage — take a screenshot. -3. Verify the layout is responsive (no horizontal overflow, readable text). -4. Test navigation menu (hamburger menu if applicable). - -If any page fails to load or has console errors, fix the issue and retest. - -## Step 12 — Report findings - -Summarize the audit results: - -``` -## Website Audit Report - -### Fixed -- [List each issue fixed with file and line reference] - -### Warnings (manual review needed) -- [Issues that need human judgment] - -### Passed -- [Areas that passed audit with no issues] - -### Screenshots -- [Reference Playwright screenshots taken] -``` - -## Rules - -- **Fix issues directly** — don't just report them. Only flag issues as warnings when they require human judgment (e.g., content tone, keyword selection). -- **One step at a time** — complete each step before moving to the next. -- **Preserve existing content** — improve structure and metadata without rewriting the author's voice. -- **No keyword stuffing** — keywords must read naturally in context. -- **Respect the framework** — edit templates/configs, not generated output files. 
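The Step 11 viewport matrix can be driven from one loop. This is a dry-run sketch that only prints the Playwright CLI invocations, since it assumes `make website-run` is already serving the site; the port and output file names are assumptions:

```shell
# Print the screenshot commands for the desktop and mobile audit passes.
base="http://localhost:8080"
while read -r name size; do
  echo "npx playwright screenshot --viewport-size=$size $base/ audit-$name.png"
done <<'EOF'
desktop 1280,720
mobile 375,667
EOF
```

Drop the `echo` (or pipe the output to `sh`) once the dev server is up, then compare the two screenshots for horizontal overflow or unreadable text.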
+@../../../.agents/skills/website-audit/SKILL.md diff --git a/.github/workflows/deploy-lql-website.yml b/.github/workflows/deploy-lql-website.yml index cd8cb20d..83c43e10 100644 --- a/.github/workflows/deploy-lql-website.yml +++ b/.github/workflows/deploy-lql-website.yml @@ -23,12 +23,25 @@ jobs: name: github-pages url: ${{ steps.deployment.outputs.page_url }} runs-on: ubuntu-latest - timeout-minutes: 10 + timeout-minutes: 12 steps: - name: Checkout uses: actions/checkout@v4 + - name: Setup Node + uses: actions/setup-node@v4 + with: + node-version: '22' + cache: npm + cache-dependency-path: Lql/LqlWebsite-Eleventy/package-lock.json + + - name: Build Eleventy documentation site + working-directory: Lql/LqlWebsite-Eleventy + run: | + npm ci + npm run build + - name: Setup .NET uses: actions/setup-dotnet@v4 with: @@ -39,23 +52,21 @@ jobs: - name: Install .NET WebAssembly workload run: dotnet workload install wasm-tools - - name: Restore dependencies + - name: Restore Blazor playground dependencies run: dotnet restore ./Lql/Nimblesite.Lql.Website/Nimblesite.Lql.Website.csproj - - name: Build + - name: Build Blazor transpiler playground env: VERSION: ${{ inputs.version }} run: | if [ -n "${VERSION:-}" ]; then - # AssemblyVersion only accepts numeric major[.minor[.build[.revision]]] — - # strip any -prerelease suffix (e.g. 0.9.7-beta -> 0.9.7). 
ASM_VERSION="${VERSION%%-*}" dotnet build ./Lql/Nimblesite.Lql.Website/Nimblesite.Lql.Website.csproj -c Release --no-restore -p:Version="$VERSION" -p:AssemblyVersion="$ASM_VERSION" -p:FileVersion="$ASM_VERSION" else dotnet build ./Lql/Nimblesite.Lql.Website/Nimblesite.Lql.Website.csproj -c Release --no-restore fi - - name: Publish Blazor WebAssembly project + - name: Publish Blazor transpiler playground env: VERSION: ${{ inputs.version }} run: | @@ -66,13 +77,13 @@ jobs: dotnet publish ./Lql/Nimblesite.Lql.Website/Nimblesite.Lql.Website.csproj -c Release -o release --nologo fi - - name: Update base href for custom domain + - name: Mount Blazor playground under Eleventy site run: | - sed -i 's/<base href="\/" \/>/<base href="\/" \/>/g' release/wwwroot/index.html - - - name: Add .nojekyll file - run: touch release/wwwroot/.nojekyll - + sed -i 's|<base href="/" />|<base href="/playground/" />|g' release/wwwroot/index.html + rm -rf Lql/LqlWebsite-Eleventy/_site/playground + mkdir -p Lql/LqlWebsite-Eleventy/_site/playground + cp -R release/wwwroot/. 
Lql/LqlWebsite-Eleventy/_site/playground/ + touch Lql/LqlWebsite-Eleventy/_site/.nojekyll - name: Setup Pages uses: actions/configure-pages@v4 @@ -80,7 +91,7 @@ jobs: - name: Upload artifact uses: actions/upload-pages-artifact@v3 with: - path: release/wwwroot + path: Lql/LqlWebsite-Eleventy/_site - name: Deploy to GitHub Pages id: deployment diff --git a/.gitignore b/.gitignore index a24db9ab..3f440c93 100644 --- a/.gitignore +++ b/.gitignore @@ -487,8 +487,6 @@ Lql/LqlWebsite-Eleventy/_site/ -.claude/skills/website-audit/SKILL.md - Lql/lql-lsp-rust/target/ *.vsix Reporting/Nimblesite.Reporting.React/wwwroot/js/Nimblesite.Reporting.React.js diff --git a/CLAUDE.md b/CLAUDE.md index a755fdb9..49382a7f 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -199,7 +199,8 @@ make setup # post-create dev environment setup ``` DataProvider/ ├── .github/workflows/ # CI/CD pipelines -├── .claude/skills/ # Claude Code skills +├── .agents/skills/ # Shared agent skills +├── .claude/skills/ # Claude Code skill pointer files ├── DataProvider/ # Core source generator + CLI tools ├── Lql/ # Lambda Query Language │ ├── Lql/ # Core transpiler library @@ -225,7 +226,10 @@ DataProvider/ - Central config in `Directory.Build.props` - Format: `dotnet csharpier .` -## Claude Code Skills +## Agent Skills +- Canonical project skills live in `.agents/skills/` so Codex can load them from the repository scope. +- `.claude/skills/*/SKILL.md` files are thin `@../../../.agents/skills/.../SKILL.md` pointers for Claude Code. Do not duplicate full skill bodies there. 
- [Claude Code Skills Overview](https://platform.claude.com/docs/en/agents-and-tools/agent-skills/overview) +- [Codex Skills](https://developers.openai.com/codex/skills) - [The Complete Guide to Building Skills for Claude (PDF)](https://resources.anthropic.com/hubfs/The-Complete-Guide-to-Building-Skill-for-Claude.pdf) diff --git a/DataProvider/Nimblesite.DataProvider.Example/Nimblesite.DataProvider.Example.csproj b/DataProvider/Nimblesite.DataProvider.Example/Nimblesite.DataProvider.Example.csproj index 83d9cf86..facd4cb0 100644 --- a/DataProvider/Nimblesite.DataProvider.Example/Nimblesite.DataProvider.Example.csproj +++ b/DataProvider/Nimblesite.DataProvider.Example/Nimblesite.DataProvider.Example.csproj @@ -45,10 +45,34 @@ </Content> </ItemGroup> + <PropertyGroup> + <DataProviderMigrateToolDll>$(MSBuildThisFileDirectory)../../Migration/DataProviderMigrate/bin/$(Configuration)/net9.0/DataProviderMigrate.dll</DataProviderMigrateToolDll> + <LqlToolDll>$(MSBuildThisFileDirectory)../../Lql/Lql/bin/$(Configuration)/net9.0/Lql.dll</LqlToolDll> + <DataProviderToolDll>$(MSBuildThisFileDirectory)../DataProvider/bin/$(Configuration)/net10.0/DataProvider.dll</DataProviderToolDll> + </PropertyGroup> + + <Target Name="BuildCodegenTools" BeforeTargets="CreateDatabaseSchema"> + <MSBuild + Projects="$(MSBuildThisFileDirectory)../../Migration/DataProviderMigrate/DataProviderMigrate.csproj" + Targets="Build" + Properties="Configuration=$(Configuration)" + /> + <MSBuild + Projects="$(MSBuildThisFileDirectory)../../Lql/Lql/Lql.csproj" + Targets="Build" + Properties="Configuration=$(Configuration)" + /> + <MSBuild + Projects="$(MSBuildThisFileDirectory)../DataProvider/DataProvider.csproj" + Targets="Build" + Properties="Configuration=$(Configuration)" + /> + </Target> + <!-- Create database from YAML using DataProviderMigrate (YAML stored in git) --> <Target Name="CreateDatabaseSchema" BeforeTargets="TranspileLqlAndGenerateDataProvider"> <Exec - Command="dotnet run --project 
"$(MSBuildThisFileDirectory)../../Migration/DataProviderMigrate/DataProviderMigrate.csproj" -- --schema "$(MSBuildProjectDirectory)/example-schema.yaml" --output "$(MSBuildProjectDirectory)/invoices.db" --provider sqlite" + Command="dotnet "$(DataProviderMigrateToolDll)" --schema "$(MSBuildProjectDirectory)/example-schema.yaml" --output "$(MSBuildProjectDirectory)/invoices.db" --provider sqlite" WorkingDirectory="$(MSBuildProjectDirectory)" StandardOutputImportance="High" StandardErrorImportance="High" @@ -71,7 +95,7 @@ </ItemGroup> <Message Importance="High" Text="Transpiling LQL files (@(LqlFiles))" /> <Exec - Command="dotnet run --project $(MSBuildProjectDirectory)/../../Lql/Lql/Lql.csproj -- sqlite --input "%(LqlFiles.Identity)" --output "%(LqlFiles.RootDir)%(LqlFiles.Directory)%(LqlFiles.Filename).generated.sql"" + Command="dotnet "$(LqlToolDll)" sqlite --input "%(LqlFiles.Identity)" --output "%(LqlFiles.RootDir)%(LqlFiles.Directory)%(LqlFiles.Filename).generated.sql"" Condition="'$(EnableLqlTranspile)' == 'true' and @(LqlFiles) != ''" WorkingDirectory="$(MSBuildProjectDirectory)" StandardOutputImportance="High" @@ -80,7 +104,7 @@ /> <!-- Run SQLite generator CLI to emit .g.cs into Generated folder --> <Exec - Command="dotnet run --project $(MSBuildThisFileDirectory)../DataProvider/DataProvider.csproj -- sqlite --project-dir "$(MSBuildProjectDirectory)" --config "$(MSBuildProjectDirectory)/DataProvider.json" --out "$(MSBuildProjectDirectory)/Generated"" + Command="dotnet "$(DataProviderToolDll)" sqlite --project-dir "$(MSBuildProjectDirectory)" --config "$(MSBuildProjectDirectory)/DataProvider.json" --out "$(MSBuildProjectDirectory)/Generated"" WorkingDirectory="$(MSBuildProjectDirectory)" StandardOutputImportance="High" StandardErrorImportance="High" diff --git a/Lql/LqlWebsite-Eleventy/_site/assets/css/styles.css b/Lql/LqlWebsite-Eleventy/_site/assets/css/styles.css index d4c5b05b..f1ab758e 100644 --- a/Lql/LqlWebsite-Eleventy/_site/assets/css/styles.css +++ 
b/Lql/LqlWebsite-Eleventy/_site/assets/css/styles.css @@ -1,352 +1,219 @@ -/* LQL - Design System CSS - Dark-first, high contrast, minimal, reusable classes */ - :root { - /* Core Colors from LQL Design System */ - --volcanic: #FF4500; - --forest: #228B22; - --obsidian: #1C1C1C; - --amber: #FFA500; - --violet: #8A2BE2; - --charcoal: #36454F; - --ivory: #FFFFF0; - --dark-bg: #0F0F0F; - --darker-bg: #0A0A0A; - --card-bg: #1A1A1A; - --border: #2A2A2A; - - /* Semantic mappings (dark theme default) */ - --bg-primary: var(--dark-bg); - --bg-secondary: var(--darker-bg); - --bg-tertiary: var(--obsidian); - --text-primary: var(--ivory); - --text-secondary: #B0B0B0; - --text-muted: #808080; - --border-color: var(--border); - --code-bg: var(--darker-bg); - --accent: var(--volcanic); - --accent-hover: var(--amber); - - /* Typography */ - --font-sans: 'Inter', -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif; - --font-mono: 'JetBrains Mono', 'Fira Code', Consolas, monospace; - - /* Type Scale */ - --text-xs: 0.75rem; - --text-sm: 0.875rem; - --text-base: 1rem; - --text-lg: 1.125rem; - --text-xl: 1.25rem; - --text-2xl: 1.5rem; - --text-3xl: 1.875rem; - --text-4xl: 2.25rem; - --text-5xl: 3rem; - - /* Spacing */ - --space-1: 0.25rem; - --space-2: 0.5rem; - --space-3: 0.75rem; - --space-4: 1rem; - --space-6: 1.5rem; - --space-8: 2rem; - --space-10: 2.5rem; - --space-12: 3rem; - --space-16: 4rem; - --space-20: 5rem; - - /* Layout */ - --max-width: 1200px; - --header-height: 64px; - --sidebar-width: 260px; - --radius: 8px; - --transition: 200ms ease; -} - -/* Reset */ -*, *::before, *::after { box-sizing: border-box; margin: 0; padding: 0; } - -html { - scroll-behavior: smooth; - scroll-padding-top: calc(var(--header-height) + var(--space-4)); - overflow-x: hidden; -} - + --volcanic: #FF4500; + --forest: #228B22; + --obsidian: #1C1C1C; + --amber: #FFA500; + --violet: #8A2BE2; + --charcoal: #36454F; + --ivory: #FFFFF0; + --dark-bg: #0F0F0F; + --darker-bg: #0A0A0A; + 
--card-bg: #1A1A1A; + --border: #2A2A2A; + --text-primary: #FFFFF0; + --text-secondary: #B0B0B0; + --text-muted: #808080; +} + +* { margin: 0; padding: 0; box-sizing: border-box; } +html { scroll-behavior: smooth; scroll-padding-top: 96px; } body { - font-family: var(--font-sans); - font-size: var(--text-base); - line-height: 1.6; - color: var(--text-primary); - background: var(--bg-primary); - -webkit-font-smoothing: antialiased; - overflow-x: hidden; - width: 100%; - max-width: 100vw; + font-family: 'Inter', sans-serif; + background: var(--dark-bg); + color: var(--text-primary); + line-height: 1.6; + overflow-x: hidden; } -/* Skip link */ .skip-link { - position: absolute; - top: -100%; - left: var(--space-4); - padding: var(--space-2) var(--space-4); - background: var(--volcanic); - color: white; - border-radius: var(--radius); - z-index: 1000; -} -.skip-link:focus { top: var(--space-4); } + position: absolute; + top: -100px; + left: 24px; + padding: 8px 16px; + background: var(--volcanic); + color: white; + border-radius: 8px; + z-index: 1000; +} +.skip-link:focus { top: 16px; } -/* Container */ .container { - width: 100%; - max-width: var(--max-width); - margin: 0 auto; - padding: 0 var(--space-4); + max-width: 1200px; + margin: 0 auto; + padding: 0 24px; } -/* Typography */ -h1, h2, h3, h4, h5, h6 { - font-weight: 600; - line-height: 1.25; - color: var(--text-primary); -} -h1 { font-size: var(--text-4xl); font-weight: 700; } -h2 { font-size: var(--text-3xl); } -h3 { font-size: var(--text-2xl); } -h4 { font-size: var(--text-xl); } - -p { margin-bottom: var(--space-4); color: var(--text-secondary); } - -a { color: var(--volcanic); text-decoration: none; transition: color var(--transition); } -a:hover { color: var(--amber); } - -ul, ol { margin-bottom: var(--space-4); padding-left: var(--space-6); } -li { margin-bottom: var(--space-2); color: var(--text-secondary); } - -/* Code */ -code { - font-family: var(--font-mono); - font-size: 0.9em; - padding: 0.2em 
0.4em; - background: var(--code-bg); - border: 1px solid var(--border-color); - border-radius: 4px; - color: var(--amber); +header { + background: var(--darker-bg); + border-bottom: 1px solid var(--border); + padding: 16px 0; + position: sticky; + top: 0; + z-index: 100; + backdrop-filter: blur(10px); } -pre { - font-family: var(--font-mono); - font-size: var(--text-sm); - line-height: 1.7; - padding: var(--space-4); - background: var(--code-bg); - border: 1px solid var(--border-color); - border-radius: var(--radius); - overflow-x: auto; - margin-bottom: var(--space-4); -} -pre code { padding: 0; background: none; border: none; color: inherit; } - -/* Header */ -.header { - position: sticky; - top: 0; - height: var(--header-height); - background: var(--bg-secondary); - border-bottom: 1px solid var(--border-color); - z-index: 100; - backdrop-filter: blur(10px); -} - -.nav { - display: flex; - align-items: center; - justify-content: space-between; - height: 100%; +.header-content { + display: flex; + align-items: center; + justify-content: space-between; + gap: 24px; } .logo { - display: flex; - align-items: center; - gap: var(--space-2); + display: flex; + align-items: center; + gap: 12px; + text-decoration: none; } -.logo img { height: 32px; } -.logo:hover { opacity: 0.9; } +.logo img { width: 40px; height: 40px; } .logo-text { - font-size: var(--text-xl); - font-weight: 800; - color: var(--amber); + font-size: 24px; + font-weight: 800; + color: var(--amber); } -.nav-links { - display: flex; - align-items: center; - gap: var(--space-6); - list-style: none; - margin: 0; - padding: 0; +nav ul { + display: flex; + list-style: none; + gap: 32px; } - -.nav-link { - font-weight: 500; - color: var(--text-secondary); - transition: color var(--transition); +nav a { + color: var(--text-secondary); + text-decoration: none; + font-weight: 500; + transition: color 0.2s; } -.nav-link:hover, .nav-link.active { color: var(--volcanic); } +nav a:hover { color: var(--volcanic); } -/* 
Site Toggle */ .site-toggle { - display: flex; - background: var(--bg-primary); - border-radius: var(--radius); - padding: 2px; - border: 1px solid var(--border-color); + display: flex; + background: var(--darker-bg); + border-radius: 8px; + padding: 2px; + border: 1px solid var(--border); + margin-right: 24px; } .site-toggle-btn { - padding: var(--space-2) var(--space-4); - font-size: var(--text-sm); - font-weight: 500; - color: var(--text-secondary); - border-radius: 6px; - transition: all var(--transition); + padding: 8px 16px; + font-size: 14px; + font-weight: 500; + color: var(--text-secondary); + border-radius: 6px; + text-decoration: none; + transition: all 0.2s; } .site-toggle-btn:hover { color: var(--text-primary); } .site-toggle-btn.active { - background: var(--amber); - color: var(--obsidian); + background: var(--amber); + color: var(--obsidian); } -.nav-actions { display: flex; align-items: center; gap: var(--space-3); } - -/* Mobile menu toggle */ -.mobile-menu-toggle { - display: none; - flex-direction: column; - gap: 4px; - padding: var(--space-2); - background: transparent; - border: none; - cursor: pointer; -} -.mobile-menu-toggle span { - display: block; - width: 24px; - height: 2px; - background: var(--text-primary); - transition: all var(--transition); +.hero { + padding: 120px 0; + background: linear-gradient(135deg, var(--darker-bg) 0%, var(--obsidian) 100%); + position: relative; + overflow: hidden; +} +.hero::before { + content: ''; + position: absolute; + inset: 0; + background: radial-gradient(circle at 30% 20%, rgba(255, 69, 0, 0.1) 0%, transparent 50%), + radial-gradient(circle at 70% 80%, rgba(34, 139, 34, 0.1) 0%, transparent 50%); + pointer-events: none; +} +.hero-content { + text-align: center; + position: relative; + z-index: 2; +} +.hero h1 { + font-size: 64px; + font-weight: 800; + margin-bottom: 24px; + background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); + -webkit-background-clip: text; + 
-webkit-text-fill-color: transparent; + background-clip: text; +} +.hero .subtitle { + font-size: 24px; + color: var(--text-secondary); + margin: 0 auto 48px; + max-width: 760px; +} + +.cta-buttons { + display: flex; + gap: 24px; + justify-content: center; + margin-bottom: 80px; } - -/* Buttons */ .btn { - display: inline-flex; - align-items: center; - justify-content: center; - gap: var(--space-2); - padding: var(--space-3) var(--space-6); - font-family: var(--font-sans); - font-size: var(--text-base); - font-weight: 600; - border-radius: var(--radius); - border: none; - cursor: pointer; - transition: all var(--transition); - text-decoration: none; + padding: 16px 32px; + border-radius: 8px; + font-weight: 600; + text-decoration: none; + transition: all 0.3s ease; + border: none; + cursor: pointer; + font-size: 16px; } - .btn-primary { - background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); - color: white; + background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); + color: white; } .btn-primary:hover { - transform: translateY(-2px); - box-shadow: 0 8px 25px rgba(255, 69, 0, 0.4); - color: white; + transform: translateY(-2px); + box-shadow: 0 12px 40px rgba(255, 69, 0, 0.4); } - .btn-secondary { - background: transparent; - color: var(--forest); - border: 2px solid var(--forest); + background: transparent; + color: var(--forest); + border: 2px solid var(--forest); } -.btn-secondary:hover { background: var(--forest); color: white; } - -.btn-large { padding: var(--space-4) var(--space-8); font-size: var(--text-lg); } - -/* Hero */ -.hero { - position: relative; - padding: var(--space-20) 0; - text-align: center; - background: linear-gradient(135deg, var(--darker-bg) 0%, var(--obsidian) 100%); - overflow: hidden; -} -.hero::before { - content: ''; - position: absolute; - inset: 0; - background: radial-gradient(circle at 30% 20%, rgba(255, 69, 0, 0.1) 0%, transparent 50%), - radial-gradient(circle at 70% 80%, rgba(34, 139, 34, 
0.1) 0%, transparent 50%); - pointer-events: none; -} -.hero > * { position: relative; z-index: 1; } -.hero h1 { - font-size: var(--text-5xl); - font-weight: 800; - margin-bottom: var(--space-6); - background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); - -webkit-background-clip: text; - -webkit-text-fill-color: transparent; - background-clip: text; -} -.hero-subtitle { - font-size: var(--text-xl); - color: var(--text-secondary); - max-width: 600px; - margin: 0 auto var(--space-8); -} -.hero-buttons { display: flex; gap: var(--space-4); justify-content: center; flex-wrap: wrap; } -.hero-code { - max-width: 800px; - margin: var(--space-12) auto 0; - text-align: left; +.btn-secondary:hover { + background: var(--forest); + color: white; } -/* Code example styling */ -.code-window { - background: var(--card-bg); - border: 1px solid var(--border-color); - border-radius: 12px; - padding: var(--space-8); - overflow: hidden; +.hero-code, +.example-code { + background: var(--card-bg); + border: 1px solid var(--border); + border-radius: 12px; + padding: 32px; + max-width: 800px; + margin: 0 auto; + position: relative; } .code-header { - display: flex; - align-items: center; - gap: 8px; - margin-bottom: var(--space-6); + display: flex; + align-items: center; + gap: 8px; + margin-bottom: 24px; } .code-dot { - width: 12px; - height: 12px; - border-radius: 50%; + width: 12px; + height: 12px; + border-radius: 50%; } .code-dot:nth-child(1) { background: #FF5F57; } .code-dot:nth-child(2) { background: #FFBD2E; } .code-dot:nth-child(3) { background: #28CA42; } .code-title { - margin-left: var(--space-4); - color: var(--text-muted); - font-size: var(--text-sm); + margin-left: 16px; + color: var(--text-muted); + font-size: 14px; } .code-block { - font-family: var(--font-mono); - font-size: var(--text-base); - line-height: 1.8; - color: var(--text-primary); - white-space: pre-wrap; + font-family: 'JetBrains Mono', monospace; + font-size: 16px; + line-height: 1.8; + 
color: var(--text-primary); + white-space: pre-wrap; } - -/* LQL syntax highlighting */ .keyword { color: var(--volcanic); } .operator { color: var(--forest); } .function { color: var(--amber); } @@ -354,437 +221,214 @@ pre code { padding: 0; background: none; border: none; color: inherit; } .comment { color: var(--text-muted); } .identifier { color: var(--text-primary); } -/* Feature cards */ -.features { padding: var(--space-16) 0; } -.features-alt { padding: var(--space-16) 0; background: var(--obsidian); } - +.features { + padding: 120px 0; + background: var(--obsidian); +} +.examples, +.docs-section, +.playground-strip { + padding: 120px 0; + background: var(--dark-bg); +} .section-header { - text-align: center; - margin-bottom: var(--space-12); + text-align: center; + margin-bottom: 80px; } .section-header h2 { - font-size: var(--text-4xl); - font-weight: 700; - margin-bottom: var(--space-4); + font-size: 48px; + font-weight: 700; + margin-bottom: 16px; + color: var(--text-primary); } .section-header p { - font-size: var(--text-lg); - max-width: 600px; - margin: 0 auto; + font-size: 20px; + color: var(--text-secondary); + max-width: 760px; + margin: 0 auto; } .features-grid { - display: grid; - grid-template-columns: repeat(auto-fit, minmax(280px, 1fr)); - gap: var(--space-6); -} - -.card { - padding: var(--space-6); - background: var(--card-bg); - border: 1px solid var(--border-color); - border-radius: 12px; - transition: all 0.3s ease; -} -.card:hover { - transform: translateY(-4px); - border-color: var(--volcanic); - box-shadow: 0 12px 40px rgba(0, 0, 0, 0.3); + display: grid; + grid-template-columns: repeat(auto-fit, minmax(350px, 1fr)); + gap: 32px; +} +.feature-card, +.doc-card, +.examples-section, +.input-section, +.output-section { + background: var(--card-bg); + border: 1px solid var(--border); + border-radius: 12px; + padding: 32px; +} +.feature-card { + transition: all 0.3s ease; +} +.feature-card:hover { + transform: translateY(-4px); + 
border-color: var(--volcanic); + box-shadow: 0 12px 40px rgba(0, 0, 0, 0.3); +} +.feature-icon { + width: 48px; + height: 48px; + background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); + border-radius: 8px; + display: flex; + align-items: center; + justify-content: center; + margin-bottom: 24px; + font-size: 18px; + font-weight: 800; +} +.feature-card h3, +.doc-card h3 { + font-size: 24px; + font-weight: 600; + margin-bottom: 16px; + color: var(--text-primary); +} +.feature-card p, +.doc-card p { + color: var(--text-secondary); + line-height: 1.6; + margin-bottom: 16px; +} + +.examples-grid { + display: grid; + gap: 48px; } - -.card-icon { - width: 48px; - height: 48px; - display: flex; - align-items: center; - justify-content: center; - background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); - color: white; - border-radius: var(--radius); - margin-bottom: var(--space-4); - font-size: var(--text-xl); - font-weight: 700; -} -.card h3 { margin-bottom: var(--space-2); } -.card p { margin: 0; } -.card a { display: inline-block; margin-top: var(--space-3); } - -/* Examples section */ -.examples { padding: var(--space-16) 0; } -.examples-grid { display: grid; gap: var(--space-12); } - .example { - display: grid; - grid-template-columns: 1fr 1fr; - gap: var(--space-12); - align-items: center; + display: grid; + grid-template-columns: 1fr 1fr; + gap: 48px; + align-items: center; } .example:nth-child(even) { direction: rtl; } .example:nth-child(even) > * { direction: ltr; } - .example-content h3 { - font-size: var(--text-2xl); - font-weight: 700; - margin-bottom: var(--space-4); + font-size: 32px; + font-weight: 700; + margin-bottom: 16px; + color: var(--text-primary); } .example-content p { - font-size: var(--text-lg); - margin-bottom: var(--space-6); + font-size: 18px; + color: var(--text-secondary); + margin-bottom: 24px; } .example-features { - list-style: none; - padding: 0; + list-style: none; } .example-features li { - 
display: flex; - align-items: center; - gap: var(--space-3); - margin-bottom: var(--space-3); + display: flex; + align-items: center; + gap: 12px; + margin-bottom: 12px; + color: var(--text-secondary); } .example-features li::before { - content: '\2192'; - color: var(--forest); - font-weight: bold; + content: '→'; + color: var(--forest); + font-weight: bold; } - -.example-code { - background: var(--card-bg); - border: 1px solid var(--border-color); - border-radius: 12px; - padding: var(--space-8); +.docs-grid { + display: grid; + gap: 32px; } - -/* F# section */ -.fsharp { padding: var(--space-16) 0; background: var(--obsidian); } -.fsharp-content { - display: grid; - grid-template-columns: 1fr 1fr; - gap: var(--space-12); - align-items: center; -} -.fsharp-info h3 { - font-size: var(--text-2xl); - font-weight: 700; - margin-bottom: var(--space-4); -} -.fsharp-info p { - font-size: var(--text-lg); - margin-bottom: var(--space-6); - line-height: 1.7; -} -.fsharp-features { - list-style: none; - padding: 0; -} -.fsharp-features li { - display: flex; - align-items: center; - gap: var(--space-3); - margin-bottom: var(--space-3); -} -.fsharp-features li::before { - content: '\2192'; - color: var(--violet); - font-weight: bold; -} -.fsharp-code { - background: var(--card-bg); - border: 1px solid var(--border-color); - border-radius: 12px; - padding: var(--space-8); +.doc-card { + overflow-x: auto; } - -/* Playground */ -.playground { padding: var(--space-16) 0; } -.playground-content { - display: grid; - grid-template-columns: 1fr 1fr; - gap: var(--space-8); - margin-bottom: var(--space-8); -} -.playground-panel { - background: var(--card-bg); - border: 1px solid var(--border-color); - border-radius: 12px; - padding: var(--space-6); -} -.playground-panel h3 { - font-size: var(--text-lg); - font-weight: 600; - margin-bottom: var(--space-4); -} -.playground-controls { - display: flex; - gap: var(--space-4); - align-items: center; - margin-bottom: var(--space-4); -} 
-.dialect-selector { - background: var(--darker-bg); - border: 1px solid var(--border-color); - border-radius: 6px; - padding: var(--space-2) var(--space-3); - color: var(--text-primary); - font-size: var(--text-sm); -} -.dialect-selector:focus { outline: none; border-color: var(--volcanic); } -.convert-btn { - background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); - color: white; - border: none; - padding: var(--space-2) var(--space-4); - border-radius: 6px; - font-weight: 600; - cursor: pointer; - transition: all 0.3s ease; -} -.convert-btn:hover { - transform: translateY(-1px); - box-shadow: 0 6px 20px rgba(255, 69, 0, 0.3); -} -.convert-btn:disabled { opacity: 0.6; cursor: not-allowed; transform: none; } -.lql-input { - width: 100%; - height: 300px; - background: var(--darker-bg); - border: 1px solid var(--border-color); - border-radius: var(--radius); - padding: var(--space-4); - font-family: var(--font-mono); - font-size: var(--text-sm); - color: var(--text-primary); - resize: vertical; -} -.lql-input:focus { outline: none; border-color: var(--volcanic); } -.sql-output { - background: var(--darker-bg); - border: 1px solid var(--border-color); - border-radius: var(--radius); - padding: var(--space-4); - font-family: var(--font-mono); - font-size: var(--text-sm); - color: var(--text-primary); - min-height: 300px; - white-space: pre-wrap; - overflow-y: auto; -} -.error-message { - background: rgba(255, 69, 0, 0.1); - border: 1px solid var(--volcanic); - border-radius: 6px; - padding: var(--space-3); - color: var(--volcanic); - font-size: var(--text-sm); - margin-top: var(--space-4); -} -.example-buttons { - display: flex; - gap: var(--space-3); - flex-wrap: wrap; -} -.example-btn { - background: transparent; - color: var(--forest); - border: 1px solid var(--forest); - padding: var(--space-2) var(--space-4); - border-radius: 6px; - font-size: var(--text-sm); - cursor: pointer; - transition: all var(--transition); -} -.example-btn:hover { 
background: var(--forest); color: white; } - -/* Docs layout */ -.docs-layout { - display: grid; - grid-template-columns: var(--sidebar-width) 1fr; - min-height: calc(100vh - var(--header-height)); +pre { + background: var(--darker-bg); + border: 1px solid var(--border); + border-radius: 8px; + padding: 16px; + margin: 16px 0 24px; + overflow-x: auto; } - -.sidebar { - position: sticky; - top: var(--header-height); - height: calc(100vh - var(--header-height)); - overflow-y: auto; - padding: var(--space-6); - background: var(--bg-secondary); - border-right: 1px solid var(--border-color); +code { + font-family: 'JetBrains Mono', monospace; + color: var(--amber); + font-size: 14px; } - -.sidebar-section { margin-bottom: var(--space-6); } -.sidebar-section h4 { - font-size: var(--text-sm); - font-weight: 600; - text-transform: uppercase; - letter-spacing: 0.05em; - color: var(--text-muted); - margin-bottom: var(--space-3); -} -.sidebar-section ul { list-style: none; padding: 0; margin: 0; } -.sidebar-section li { margin: 0; } -.sidebar-section a { - display: block; - padding: var(--space-2) var(--space-3); - color: var(--text-secondary); - border-radius: 4px; - transition: all var(--transition); -} -.sidebar-section a:hover, .sidebar-section a.active { - background: var(--bg-tertiary); - color: var(--volcanic); +pre code { + color: var(--text-primary); + line-height: 1.7; } - -.docs-content { - padding: var(--space-8); - max-width: 900px; -} -.docs-content h1 { margin-bottom: var(--space-6); } -.docs-content h2 { - margin-top: var(--space-10); - margin-bottom: var(--space-4); - padding-bottom: var(--space-2); - border-bottom: 1px solid var(--border-color); -} -.docs-content h3 { margin-top: var(--space-8); margin-bottom: var(--space-3); } - -/* Tables */ table { - width: 100%; - border-collapse: collapse; - margin-bottom: var(--space-6); - background: var(--card-bg); - border-radius: var(--radius); - overflow: hidden; -} -th, td { - padding: var(--space-3) 
var(--space-4); - text-align: left; - border-bottom: 1px solid var(--border-color); + width: 100%; + border-collapse: collapse; + margin: 16px 0 24px; + overflow: hidden; + border-radius: 8px; +} +th, +td { + text-align: left; + padding: 12px 16px; + border-bottom: 1px solid var(--border); + color: var(--text-secondary); + vertical-align: top; } th { - background: var(--obsidian); - color: var(--text-primary); - font-weight: 600; - text-transform: uppercase; - font-size: var(--text-sm); - letter-spacing: 0.05em; -} -td code { - background: var(--bg-tertiary); - padding: var(--space-1) var(--space-2); - border-radius: 4px; - font-size: var(--text-sm); + background: var(--obsidian); + color: var(--text-primary); + font-weight: 700; } -/* Syntax highlighting (Eleventy plugin) */ -.token.comment { color: #6B7280; } -.token.keyword { color: var(--volcanic); } -.token.string { color: var(--violet); } -.token.function, .token.class-name { color: var(--amber); } -.token.number { color: var(--amber); } -.token.operator { color: var(--forest); } - -/* Footer */ -.footer { - padding: var(--space-16) 0 var(--space-8); - background: var(--bg-secondary); - border-top: 1px solid var(--border-color); -} -.footer-grid { - display: grid; - grid-template-columns: repeat(auto-fit, minmax(180px, 1fr)); - gap: var(--space-8); - margin-bottom: var(--space-8); -} -.footer-section h3 { - font-size: var(--text-sm); - font-weight: 600; - text-transform: uppercase; - letter-spacing: 0.05em; - margin-bottom: var(--space-4); -} -.footer-section ul { list-style: none; padding: 0; margin: 0; } -.footer-section li { margin-bottom: var(--space-2); } -.footer-section a { color: var(--text-secondary); font-size: var(--text-sm); } -.footer-section a:hover { color: var(--volcanic); } +.text-center { text-align: center; } -.footer-bottom { - padding-top: var(--space-8); - border-top: 1px solid var(--border-color); - text-align: center; +footer { + background: var(--darker-bg); + border-top: 1px solid 
var(--border); + padding: 48px 0; + text-align: center; +} +.footer-content { color: var(--text-muted); } +.footer-links { + display: flex; + justify-content: center; + gap: 32px; + margin-bottom: 24px; + flex-wrap: wrap; } -.footer-bottom p { - font-size: var(--text-sm); - color: var(--text-muted); - margin-bottom: var(--space-2); +.footer-links a { + color: var(--text-secondary); + text-decoration: none; + transition: color 0.2s; } +.footer-links a:hover { color: var(--volcanic); } -/* Responsive */ -@media (max-width: 1024px) { - .docs-layout { grid-template-columns: 1fr; } - .sidebar { - display: none; - position: fixed; - top: var(--header-height); - left: 0; - width: 100%; - height: calc(100vh - var(--header-height)); - z-index: 50; - } - .sidebar.open { display: block; } +@media (max-width: 980px) { + .header-content { flex-wrap: wrap; } + nav ul { gap: 18px; flex-wrap: wrap; } + .site-toggle { margin-right: 0; } } @media (max-width: 768px) { - :root { - --text-5xl: 2.25rem; - --text-4xl: 1.875rem; - --text-3xl: 1.5rem; - } - - .container { padding: 0 var(--space-3); } - - .nav-links { - display: none; - position: fixed; - top: var(--header-height); - left: 0; - width: 100%; - padding: var(--space-4); - background: var(--bg-secondary); - border-bottom: 1px solid var(--border-color); - flex-direction: column; - gap: var(--space-2); - } - .nav-links.open { display: flex; } - - .mobile-menu-toggle { display: flex; } - - .nav-actions .btn { display: none; } - - .hero { padding: var(--space-12) 0; } - .hero h1 { font-size: var(--text-4xl); } - .hero-subtitle { font-size: var(--text-base); } - .hero-buttons { flex-direction: column; align-items: center; width: 100%; } - .hero-buttons .btn { width: 100%; max-width: 280px; } - .hero-code { margin: var(--space-8) auto 0; } - - .example { grid-template-columns: 1fr; } - .example:nth-child(even) { direction: ltr; } - - .fsharp-content { grid-template-columns: 1fr; } - - .playground-content { grid-template-columns: 1fr; 
} - - .features-grid { grid-template-columns: 1fr; } - - .card { padding: var(--space-4); } - .docs-content { padding: var(--space-4); } - pre { font-size: var(--text-xs); padding: var(--space-3); } - table { display: block; overflow-x: auto; } - .footer-grid { gap: var(--space-6); } + .hero { padding: 80px 0; } + .hero h1 { font-size: 48px; } + .hero .subtitle { font-size: 20px; } + .cta-buttons { flex-direction: column; align-items: center; } + .section-header { margin-bottom: 48px; } + .section-header h2 { font-size: 36px; } + .features, + .examples, + .docs-section, + .playground-strip { padding: 72px 0; } + .features-grid { grid-template-columns: 1fr; } + .example { grid-template-columns: 1fr; } + .example { text-align: center; } + .example:nth-child(even) { direction: ltr; } + nav ul { display: none; } + .hero-code, + .example-code, + .feature-card, + .doc-card { padding: 24px; } + .code-block, + pre code { font-size: 13px; } } - -/* Utilities */ -.text-center { text-align: center; } -.mt-8 { margin-top: var(--space-8); } -.mb-8 { margin-bottom: var(--space-8); } diff --git a/Lql/LqlWebsite-Eleventy/_site/assets/js/playground.js b/Lql/LqlWebsite-Eleventy/_site/assets/js/playground.js deleted file mode 100644 index 65c2f148..00000000 --- a/Lql/LqlWebsite-Eleventy/_site/assets/js/playground.js +++ /dev/null @@ -1,80 +0,0 @@ -(function() { - 'use strict'; - - const examples = { - simple: 'users |> select(users.id, users.name, users.email)', - join: 'users\n|> join(orders, on = users.id = orders.user_id)\n|> select(users.name, orders.total, orders.status)', - filter: 'employees\n|> select(employees.id, employees.name, employees.salary)\n|> filter(fn(row) => row.employees.salary > 50000 and row.employees.department = \'Engineering\')', - aggregate: 'orders\n|> group_by(orders.user_id)\n|> select(\n orders.user_id,\n count(*) as order_count,\n sum(orders.total) as total_amount,\n avg(orders.total) as avg_amount\n)\n|> having(fn(group) => count(*) > 2)\n|> 
order_by(total_amount desc)', - complex: '-- Complex analytics query\nlet joined =\n users\n |> join(orders, on = users.id = orders.user_id)\n |> filter(fn(row) => row.orders.status = \'completed\')\n\njoined\n|> group_by(users.id)\n|> select(\n users.name,\n count(*) as total_orders,\n sum(orders.total) as revenue,\n avg(orders.total) as avg_order_value\n)\n|> filter(fn(row) => row.revenue > 1000)\n|> order_by(revenue desc)\n|> limit(10)' - }; - - const lqlInput = document.getElementById('lql-input'); - const sqlOutput = document.getElementById('sql-output'); - const errorMessage = document.getElementById('error-message'); - const convertBtn = document.getElementById('convert-btn'); - const dialectSelector = document.getElementById('dialect-selector'); - const outputTitle = document.getElementById('output-title'); - - // Load default example - lqlInput.value = examples.simple; - - // Update output title when dialect changes - dialectSelector.addEventListener('change', function() { - outputTitle.textContent = this.value === 'SqlServer' ? 'SQL Server Output' : 'PostgreSQL Output'; - }); - - // Convert button - calls the Blazor WASM transpiler via JS interop - convertBtn.addEventListener('click', async function() { - const lql = lqlInput.value.trim(); - if (!lql) { - showError('Please enter some LQL code to convert.'); - return; - } - - convertBtn.disabled = true; - convertBtn.textContent = 'Converting...'; - errorMessage.style.display = 'none'; - sqlOutput.textContent = 'Converting...'; - - try { - // Call the Blazor WASM transpiler if available - if (window.lqlTranspile) { - const dialect = dialectSelector.value; - const result = await window.lqlTranspile(lql, dialect); - if (result.error) { - showError(result.error); - sqlOutput.textContent = ''; - } else { - sqlOutput.textContent = result.sql; - } - } else { - // Fallback: show a message that the transpiler is loading or unavailable - sqlOutput.textContent = 'The LQL transpiler is loading. 
Please wait a moment and try again.\n\nIf this persists, the Blazor WASM runtime may not be available.'; - } - } catch (err) { - showError('An unexpected error occurred: ' + err.message); - sqlOutput.textContent = ''; - } finally { - convertBtn.disabled = false; - convertBtn.textContent = 'Convert to SQL'; - } - }); - - // Example buttons - document.querySelectorAll('.example-btn[data-example]').forEach(function(btn) { - btn.addEventListener('click', function() { - const key = this.getAttribute('data-example'); - if (examples[key]) { - lqlInput.value = examples[key]; - errorMessage.style.display = 'none'; - sqlOutput.textContent = "Click 'Convert to SQL' to see the result."; - } - }); - }); - - function showError(msg) { - errorMessage.textContent = msg; - errorMessage.style.display = 'block'; - } -})(); diff --git a/Lql/LqlWebsite-Eleventy/_site/docs/aggregation/index.html b/Lql/LqlWebsite-Eleventy/_site/docs/aggregation/index.html deleted file mode 100644 index b2746e62..00000000 --- a/Lql/LqlWebsite-Eleventy/_site/docs/aggregation/index.html +++ /dev/null @@ -1,394 +0,0 @@ -<!DOCTYPE html> -<html lang="en"> -<head> - <meta charset="UTF-8"> - <meta name="viewport" content="width=device-width, initial-scale=1.0"> - <title>Aggregation - - - - - - - - - - - - - - - - - - - - - - - - - -

# Aggregation

LQL provides full aggregation support with group by, aggregate functions, and having clauses.

## Aggregate Functions
| Function | Description |
|----------|-------------|
| `count(*)` | Count all rows |
| `sum(column)` | Sum of values |
| `avg(column)` | Average of values |
| `min(column)` | Minimum value |
| `max(column)` | Maximum value |

## Basic Aggregation
```
orders
|> group_by(orders.status)
|> select(
    orders.status,
    count(*) as order_count
)
```

## Multiple Group Columns
```
orders
|> group_by(orders.user_id, orders.status)
|> select(
    orders.user_id,
    orders.status,
    count(*) as order_count,
    sum(orders.total) as total_amount
)
```

## Having Clause

Filter groups after aggregation using lambda expressions:
```
orders
|> group_by(orders.user_id)
|> having(fn(group) => count(*) > 2)
|> select(
    orders.user_id,
    count(*) as order_count,
    sum(orders.total) as total_amount,
    avg(orders.total) as avg_amount
)
```

## Complete Analytics Query
```
orders
|> group_by(orders.user_id, orders.status)
|> select(
    orders.user_id,
    orders.status,
    count(*) as order_count,
    sum(orders.total) as total_amount,
    avg(orders.total) as avg_amount
)
|> having(fn(group) => count(*) > 2)
|> order_by(total_amount desc)
```
diff --git a/Lql/LqlWebsite-Eleventy/_site/docs/ai-integration/index.html b/Lql/LqlWebsite-Eleventy/_site/docs/ai-integration/index.html
deleted file mode 100644
index 5ae3da91..00000000
--- a/Lql/LqlWebsite-Eleventy/_site/docs/ai-integration/index.html
+++ /dev/null
@@ -1,584 +0,0 @@
# AI-Powered Completions

The LQL Language Server has built-in support for AI-powered code completions. Connect a local model via Ollama or use a cloud provider to get intelligent, context-aware query suggestions alongside the standard schema and keyword completions.

## How It Works
```mermaid
graph TD
    A[VS Code Editor] <-->|completions| B[lql-lsp]
    B --> C["Schema\ncolumns, tables\npriority 0–4"]
    B --> D["Keywords\nfunctions, operators\npriority 1–3"]
    B --> E["AI Model\nasync with timeout\npriority 6"]
```

On every completion request, the LSP runs three sources in parallel:

1. Schema completions (priority 0-4) - Table names, column names from your database
2. Keyword completions (priority 1-3) - Pipeline operations, functions, keywords
3. AI completions (priority 6) - Intelligent suggestions from a language model

All results are merged and sorted by priority. Schema and keyword completions always appear first; AI suggestions supplement them at the bottom. If the AI model is slow or unavailable, you still get instant schema and keyword completions.


## Timeout Enforcement

AI completions are wrapped in a configurable timeout (default: 2000ms). If the model doesn't respond in time, the LSP silently drops the AI results and returns only schema/keyword completions. This guarantees the editor never feels sluggish, regardless of AI model latency.
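This wait-with-deadline pattern can be sketched as follows. This is a minimal illustration, not the actual lql-lsp implementation; the function names and the hard-coded completion items are assumptions:

```python
import asyncio

async def ai_completions(prompt: str) -> list[str]:
    # Stand-in for the real model call; may be arbitrarily slow.
    await asyncio.sleep(5)
    return ["ai suggestion"]

async def complete(prompt: str, timeout_ms: int = 2000) -> list[str]:
    # Schema/keyword completions are cheap and always available.
    schema_items = ["users.id", "users.name"]
    try:
        ai_items = await asyncio.wait_for(
            ai_completions(prompt), timeout=timeout_ms / 1000
        )
    except asyncio.TimeoutError:
        ai_items = []  # drop AI results silently; the editor stays responsive
    return schema_items + ai_items

print(asyncio.run(complete("users |> sel", timeout_ms=100)))
# → ['users.id', 'users.name']
```

The key property is that a slow model degrades the result set, never the latency.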


## What the AI Model Receives

Every completion request sends the AI model rich context about your current editing state:

| Context | Description |
|---------|-------------|
| Full document | The complete `.lql` file text |
| Cursor position | Line and column number |
| Line prefix | Text from the start of the line to the cursor |
| Word prefix | The partial word being typed |
| File URI | Path to the current file |
| Table names | All tables from the database schema |
| Schema description | Compact schema: `users(id uuid PK NOT NULL, name text, email text)` |

The schema description gives the model full knowledge of your database structure, so it can suggest syntactically valid and schema-aware queries.
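As an illustration, a compact schema description like the one above can be rendered from table metadata roughly like this. The data structures here are hypothetical, not the LSP's actual types:

```python
def describe_schema(tables: dict[str, list[tuple[str, str, bool, bool]]]) -> str:
    """Render tables as compact one-line descriptions.

    Each column is (name, sql_type, is_primary_key, is_not_null)."""
    parts = []
    for table, columns in tables.items():
        cols = []
        for name, sql_type, pk, not_null in columns:
            col = f"{name} {sql_type}"
            if pk:
                col += " PK"
            if not_null:
                col += " NOT NULL"
            cols.append(col)
        parts.append(f"{table}({', '.join(cols)})")
    return ", ".join(parts)

schema = {
    "users": [
        ("id", "uuid", True, True),
        ("name", "text", False, False),
        ("email", "text", False, False),
    ]
}
print(describe_schema(schema))
# → users(id uuid PK NOT NULL, name text, email text)
```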

## Quick Start with Ollama

Ollama runs language models locally on your machine. No API keys, no cloud, no data leaves your laptop.


### 1. Install Ollama


Download and install from ollama.com.


### 2. Pull a Code Model

```bash
ollama pull qwen2.5-coder:1.5b
```

### 3. Configure VS Code

Add to your settings.json:

-
```json
{
  "lql.aiProvider": {
    "provider": "ollama",
    "endpoint": "http://localhost:11434/api/generate",
    "model": "qwen2.5-coder:1.5b",
    "enabled": true
  }
}
```

### 4. Start Writing LQL

-

Open any .lql file and start typing. AI suggestions appear in the completion list alongside schema and keyword completions, marked as Snippet items.

| Model | Parameters | Speed | Quality | Best For |
|-------|------------|-------|---------|----------|
| qwen2.5-coder:1.5b | 1.5B | Fast | Good | Daily use, quick responses |
| deepseek-coder:1.3b | 1.3B | Fast | Good | Lightweight alternative |
| codellama:7b | 7B | Moderate | Better | Complex queries, more context |
| qwen2.5-coder:7b | 7B | Moderate | Better | Higher quality suggestions |

For the best experience, start with `qwen2.5-coder:1.5b` - it provides good suggestions with minimal latency.


## Provider Configuration


The AI provider is configured via initializationOptions during the LSP handshake, which VS Code passes from your settings:

-
```json
{
  "lql.aiProvider": {
    "provider": "ollama",
    "endpoint": "http://localhost:11434/api/generate",
    "model": "qwen2.5-coder:1.5b",
    "apiKey": "",
    "timeoutMs": 2000,
    "enabled": true
  }
}
```

### Configuration Fields
| Field | Type | Default | Description |
|-------|------|---------|-------------|
| `provider` | string | required | Provider type: `ollama`, `openai`, `anthropic`, `custom` |
| `endpoint` | string | required | Full URL of the API endpoint |
| `model` | string | `"default"` | Model identifier (provider-specific) |
| `apiKey` | string | `null` | API key for cloud providers |
| `timeoutMs` | number | `2000` | Maximum time to wait for AI response (ms) |
| `enabled` | boolean | `true` | Enable/disable AI completions |

## Supported Providers

### Ollama (Local)

Runs entirely on your machine. The LSP calls the Ollama /api/generate endpoint and injects the LQL language reference as system context, giving the model knowledge of LQL syntax.

-
```json
{
  "provider": "ollama",
  "endpoint": "http://localhost:11434/api/generate",
  "model": "qwen2.5-coder:1.5b"
}
```
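For illustration, a request to the Ollama `/api/generate` endpoint with the language reference as system context could be assembled along these lines. The prompt wording and helper name are assumptions, not the lql-lsp source; the payload fields (`model`, `system`, `prompt`, `stream`, `options.temperature`, `options.num_predict`) are standard Ollama generate-API parameters:

```python
import json

def build_ollama_request(model, language_reference, document, cursor, schema):
    """Assemble an Ollama /api/generate payload (hypothetical prompt layout)."""
    return {
        "model": model,
        # LQL language reference injected as system context.
        "system": language_reference,
        "prompt": (
            f"Schema: {schema}\n"
            f"Cursor: line {cursor[0]}, column {cursor[1]}\n"
            f"Document:\n{document}\n"
            "Reply with a JSON array of completion strings."
        ),
        "stream": False,
        # Low temperature for focused output; cap at 256 tokens.
        "options": {"temperature": 0.1, "num_predict": 256},
    }

payload = build_ollama_request(
    "qwen2.5-coder:1.5b",
    "LQL is a pipeline query language...",
    "users |> sel",
    (1, 13),
    "users(id uuid PK NOT NULL, name text, email text)",
)
print(json.dumps(payload, indent=2))
```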

The Ollama provider:


- Sends the full document, cursor position, and schema as a structured prompt
- Uses low temperature (0.1) for deterministic, focused completions
- Limits response to 256 tokens for fast turnaround
- Parses the model's JSON array response into completion items
- Handles markdown code fence wrapping in model responses
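The response-parsing behavior described above can be sketched as follows; this is a simplified illustration (the function name and error handling are assumptions), showing why fence handling matters: models often wrap their JSON array in a markdown code fence:

```python
import json

def parse_model_response(text: str) -> list[str]:
    """Extract a JSON array of suggestions from a model response,
    tolerating a surrounding markdown code fence."""
    cleaned = text.strip()
    if cleaned.startswith("```"):
        lines = cleaned.splitlines()
        # Drop the opening fence (with optional language tag) and closing fence.
        if lines and lines[0].startswith("```"):
            lines = lines[1:]
        if lines and lines[-1].strip() == "```":
            lines = lines[:-1]
        cleaned = "\n".join(lines)
    try:
        items = json.loads(cleaned)
    except json.JSONDecodeError:
        return []  # malformed output contributes no completions
    return [s for s in items if isinstance(s, str)]

raw = '```json\n["|> select(users.id)", "|> filter(fn(row) => ...)"]\n```'
print(parse_model_response(raw))
# → ['|> select(users.id)', '|> filter(fn(row) => ...)']
```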

### OpenAI / Anthropic / Custom (Cloud)

Configure any OpenAI-compatible or custom endpoint:

-
```json
{
  "provider": "openai",
  "endpoint": "https://api.openai.com/v1/completions",
  "model": "gpt-4",
  "apiKey": "sk-..."
}
```

Cloud providers require an API key. The same context (document, cursor, schema) is sent to the model.

-

Custom Providers

-

Set provider to "custom" and point endpoint to any API that accepts the same prompt format. The LSP logs the configuration on startup so you can verify it's active.

-

How AI Completions Merge

-

The completion pipeline works as follows:

-
1. Schema + keyword completions are computed synchronously (instant)
2. AI completions are requested asynchronously with a timeout
3. If AI responds within the timeout, results are appended to the list
4. If AI times out, only schema + keyword results are returned
5. All items are sorted by sort_priority before sending to the editor

Priority 0: Column completions (users.id, users.name)
Priority 1: Pipeline operations (select, filter, join)
Priority 2: Functions (count, sum, avg, concat)
Priority 3: Keywords (let, fn, as, and, or)
Priority 4: Table names (users, orders, products)
Priority 5: Let bindings (active_users, joined)
Priority 6: AI suggestions (context-aware snippets)
-
-

This means AI suggestions never push schema completions out of view - they always appear at the bottom of the list as supplementary suggestions.
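The timeout-bounded merge can be sketched in Python (illustrative only: the actual server is written in Rust, and the function names here are invented):

```python
import asyncio

async def fetch_ai_completions() -> list[str]:
    # Stand-in for the HTTP request to the configured AI provider.
    await asyncio.sleep(0.01)
    return ["ai_suggest_filter"]

async def complete(timeout_ms: int = 2000) -> list[str]:
    # Schema + keyword completions are computed synchronously.
    items = ["users.id", "select"]
    try:
        # The AI call is bounded by the configured timeout.
        ai_items = await asyncio.wait_for(fetch_ai_completions(), timeout_ms / 1000)
        items.extend(ai_items)  # appended, so they land after schema results
    except asyncio.TimeoutError:
        pass  # on timeout, return schema + keyword results only
    return items

print(asyncio.run(complete()))  # ['users.id', 'select', 'ai_suggest_filter']
```

Either way, the editor receives the schema and keyword items; the AI items only ever arrive as a suffix.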

-

Disabling AI

-

To disable AI completions without removing the configuration:

-
{
  "lql.aiProvider": {
    "provider": "ollama",
    "endpoint": "http://localhost:11434/api/generate",
    "model": "qwen2.5-coder:1.5b",
    "enabled": false
  }
}
-

Or simply remove the lql.aiProvider section from your settings.

-

Troubleshooting

-

No AI completions appearing

-
1. Check the LQL Language Server output channel (View > Output > LQL Language Server) for provider activation messages
2. Verify Ollama is running: curl http://localhost:11434/api/tags
3. Verify the model is pulled: ollama list
4. Check enabled is not set to false

AI completions are slow

-
1. Try a smaller model (qwen2.5-coder:1.5b instead of codellama:7b)
2. Increase timeoutMs if you prefer waiting for better results
3. Ensure Ollama has enough RAM (1.5B models need ~2GB, 7B models need ~8GB)

AI suggestions are irrelevant

-
1. Ensure your database is connected - schema context dramatically improves AI suggestions
2. Try a different model - qwen2.5-coder tends to produce better LQL-specific completions
3. The LQL reference document is automatically injected as system context for Ollama

Verifying the pipeline works

-

Use the built-in test provider to confirm AI completions flow end-to-end:

-
{
  "lql.aiProvider": {
    "provider": "test",
    "endpoint": "http://localhost",
    "enabled": true
  }
}
-

This returns deterministic completions (like ai_suggest_filter, ai_suggest_join) without any external service, proving the full pipeline works.


Database Configuration

-

The LQL Language Server can connect to your PostgreSQL database to provide schema-aware features like column completions, table hover, and qualified column hover.

-

Why Connect a Database?

-

Without a database connection, the LSP still provides:

-
- Keyword and function completions
- Pipeline operation suggestions
- Parse error diagnostics
- Hover documentation for LQL constructs
- Document formatting and symbols

With a database connection, you additionally get:

-
- Column completions - Type users. and see all columns with types
- Table name completions - See all tables with column counts
- Table hover - Hover over a table name to see its full schema
- Column hover - Hover over users.email to see type, nullability, and PK status

Connection Methods

-

The LSP resolves the database connection in this priority order:

1. VS Code Setting: lql.connectionString

Add a connection string to your VS Code settings.json:

-
{
  "lql.connectionString": "host=localhost dbname=myapp user=postgres password=secret"
}
-

This is passed to the LSP via initializationOptions.connectionString.

-

2. Environment Variable: LQL_CONNECTION_STRING

-
export LQL_CONNECTION_STRING="host=localhost dbname=myapp user=postgres password=secret"
-

3. Environment Variable: DATABASE_URL

-
export DATABASE_URL="postgres://postgres:secret@localhost/myapp"
-

Supported Connection Formats

-

The LSP accepts multiple PostgreSQL connection string formats and normalizes them automatically.

-

libpq Format

-

The native PostgreSQL format:

-
host=localhost dbname=myapp user=postgres password=secret
-
-

With port:

-
host=localhost port=5433 dbname=myapp user=postgres password=secret
-
-

Npgsql Format (.NET style)

-

Semicolon-delimited key=value pairs. These are automatically converted to libpq format:

-
Host=localhost;Database=myapp;Username=postgres;Password=secret
-
-

Mapping:

-
- Host -> host
- Database -> dbname
- Username -> user
- Password -> password
- Port -> port
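The mapping above can be exercised with a small sketch (illustrative only; the real converter lives inside the Rust LSP):

```python
# Npgsql-style keys (lowercased) mapped to their libpq equivalents.
NPGSQL_TO_LIBPQ = {
    "host": "host",
    "database": "dbname",
    "username": "user",
    "password": "password",
    "port": "port",
}

def npgsql_to_libpq(conn: str) -> str:
    """Convert 'Host=x;Database=y;...' into 'host=x dbname=y ...'."""
    parts = []
    for pair in conn.split(";"):
        if not pair.strip():
            continue
        key, _, value = pair.partition("=")
        libpq_key = NPGSQL_TO_LIBPQ.get(key.strip().lower(), key.strip().lower())
        parts.append(f"{libpq_key}={value.strip()}")
    return " ".join(parts)

print(npgsql_to_libpq("Host=localhost;Database=myapp;Username=postgres;Password=secret"))
# host=localhost dbname=myapp user=postgres password=secret
```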

URI Format

-

PostgreSQL connection URI:

-
postgres://postgres:secret@localhost/myapp
postgresql://postgres:secret@localhost:5433/myapp
-
-

Schema Introspection

-

On startup (and when the connection is available), the LSP queries information_schema.columns and information_schema.key_column_usage to discover:

| Metadata | Source |
|----------|--------|
| Table names | information_schema.columns |
| Column names | information_schema.columns |
| Column types | data_type column |
| Nullability | is_nullable column |
| Primary keys | information_schema.key_column_usage |
-

The schema is cached in memory for fast lookups. Connection timeout is 10 seconds, query timeout is 30 seconds.
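The column introspection is roughly equivalent to a query like this (a sketch; the exact SQL the LSP runs may differ):

```sql
SELECT table_name, column_name, data_type, is_nullable
FROM information_schema.columns
WHERE table_schema = 'public'
ORDER BY table_name, ordinal_position;
```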

-

Graceful Degradation

-

If the database is unreachable or the connection string is invalid:

-
- The LSP logs the error and continues without schema
- All non-schema features remain fully functional
- No error is shown to the user (check the LQL Language Server output channel for diagnostics)

This means you can use the extension without any database - you just won't get table/column completions.

-

Schema-Aware Features in Detail

-

Column Completions

-

When you type a table name followed by ., the LSP shows all columns for that table:

-
users.
-
-

Completion list shows:

-
id       uuid (PK) NOT NULL
name     text NOT NULL
email    text
status   text
-
-

Table Completions

-

Table names appear in the completion list with metadata:

-
users    (4 columns: id, name, email, status)
orders   (6 columns: id, user_id, total, status, ...)
-
-

Table Hover

-

Hovering over a table name shows the full schema:

-
Table: users

| Column | Type | PK | Nullable |
|--------|------|----|----------|
| id     | uuid | Y  | N        |
| name   | text |    | N        |
| email  | text |    | Y        |
| status | text |    | Y        |
-
-

Qualified Column Hover

-

Hovering over users.email shows:

-
Column: users.email
Type: text
Nullable: yes
Primary Key: no
-
-

Troubleshooting

-

No schema completions

-
1. Check the LQL Language Server output channel (View > Output > LQL Language Server)
2. Verify your connection string is correct
3. Ensure PostgreSQL is running and accessible
4. Check firewall/network rules

Connection string not picked up

-
1. VS Code settings take priority over environment variables
2. Restart VS Code after changing environment variables
3. Try the libpq format if other formats don't work

Schema is stale

-

The schema is fetched once on startup. Restart the language server to refresh:

-
1. Open the command palette (Ctrl+Shift+P)
2. Run Developer: Reload Window

F# Type Provider

-

The LQL Type Provider brings compile-time type checking to your LQL queries in F#. Write queries with IntelliSense support, catch errors before runtime, and enjoy seamless integration with your F# codebase.

-

Installation

-
<PackageReference Include="Lql.TypeProvider.FSharp" Version="*" />
-

Basic Usage

-
open Lql

// Define types with validated LQL - errors caught at COMPILE TIME
type GetUsers = LqlCommand<"Users |> select(Users.Id, Users.Name, Users.Email)">
type ActiveUsers = LqlCommand<"Users |> filter(fn(row) => row.Status = 'active') |> select(*)">

// Access generated SQL and original query
let sql = GetUsers.Sql      // Generated SQL string
let query = GetUsers.Query  // Original LQL string
-

What Gets Validated

-

The type provider validates your LQL at compile time and generates two properties:

-
- Query - The original LQL query string
- Sql - The generated SQL (SQLite dialect)

Query Examples

-
// Select with columns
type SelectColumns = LqlCommand<"Users |> select(Users.Id, Users.Name, Users.Email)">

// Filtering with AND/OR
type FilterComplex = LqlCommand<"Users |> filter(fn(row) => row.Users.Age > 18 and row.Users.Status = 'active') |> select(*)">

// Joins
type JoinQuery = LqlCommand<"Users |> join(Orders, on = Users.Id = Orders.UserId) |> select(Users.Name, Orders.Total)">
type LeftJoin = LqlCommand<"Users |> left_join(Orders, on = Users.Id = Orders.UserId) |> select(*)">

// Aggregations with GROUP BY and HAVING
type GroupBy = LqlCommand<"Orders |> group_by(Orders.UserId) |> select(Orders.UserId, count(*) as order_count)">
type Having = LqlCommand<"Orders |> group_by(Orders.UserId) |> having(fn(g) => count(*) > 5) |> select(Orders.UserId, count(*) as cnt)">

// Order, limit, offset
type Pagination = LqlCommand<"Users |> order_by(Users.Name asc) |> limit(10) |> offset(20) |> select(*)">

// Arithmetic expressions
type Calculated = LqlCommand<"Products |> select(Products.Price * Products.Quantity as total)">
-

Compile-Time Error Example

-

Invalid LQL causes a build error with line/column position:

-
// This FAILS to compile with: "Invalid LQL syntax at line 1, column 15"
type BadQuery = LqlCommand<"Users |> selectt(*)">  // typo: 'selectt'
-

Executing Queries

-
open Microsoft.Data.Sqlite

let executeQuery() =
    use conn = new SqliteConnection("Data Source=mydb.db")
    conn.Open()

    // SQL is validated at compile time, safe to execute
    use cmd = new SqliteCommand(GetUsers.Sql, conn)
    use reader = cmd.ExecuteReader()
    // ... process results

Introduction

-

A functional pipeline-style DSL that transpiles to SQL. Write database logic once, run it anywhere.

-

The Problem

-

SQL dialects differ. PostgreSQL, SQLite, and SQL Server each have their own quirks. This creates problems:

-
- Migrations - Schema changes need different SQL for each database
- Business Logic - Triggers, stored procedures, and constraints vary by vendor
- Sync Logic - Offline-first apps need identical logic on client (SQLite) and server (Postgres)
- Testing - Running tests against SQLite while production uses Postgres

The Solution

-

LQL is a single query language that transpiles to any SQL dialect. Write once, deploy everywhere.

-
Users
|> filter(fn(row) => row.Age > 18 and row.Status = 'active')
|> join(Orders, on = Users.Id = Orders.UserId)
|> group_by(Users.Id, Users.Name)
|> select(Users.Name, sum(Orders.Total) as TotalSpent)
|> order_by(TotalSpent desc)
|> limit(10)
-
-

This transpiles to correct SQL for PostgreSQL, SQLite, or SQL Server.

-

Use Cases

-

Cross-Database Migrations

-

Define schema changes in LQL. Migration.CLI generates the right SQL for your target database.

-

Cross-Platform Business Logic with Triggers

-

Write triggers and constraints in LQL. Deploy the same logic to any database.

-

Offline-First Sync

-

Sync framework uses LQL for conflict resolution. Same logic runs on mobile (SQLite) and server (Postgres).

-

Integration Testing

-

Test against SQLite locally, deploy to Postgres in production. Same queries, same results.

-

Pipeline Operations

| Operation | Description |
|-----------|-------------|
| select(cols...) | Choose columns |
| filter(fn(row) => ...) | Filter rows |
| join(table, on = ...) | Join tables |
| left_join(table, on = ...) | Left join |
| group_by(cols...) | Group rows |
| having(fn(row) => ...) | Filter groups |
| order_by(col [asc/desc]) | Sort results |
| limit(n) / offset(n) | Pagination |
| distinct() | Unique rows |
| union(query) | Combine queries |

Installation

-

NuGet Packages

-

LQL provides dialect-specific packages for each target database:

-
<!-- SQLite -->
<PackageReference Include="Lql.SQLite" Version="*" />

<!-- PostgreSQL -->
<PackageReference Include="Lql.Postgres" Version="*" />

<!-- SQL Server -->
<PackageReference Include="Lql.SqlServer" Version="*" />
-

CLI Tool

-

Install the LQL CLI for command-line transpilation:

-
dotnet tool install -g LqlCli.SQLite
-

F# Type Provider

-

For compile-time validated LQL queries in F#:

-
<PackageReference Include="Lql.TypeProvider.FSharp" Version="*" />
-

VS Code Extension

-

Search for LQL in VS Code Extensions marketplace for:

-
- Syntax highlighting
- IntelliSense completions
- Real-time diagnostics
- Hover documentation
- Document formatting

Requirements

-
- .NET 9.0 or later
- One of the supported databases: SQLite, PostgreSQL, or SQL Server

Joins

-

LQL supports multiple join types for combining data from different tables.

-

Inner Join

-

Returns only rows that have matching values in both tables:

-
users
|> join(orders, on = users.id = orders.user_id)
|> select(users.name, orders.total, orders.status)
-
-

Left Join

-

Returns all rows from the left table, with matching rows from the right table (or NULL):

-
users
|> left_join(orders, on = users.id = orders.user_id)
|> select(users.name, orders.total)
-
-

Multiple Joins

-

Chain joins to combine more than two tables:

-
users
|> join(orders, on = users.id = orders.user_id)
|> join(products, on = orders.product_id = products.id)
|> select(users.name, products.name, orders.quantity)
-
-

Join with Filter

-

Combine joins with filtering:

-
users
|> join(orders, on = users.id = orders.user_id)
|> filter(fn(row) => row.orders.status = 'completed')
|> select(users.name, orders.total)
-
-

Join with Aggregation

-
users
|> join(orders, on = users.id = orders.user_id)
|> group_by(users.id, users.name)
|> select(
    users.name,
    count(*) as total_orders,
    sum(orders.total) as revenue
)
|> order_by(revenue desc)
-

Lambda Expressions

-

Lambda expressions are the functional core of LQL. They provide type-safe, composable predicates for filtering and transforming data.

-

Syntax

-
fn(parameter) => expression
-
-

The parameter represents a row of data. Access columns using parameter.table.column:

-
fn(row) => row.users.age > 18
-
-

In filter

-

The most common use is filtering rows:

-
users |> filter(fn(row) => row.users.active = true)
-
-

Compound Expressions

-

Combine conditions with and and or:

-
employees |> filter(fn(row) =>
    row.employees.salary > 50000 and
    row.employees.salary < 100000
)

users |> filter(fn(row) =>
    row.users.role = 'admin' or
    row.users.role = 'superadmin'
)
-
-

In having

-

Lambdas also work with having to filter groups:

-
orders
|> group_by(orders.user_id)
|> having(fn(group) => count(*) > 5)
|> select(orders.user_id, count(*) as order_count)
-
-

Comparison Operators

| Operator | Meaning |
|----------|---------|
| = | Equal |
| != | Not equal |
| > | Greater than |
| < | Less than |
| >= | Greater than or equal |
| <= | Less than or equal |
-

String Comparisons

-
users |> filter(fn(row) => row.users.name = 'Alice')
users |> filter(fn(row) => row.users.status != 'inactive')
-
-

Arithmetic in Lambdas

-
products |> filter(fn(row) =>
    row.products.price * row.products.quantity > 1000
)
-

Language Server

-

The LQL Language Server (lql-lsp) is a native Rust implementation that provides IDE features for .lql files. It communicates via the Language Server Protocol (LSP) over JSON-RPC on stdin/stdout.

-

Key capabilities: schema-aware completions via Database Configuration, intelligent suggestions via AI-Powered Completions, real-time diagnostics, hover documentation, and formatting.

-

Architecture

-
graph TD
    A[VS Code Extension] <-->|JSON-RPC / stdio| B[lql-lsp - Rust]
    B --> C[lql-parser - ANTLR]
    B --> D[lql-analyzer]
    B --> E[PostgreSQL Schema Cache]
    B --> F[AI Model - Ollama / Cloud]
-

Built with:

-
- tower-lsp - LSP protocol framework (JSON-RPC, message framing)
- antlr-rust - ANTLR4 grammar-based parser with error recovery
- tokio - Async runtime for concurrent schema fetching and AI calls
- tokio-postgres - PostgreSQL client for schema introspection
- reqwest - HTTP client for AI provider communication

LSP Capabilities

-

The server registers these capabilities on initialization:

| Capability | Description |
|------------|-------------|
| textDocumentSync: Full | Full document synced on every change |
| completionProvider | Triggered by `.`, `\|`, `>`, `(` |
| hoverProvider | Hover info for keywords, tables, columns |
| documentSymbolProvider | let bindings shown in outline |
| documentFormattingProvider | Full-document formatting |
-

Completion Engine

-

Completions are organized into priority layers. Lower numbers appear first:

| Priority | Category | Count | Requires DB |
|----------|----------|-------|-------------|
| 0 | Column completions (table.col) | Per-table | Yes |
| 1 | Pipeline operations | 14 | No |
| 2 | Functions (aggregate, string, math, date) | 40+ | No |
| 3 | Keywords | 30+ | No |
| 4 | Table names | Per-schema | Yes |
| 5 | Variable bindings (let names) | Per-document | No |
| 6 | AI completions | Variable | Optional |
-

Context Detection

-

The completion engine detects context to filter suggestions:

-
- After |> - Shows pipeline operations
- After table. - Shows columns for that table
- In argument list - Shows functions, columns, keywords
- In lambda body - Shows row field access patterns
- Word prefix - Filters all completions by typed prefix
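The context rules above can be approximated with a few pattern checks (an illustrative Python sketch; the real detection in the Rust analyzer is more involved):

```python
import re

def completion_context(line_prefix: str) -> str:
    """Classify the cursor context from the text before the cursor."""
    if re.search(r"\|>\s*$", line_prefix):
        return "pipeline-operation"   # after |>  -> suggest operations
    if re.search(r"\w\.$", line_prefix):
        return "column"               # after table. -> suggest its columns
    if line_prefix.rstrip().endswith("("):
        return "argument"             # in an argument list
    return "general"                  # fall back to prefix-filtered results
```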

Trigger Characters

-

Completions auto-trigger on `.` (column access), `|` and `>` (pipe), `(` (function arguments), and the space after a pipe.

-

AI Completion Pipeline

-

When an AI provider is configured, the LSP merges AI-generated completions with schema and keyword results on every request:

-
1. Schema + keyword completions are computed synchronously (instant)
2. AI completions are requested asynchronously via HTTP (e.g., Ollama /api/generate)
3. A configurable timeout (default 2000ms) wraps the AI call
4. If AI responds in time, results are appended at priority 6
5. If AI times out, only schema + keyword results are returned - no latency penalty

The AI model receives full context: document text, cursor position, line prefix, word prefix, and a compact schema description (e.g., users(id uuid PK NOT NULL, name text, email text)). This enables schema-aware suggestions even from small local models.

-

See AI-Powered Completions for setup instructions and provider configuration.

-

Diagnostics

-

Four categories of diagnostics, published on every document change:

-

Parse Errors (ERROR)

-

From the ANTLR parser. Syntax errors with exact line/column positions:

-
-- Error: mismatched input 'selectt' expecting...
users |> selectt(users.id)
         ^^^^^^^^
-
-

Bracket Validation (ERROR)

-

Document-level parenthesis matching:

-
-- Error: unclosed parenthesis
users |> select(users.id
                       ^
-
-
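Document-level matching boils down to a stack scan, sketched here in Python (a simplification; string literals and comments are ignored):

```python
def unclosed_paren_offsets(text: str) -> list[int]:
    """Return offsets of '(' characters that are never closed."""
    open_stack = []
    for i, ch in enumerate(text):
        if ch == "(":
            open_stack.append(i)   # remember where each paren opened
        elif ch == ")" and open_stack:
            open_stack.pop()       # matched: discard the most recent open
    return open_stack              # whatever remains was never closed
```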

Pipe Spacing (WARNING)

-

The |> operator should be surrounded by spaces:

-
-- Warning: pipe operator should be surrounded by spaces
users|>select(*)
     ^^
-
-

Unknown Functions (INFO)

-

Functions not in the 82-entry known function list:

-
-- Info: unknown function 'foobar'
users |> foobar(users.id)
         ^^^^^^
-
-

Hover Information

-

The hover database contains 50+ entries covering all LQL constructs.

-

Keyword Hover

-

Hovering over select, filter, join, etc. shows descriptions with usage patterns.

-

Schema-Aware Hover

-

With a database connection:

-
- Table name hover - Shows all columns with types, PK, and nullable indicators
- Qualified column hover (users.email) - Shows column type, nullability, primary key status

Document Symbols

-

Extracts let bindings as SymbolKind::Variable for the VS Code outline and breadcrumb views:

-
let active_users = users |> filter(...)    -> Symbol: active_users
let orders_2024 = orders |> filter(...)    -> Symbol: orders_2024
-
-

Formatting

-

The formatter applies consistent indentation rules:

-
- Pipeline continuations (|>) get 4-space indent
- Nested parentheses increase indent level
- Closing ) decreases indent level
- Lines are trimmed of trailing whitespace
- Comments and blank lines are preserved

Before:

-
users
|> filter(fn(row) => row.users.active)
|> select(
users.id,
users.name
)
-
-

After:

-
users
    |> filter(fn(row) => row.users.active)
    |> select(
        users.id,
        users.name
    )
-
-

Initialization Options

-

The server accepts configuration via initializationOptions during the LSP initialize handshake:

-
{
  "connectionString": "host=localhost dbname=myapp user=postgres",
  "aiProvider": {
    "provider": "ollama",
    "endpoint": "http://localhost:11434",
    "model": "qwen2.5-coder:1.5b",
    "apiKey": "",
    "timeoutMs": 2000,
    "enabled": true
  }
}
-

See Database Configuration and VS Code Extension - AI Configuration for details.

-

Crate Structure

| Crate | Purpose |
|-------|---------|
| lql-parser | ANTLR grammar, lexer, parser, parse tree, error recovery |
| lql-analyzer | Completions, diagnostics, hover database, symbols, schema cache |
| lql-lsp | LSP server binary, tower-lsp integration, AI providers, DB client |
-

Building from Source

-
cd Lql/lql-lsp-rust
cargo build --release
-

The binary is at target/release/lql-lsp.

-

Running Tests

-
cargo test --workspace
-

With Coverage

-
./test-coverage.sh
-

Individual crate coverage:

-
cargo tarpaulin --packages lql-parser --engine llvm --exclude-files "*/generated/*"
cargo tarpaulin --packages lql-analyzer --engine llvm
cargo tarpaulin --packages lql-lsp --engine llvm

Let Bindings

-

Let bindings allow you to name intermediate query results and reuse them, making complex queries more readable and composable.

-

Basic Syntax

-
let name = expression
-
-

Simple Example

-
let active_users = users |> filter(fn(row) => row.users.status = 'active')

active_users |> select(active_users.name, active_users.email)
-
-

Building Complex Queries

-

Let bindings shine when building multi-step analytics:

-
-- Step 1: Join and filter
let joined =
    users
    |> join(orders, on = users.id = orders.user_id)
    |> filter(fn(row) => row.orders.status = 'completed')

-- Step 2: Aggregate
joined
|> group_by(users.id)
|> select(
    users.name,
    count(*) as total_orders,
    sum(orders.total) as revenue,
    avg(orders.total) as avg_order_value
)
|> filter(fn(row) => row.revenue > 1000)
|> order_by(revenue desc)
|> limit(10)
-
-

Reusability

-

Define a filtered dataset once and use it in multiple contexts:

-
let engineering = employees
    |> filter(fn(row) => row.employees.department = 'Engineering')

-- Use for different analyses
engineering |> select(engineering.name, engineering.salary)
engineering |> group_by(engineering.level) |> select(engineering.level, avg(engineering.salary) as avg_salary)
-

Pipeline Operators

-

Pipeline operators are the core of LQL. Each operation takes the result of the previous step and transforms it.

-

select

-

Choose which columns to include in the output:

-
users |> select(users.id, users.name, users.email)
-
-

Select all columns:

-
users |> select(*)
-
-

With computed columns:

-
products |> select(
    products.name,
    products.price * products.quantity as total_value,
    round(products.price / 2, 2) as half_price
)
-
-

filter

-

Filter rows using lambda expressions:

-
users |> filter(fn(row) => row.users.age > 18)
-
-

Combine conditions with and / or:

-
employees |> filter(fn(row) =>
    row.employees.salary > 50000 and
    row.employees.department = 'Engineering'
)
-
-

join

-

Inner join two tables:

-
users |> join(orders, on = users.id = orders.user_id)
-
-

left_join

-

Left join preserving all rows from the left table:

-
users |> left_join(orders, on = users.id = orders.user_id)
-
-

group_by

-

Group rows by one or more columns:

-
orders |> group_by(orders.status)
-
-

Multiple grouping columns:

-
orders |> group_by(orders.user_id, orders.status)
-
-

having

-

Filter groups after aggregation:

-
orders
|> group_by(orders.user_id)
|> having(fn(group) => count(*) > 5)
|> select(orders.user_id, count(*) as order_count)
-
-

order_by

-

Sort results ascending or descending:

-
users |> order_by(users.name asc)
users |> order_by(users.created_at desc)
-
-

limit / offset

-

Pagination:

-
users |> limit(10)
users |> limit(10) |> offset(20)
-
-

distinct

-

Remove duplicate rows:

-
orders |> select(orders.status) |> distinct()
-
-

union

-

Combine results from two queries:

-
active_users |> union(inactive_users)
-
-

Chaining Operations

-

The real power comes from chaining multiple operations:

-
users
|> join(orders, on = users.id = orders.user_id)
|> filter(fn(row) => row.orders.status = 'completed')
|> group_by(users.id, users.name)
|> select(
    users.name,
    count(*) as total_orders,
    sum(orders.total) as revenue
)
|> having(fn(group) => count(*) > 2)
|> order_by(revenue desc)
|> limit(10)
-

Quick Start

-

Install

-

NuGet Packages

-
<PackageReference Include="Lql.SQLite" Version="*" />
<PackageReference Include="Lql.Postgres" Version="*" />
<PackageReference Include="Lql.SqlServer" Version="*" />
-

CLI Tool

-
dotnet tool install -g LqlCli.SQLite
-

Your First Query

-

Write your first LQL query:

-
users |> select(users.id, users.name, users.email)
-
-

This transpiles to:

-
SELECT users.id, users.name, users.email FROM users
-

Programmatic Usage

-
using Lql;
using Lql.SQLite;

var lql = "Users |> filter(fn(row) => row.Age > 21) |> select(Name, Email)";
var sql = LqlCodeParser.Parse(lql).ToSql(new SQLiteContext());
-

CLI Usage

-
lql --input query.lql --output query.sql
-

Next Steps


SQL Dialects

-

LQL is database platform independent. The same query transpiles to correct SQL for each target database.

-

Supported Dialects

| Dialect | Package | Status |
|---------|---------|--------|
| PostgreSQL | Lql.Postgres | Full support |
| SQL Server | Lql.SqlServer | Full support |
| SQLite | Lql.SQLite | Full support |
-

Example

-

This LQL query:

-
users
|> filter(fn(row) => row.users.age > 18)
|> select(users.name, users.email)
|> order_by(users.name asc)
|> limit(10)
-
-

PostgreSQL Output

-
SELECT users.name, users.email
FROM users
WHERE users.age > 18
ORDER BY users.name ASC
LIMIT 10
-

SQL Server Output

-
SELECT TOP 10 users.name, users.email
FROM users
WHERE users.age > 18
ORDER BY users.name ASC
-

SQLite Output

-
SELECT users.name, users.email
FROM users
WHERE users.age > 18
ORDER BY users.name ASC
LIMIT 10
-

Dialect Differences Handled by LQL

-

LQL abstracts away common dialect differences:

-
  • LIMIT/TOP - PostgreSQL and SQLite use LIMIT, SQL Server uses TOP
  • String concatenation - || vs +
  • Boolean literals - TRUE/FALSE vs 1/0
  • ILIKE - PostgreSQL-specific case-insensitive LIKE
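
As a sketch of how this plays out (the columns and the per-dialect rendering below are illustrative assumptions, not output copied from the transpiler), a query using the built-in concat function stays identical across targets:

```lql
-- Illustrative: first_name/last_name are hypothetical columns
users |> select(concat(users.first_name, users.last_name) as full_name)
```

The dialect package can then emit ||-based concatenation for PostgreSQL and SQLite, and +-based concatenation for SQL Server, per the list above.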

Programmatic Dialect Selection

-
using Lql;
-using Lql.Postgres;
-using Lql.SqlServer;
-using Lql.SQLite;
-
-var lql = "Users |> select(Users.Name)";
-var statement = LqlStatementConverter.ToStatement(lql);
-
-// Generate for each dialect
-var postgres = statement.ToPostgreSql();
-var sqlServer = statement.ToSqlServer();
-var sqlite = statement.ToSQLite();
diff --git a/Lql/LqlWebsite-Eleventy/_site/docs/syntax/index.html b/Lql/LqlWebsite-Eleventy/_site/docs/syntax/index.html
deleted file mode 100644
index 3f0e3216..00000000

Syntax Overview

-

LQL uses a functional pipeline syntax where data flows through a series of transformations using the pipeline operator |>.

-

Basic Structure

-

Every LQL query starts with a table reference and pipes data through operations:

-
table_name |> operation1(...) |> operation2(...)
-
-

Table References

-

Simply name the table to start a query:

-
users
-employees
-orders
-
-

Pipeline Operator

-

The |> operator passes the result of the left side to the right side:

-
users |> select(users.id, users.name)
-
-

Column References

-

Columns are referenced with the table.column syntax:

-
users.id
-users.name
-orders.total
-
-

Lambda Expressions

-

Lambdas use the fn(param) => expression syntax:

-
filter(fn(row) => row.users.age > 18)
-filter(fn(row) => row.users.status = 'active' and row.users.age > 21)
-
-

Let Bindings

-

Store intermediate results with let:

-
let active_users = users |> filter(fn(row) => row.users.status = 'active')
-
-active_users |> select(active_users.name, active_users.email)
-
-

Aliases

-

Use as to rename columns in output:

-
users |> select(
-    users.name,
-    users.salary * 12 as annual_salary
-)
-
-

Comments

-

Single-line comments start with --:

-
-- Get all active users
-users |> filter(fn(row) => row.users.active)
-
-

Operators

-

Comparison

-

=, >, <, >=, <=, !=

-

Logical

-

and, or

-

Arithmetic

-

+, -, *, /

-

Sorting

-

asc, desc

diff --git a/Lql/LqlWebsite-Eleventy/_site/docs/vscode/index.html b/Lql/LqlWebsite-Eleventy/_site/docs/vscode/index.html
deleted file mode 100644
index 70ceea22..00000000

VS Code Extension

-

The LQL VS Code extension provides a rich editing experience for .lql files, powered by a native Rust Language Server.

-

Installation

-

Search for LQL in the VS Code Extensions marketplace and click Install. The extension automatically downloads the correct LSP binary for your platform on first activation.

-

Supported platforms:

-
  • Linux x64
  • macOS x64 (Intel)
  • macOS ARM64 (Apple Silicon)
  • Windows x64

Features

-

Syntax Highlighting

-

Full TextMate grammar with semantic colorization for all LQL constructs:

Token                 Color         Example
Keywords              Orange-red    let, fn, as, asc, desc
Pipeline operator     Forest green  |>
Lambda operator       Violet        =>
Query functions       Forest green  select, filter, join
Aggregate functions   Violet        count, sum, avg
String literals       Lime green    'completed'
Comments              Dark slate    -- comment
Table/column names    White/green   users.id
-

The extension includes a dedicated LQL Dark color theme optimized for LQL syntax.

-

IntelliSense Completions

-

Context-aware completions triggered automatically as you type. Completions are organized by priority:

-

1. Column completions - Type table. to see all columns from that table (requires database connection):

-
  • Shows column type, primary key indicator, and nullability
  • Example: users. suggests id (uuid PK NOT NULL), name (text), email (text)

2. Pipeline operations - Suggested after |>:

-
  • select, filter, join, left_join, right_join, cross_join
  • group_by, order_by, having, limit, offset
  • union, union_all, insert, distinct

3. Functions - Suggested in expression contexts:

-
  • Aggregate: count, sum, avg, min, max, first, last, row_number, rank
  • String: concat, substring, length, trim, upper, lower, replace
  • Math: round, floor, ceil, abs, sqrt, power, mod
  • Date/Time: now, today, year, month, day, extract, date_trunc
  • Conditional: coalesce, nullif, isnull, isnotnull

4. Keywords - let, fn, as, and, or, not, distinct, null, case, when, etc.

-

5. Table names - From your database schema, showing column count and first 5 columns

-

6. Variable bindings - let bindings from the current document

-

7. AI completions - Optional intelligent suggestions from an AI model (see AI Configuration)

-

Snippets

-

23 built-in snippets with tab stops for fast query authoring:

Prefix          Description
select          Basic select all
selectc         Select specific columns
selectf         Select with filter
filterand       Filter with AND
filteror        Filter with OR
join            Inner join
leftjoin        Left join
groupby         Group by with count
groupbyhaving   Group by with having
orderby         Order by ascending
limit           Limit results
limitoffset     Pagination (limit + offset)
distinct        Select distinct
union           Union queries
let             Let binding
case            Case expression
fn              Lambda function
pipeline        Full pipeline example
-

Real-Time Diagnostics

-

Errors and warnings appear as you type with squiggly underlines:

-

Errors (red):

-
  • ANTLR parse errors with line/column position
  • Unmatched closing parenthesis
  • Unclosed parenthesis at end of file

Warnings (yellow):

-
  • Pipe operator |> not surrounded by spaces

Information (blue):

-
  • Unknown function names (not in the built-in function list)

Hover Documentation

-

Hover over any LQL keyword, function, or operator to see:

-
  • Description and usage
  • Syntax signature

With a database connection, hover also shows:

-
  • Table hover: All columns with types, PK/nullable indicators
  • Column hover (e.g., users.email): Column type, nullability, primary key status

Document Symbols

-

The outline view shows all let bindings in your file, enabling quick navigation with Ctrl+Shift+O.

-

Document Formatting

-

Format your entire document with Shift+Alt+F or right-click and select Format LQL Document:

-
  • Consistent 4-space indentation for pipeline continuations
  • Proper indentation inside parentheses
  • Trimmed whitespace
  • Preserved comments and blank lines

Commands

Command                  Description
Format LQL Document      Format the current .lql file
Validate LQL Document    Trigger validation diagnostics
Show Compiled SQL        Show the transpiled SQL output
-

Commands are available in the command palette (Ctrl+Shift+P) and the editor context menu when editing .lql files.

-

Extension Settings

Setting                       Type     Default  Description
lql.languageServer.enabled    boolean  true     Enable/disable the language server
lql.languageServer.trace      enum     off      LSP trace level: off, messages, verbose
lql.validation.enabled        boolean  true     Enable/disable real-time validation
lql.formatting.enabled        boolean  true     Enable/disable document formatting
-

AI Configuration

-

The extension supports optional AI-powered completions via local or remote models. AI completions are merged with schema and keyword completions - they supplement, never replace.

  1. Install Ollama
  2. Pull a code model:

     ollama pull qwen2.5-coder:1.5b

  3. Add to your VS Code settings.json:

     {
       "lql.aiProvider": {
         "provider": "ollama",
         "endpoint": "http://localhost:11434",
         "model": "qwen2.5-coder:1.5b",
         "enabled": true
       }
     }
Model                 Size  Speed   Quality
qwen2.5-coder:1.5b    1.5B  Fast    Good
deepseek-coder:1.3b   1.3B  Fast    Good
codellama:7b          7B    Slower  Better
-

AI Provider Settings

-
{
-  "lql.aiProvider": {
-    "provider": "ollama",
-    "endpoint": "http://localhost:11434",
-    "model": "qwen2.5-coder:1.5b",
-    "apiKey": "",
-    "timeoutMs": 2000,
-    "enabled": true
-  }
-}
Field      Description
provider   Provider type: ollama, openai, anthropic, custom
endpoint   API endpoint URL
model      Model identifier (optional, provider-specific)
apiKey     API key (optional, for cloud providers)
timeoutMs  Timeout in milliseconds (default: 2000)
enabled    Enable/disable AI completions
-

What the AI Sees

-

The AI model receives full context for accurate suggestions:

-
  • The complete document text
  • Cursor position (line and column)
  • Current line prefix and word prefix
  • File URI
  • Database schema (table names, column names, types)

Responses that exceed the timeout are silently dropped - you always get fast keyword and schema completions regardless of AI latency.

-

Language Features

-

Comment Support

-
  • Line comments: -- comment
  • Block comments: /* comment */

Bracket Matching

-

Auto-closing and matching for (), [], {}, '', ""

-

Folding

-

Region-based folding with -- #region and -- #endregion markers.

-

LSP Binary

-

The extension bundles a native Rust language server (lql-lsp). On first activation, it searches for the binary in this order:

-
  1. Bundled bin/lql-lsp in the extension directory
  2. Local development build (target/release/lql-lsp or target/debug/lql-lsp)
  3. Previously cached binary in VS Code global storage
  4. Downloads from GitHub Releases matching the extension version
  5. Falls back to lql-lsp on your system PATH
diff --git a/Lql/LqlWebsite-Eleventy/_site/index.html b/Lql/LqlWebsite-Eleventy/_site/index.html
index e50a0c7a..6d67b022 100644
@@ -1,400 +1,468 @@
(head rewritten: page title changes from "Lambda Query Language - Functional Data Querying" to "Lambda Query Language (LQL) - Functional Data Querying")

Lambda Query Language

-

- Functional programming meets data querying. Write elegant, composable queries with the power of lambda expressions and pipeline operators. -

- - -
-
-
-
-
-
- complex_analytics.lql -
-
-- Join users + orders, filter only completed orders -let joined = +
+
+

Lambda Query Language

+

A functional pipeline language for database queries, migration logic, RLS predicates, and portable SQL generation.

+ + + +
+
+
+
+
+ tenant_orders.lql +
+
-- Join, filter, aggregate, and emit dialect-specific SQL +let completed_orders = users |> join(orders, on = users.id = orders.user_id) |> filter(fn(row) => row.orders.status = 'completed') --- Aggregate and analyze -joined -|> group_by(users.id) +completed_orders +|> group_by(users.id, users.name) |> select( users.name, - count(*) as total_orders, + count(*) as order_count, sum(orders.total) as revenue -)
-
+) +|> order_by(revenue desc)
+
+
-
-
-
-
-

Why LQL?

-

Functional programming principles applied to data querying for cleaner, more maintainable code.

-
+
+
+
+

What LQL Is For

+

Use one expression language for query pipelines, database-portable SQL, and migration-time predicates.

+
-
-
-
λ
-

Functional First

-

Built on pure functional programming principles. Immutable data transformations, lambda expressions, and composable operations make your queries predictable and testable.

-
- -
-
|>
-

Pipeline Operators

-

Chain operations naturally with pipeline operators. Data flows from left to right, making complex transformations easy to read and understand.

-
- -
-
TS
-

Type Safe

-

Strong typing ensures your queries are correct at compile time. No more runtime surprises from typos or schema mismatches.

-
- -
-
fn
-

Composable

-

Build complex queries from simple, reusable components. Define once, use everywhere with let bindings and function composition.

-
- -
-
SQL
-

SQL Compatible

-

Compiles to optimized SQL for your target database. Get the performance of SQL with the elegance of functional programming.

-
- -
-
DX
-

Developer Focused

-

Designed by developers, for developers. Excellent tooling support with VS Code extension, LSP, and clear error messages.

-
+
+
+
|>
+

Pipeline Queries

+

Start from a table or binding and pass rows through select, filter, join, grouping, ordering, pagination, and set operations.

+
+
+
SQL
+

Dialect Output

+

The same LQL statement emits PostgreSQL, SQL Server, or SQLite SQL through the dialect packages.

+
+
+
RLS
+

Migrations

+

Migration YAML can use LQL for RLS policies and PostgreSQL function bodies instead of embedding raw platform SQL.

+
+
+
LSP
+

Editor Tooling

+

The Rust language server powers diagnostics, formatting, schema-aware completions, hovers, and optional AI suggestions.

+
+
+
F#
+

Type Provider

+

F# projects can validate LQL at compile time and expose generated SQL through the type provider.

+
+
+
DX
+

Transpiler Demo

+

The Blazor page is kept as a focused playground for running client-side transpilation only.

+
+
-
-
-
-

See LQL in Action

-

Real examples showing the power and elegance of functional data querying.

-
- -
-
-
-

Simple Selection

-

Clean, readable syntax for basic data selection. No verbose SELECT statements or complex syntax.

-
  • Pipeline operator for natural flow
  • Clear column specification
  • Type-safe field access
+
+
+

Examples

+

The core language is small: compose table pipelines, lambda predicates, and expression columns.

-
-
users |> select( + +
+
+
+

Simple Selection

+

Select explicit columns from a source table. Fully qualify columns when that keeps intent clear.

+
  • Table-first query shape
  • Column projection with `select`
  • Portable output SQL
+
+
+
users |> select( users.id, users.name, users.email )
-
-
- -
-
-

Advanced Filtering

-

Lambda expressions provide powerful, type-safe filtering with full access to row data.

-
  • Lambda function syntax
  • Logical operators (and, or)
  • Range filtering
-
-
-
employees -|> select( - employees.id, - employees.name, - employees.salary -) -|> filter(fn(row) => - row.employees.salary > 50000 and - row.employees.salary < 100000 -)
-
-
- -
-
-

Arithmetic & Functions

-

Rich expression support with mathematical operations and built-in functions for complex calculations.

-
  • Mathematical expressions
  • Column aliases with 'as'
  • Function composition
-
-
-
products -|> select( - products.name, - products.price * products.quantity as total_value, - products.price + 10 as price_plus_ten, - round(products.price / 2, 2) as half_price -) +
+
+ +
+
+

Lambda Filtering

+

Use `fn(row) => ...` predicates for filtering. The row parameter exposes fields through table-qualified paths.

+
  • Comparison operators: `=`, `!=`, `>`, `<`, `>=`, `<=`
  • Logical operators: `and`, `or`, `not`
  • Arithmetic and function calls inside predicates
+
+
+
employees |> filter(fn(row) => - row.products.price > 0 -)
-
-
- -
-
-

Aggregation & Grouping

-

Powerful aggregation functions with group by operations and having clauses for complex analytics.

-
  • Group by multiple columns
  • Aggregate functions (count, sum, avg)
  • Having clause with lambda filters
-
-
-
orders -|> group_by(orders.user_id, orders.status) -|> select( - orders.user_id, - orders.status, - count(*) as order_count, - sum(orders.total) as total_amount, - avg(orders.total) as avg_amount + row.employees.salary > 50000 and + row.employees.department = 'Engineering' ) -|> having(fn(group) => count(*) > 2) -|> order_by(total_amount desc)
+|> select(employees.id, employees.name)
+
+
+ +
+
+

Joins And Aggregation

+

Join tables, group rows, compute aggregate columns, filter groups, and order the result.

+
  • `join`, `left_join`, `right_join`, `full_join`, `cross_join`
  • `count`, `sum`, `avg`, `min`, `max`
  • `having` filters grouped rows
+
+
+
users +|> join(orders, on = users.id = orders.user_id) +|> group_by(users.id, users.name) +|> having(fn(group) => count(*) > 2) +|> select(users.name, sum(orders.total) as revenue)
+
+
-
-
-
-
-
-

F# Type Provider

-

Use LQL with full type safety in F# through our native type provider.

-
- -
-
-

Type-Safe LQL in F#

-

The LQL Type Provider brings compile-time type checking to your LQL queries in F#. Write queries with IntelliSense support, catch errors before runtime, and enjoy seamless integration with your F# codebase.

-
  • Compile-time query validation
  • Full IntelliSense support for table and column names
  • Automatic SQL generation for PostgreSQL and SQL Server
  • Strongly-typed result sets
-
-
-
-
-
-
- Program.fs +
+
+
+

LQL Documentation

+

Everything needed to install, write, transpile, use in migrations, and wire into editor tooling.

-
open Lql - -// Define types with validated LQL -type GetUsers = - LqlCommand<"Users |> select(*)"> -// Access generated SQL -let sql = GetUsers.Sql -let query = GetUsers.Query
-
+
+
+

Install

+

Add the core parser plus the dialect package that will emit SQL for your target database.

+
dotnet add package Nimblesite.Lql.Core
+dotnet add package Nimblesite.Lql.Postgres
+dotnet add package Nimblesite.Lql.SqlServer
+dotnet add package Nimblesite.Lql.SQLite
Package                   Use
Nimblesite.Lql.Core       Parser, AST, statement conversion.
Nimblesite.Lql.Postgres   ToPostgreSql() extension.
Nimblesite.Lql.SqlServer  ToSqlServer() extension.
Nimblesite.Lql.SQLite     ToSQLite() extension.
+
+ +
+

Programmatic API

+

Parse once, then call the dialect extension. Errors are returned as `Result<T,E>` values.

+
using Nimblesite.Lql.Core;
+using Nimblesite.Lql.Postgres;
+using Nimblesite.Sql.Model;
+using Outcome;
+
+var lql = "Users |> filter(fn(row) => row.Users.Age > 21) |> select(Users.Name)";
+var statementResult = LqlStatementConverter.ToStatement(lql);
+
+if (statementResult is Result<LqlStatement, SqlError>.Ok<LqlStatement, SqlError> ok)
+{
+    Result<string, SqlError> sql = ok.Value.ToPostgreSql();
+}
+
+ +
+

Syntax

+

A query starts with a table or binding and flows left-to-right through operations.

Construct    Shape
Table        users
Pipeline     users |> select(users.id)
Column       orders.total
Lambda       fn(row) => row.users.active = true
Alias        users.name as display_name
Let binding  let active = users |> filter(...)
+
let active_users =
+    users |> filter(fn(row) => row.users.status = 'active')
+
+active_users
+|> select(active_users.id, active_users.name)
+|> order_by(active_users.name asc)
+|> limit(50)
+
+ +
+

Pipeline Operations

Operation                        Purpose
select(cols...)                  Project columns or computed expressions.
filter(fn(row) => ...)           Filter rows before grouping.
join(table, on = ...)            Inner join to another source.
left_join(table, on = ...)       Preserve rows from the left source.
group_by(cols...)                Group result rows.
having(fn(group) => ...)         Filter grouped rows.
order_by(col asc|desc)           Sort the result.
limit(n) / offset(n)             Page the result.
union(query) / union_all(query)  Combine compatible result sets.
insert(target)                   Insert query output into another table.
+
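
A sketch combining two of these operations, assuming union_all takes a full sub-pipeline as its argument per the shape above (archived_users is a hypothetical table with a compatible column list):

```lql
users
|> select(users.id, users.name)
|> union_all(archived_users |> select(archived_users.id, archived_users.name))
```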
+ +
+

Expressions And Functions

+

Expressions can compare values, combine predicates, compute columns, and call mapped SQL functions.

Group       Examples
Comparison  =, !=, >, <, >=, <=
Logical     and, or, not
Math        +, -, *, /, round, abs
Aggregate   count, sum, avg, min, max
Window      row_number, rank, dense_rank, lag, lead
Session     current_setting('app.tenant_id')::uuid for PostgreSQL migration/RLS shapes.
+
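
A sketch mixing the comparison and math groups above (the orders table and its columns are illustrative):

```lql
orders
|> filter(fn(row) => row.orders.total > 0)
|> select(
    orders.id,
    round(orders.total / 3, 2) as installment
)
```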
+ +
+

SQL Dialects

+

LQL keeps query intent stable while the dialect package handles SQL differences such as `LIMIT`, `TOP`, booleans, and function names.

+
users
+|> filter(fn(row) => row.users.age > 18)
+|> select(users.name, users.email)
+|> order_by(users.name asc)
+|> limit(10)
Dialect     Generated paging shape
PostgreSQL  ORDER BY users.name ASC LIMIT 10
SQLite      ORDER BY users.name ASC LIMIT 10
SQL Server  SELECT TOP 10 ... ORDER BY users.name ASC
+
+ +
+

Migrations, RLS, And `bodyLql`

+

DataProvider migrations use LQL where schema logic must be portable or generated consistently. Use `usingLql` and `withCheckLql` for RLS policies, and `bodyLql` for PostgreSQL function bodies that should be transpiled instead of handwritten.

+
tables:
+  - name: documents
+    columns:
+      - name: id
+        type: uuid
+        primaryKey: true
+      - name: tenant_id
+        type: uuid
+      - name: owner_id
+        type: uuid
+    rls:
+      enabled: true
+      policies:
+        - name: tenant_documents
+          command: all
+          usingLql: tenant_id = current_setting('app.tenant_id')::uuid
+          withCheckLql: owner_id = current_setting('app.user_id')::uuid
+
+functions:
+  - name: current_tenant_id
+    schema: app
+    returns: uuid
+    language: sql
+    bodyLql: current_setting('app.tenant_id')::uuid
+

For PostgreSQL, `current_setting('key')` is rewritten to the null-tolerant `current_setting('key', true)` form before SQL is emitted. `body` and `bodyLql` are mutually exclusive.

+
+ +
+

VS Code, LSP, Database Schema, And AI

+

The VS Code extension runs the Rust `lql-lsp` language server for `.lql` files.

Feature          Behavior
Diagnostics      ANTLR parse errors, unmatched parentheses, pipe spacing warnings, and unknown function hints.
Completions      Pipeline operations, keywords, functions, let bindings, table names, and columns.
Database config  Set lql.database.connectionString, LQL_CONNECTION_STRING, or DATABASE_URL for schema-aware completions and hovers.
AI completions   Optional model completions are merged after schema and keyword results and bounded by timeout.
+
{
+  "lql.database.connectionString": "Host=localhost;Database=app;Username=postgres;Password=secret",
+  "lql.ai.provider": "ollama",
+  "lql.ai.endpoint": "http://localhost:11434/api/generate",
+  "lql.ai.model": "qwen2.5-coder:1.5b"
+}
+
+ +
+

F# Type Provider

+

The type provider validates LQL at compile time and exposes the original query plus generated SQLite SQL.

+
<PackageReference Include="Nimblesite.Lql.TypeProvider.FSharp" Version="*" />
+
open Nimblesite.Lql.Core.TypeProvider
+
+type ActiveUsers =
+    LqlCommand<"Users |> filter(fn(row) => row.Users.Status = 'active') |> select(*)">
+
+let query = ActiveUsers.Query
+let sql = ActiveUsers.Sql
+
+
-
-
-
-
-

Get Started in Minutes

-
- -
-
# Install the LQL NuGet package
-dotnet add package Lql
-
-# Write your first LQL query
-users |> select(users.id, users.name, users.email)
-
-# Transpiles to SQL:
-# SELECT users.id, users.name, users.email FROM users
-
- -
- Try the Playground +
+
+
+

Run The Transpiler

+

The Blazor app is now scoped to the interactive transpilation demo. Use it to paste LQL and inspect PostgreSQL or SQL Server output.

+
+
diff --git a/Lql/LqlWebsite-Eleventy/_site/playground/index.html b/Lql/LqlWebsite-Eleventy/_site/playground/index.html
index 967cecd2..90b51b4b 100644
@@ -1,198 +1,26 @@
(head rewritten: page title changes from "LQL Playground - Interactive Transpiler" to "LQL Transpiler Playground")
- -
- -
- -
-
-
-

LQL Playground

-

Try Lambda Query Language and see how it transpiles to PostgreSQL or SQL Server

-
- -
-
-
-

LQL Input

-
- - -
- -
- -
-

PostgreSQL Output

-
Enter LQL code and click 'Convert to SQL' to see the result.
- -
-
- -
-

Example Queries

-

Click any example to load it into the editor:

-
- - - - - -
-
-
-
-
diff --git a/Lql/LqlWebsite-Eleventy/eleventy.config.js b/Lql/LqlWebsite-Eleventy/eleventy.config.js
index a91c5b10..4ef89eb1 100644
@@ -28,10 +28,6 @@ export default function(eleventyConfig) {
   eleventyConfig.addPassthroughCopy("src/robots.txt");
   eleventyConfig.addWatchTarget("src/assets/");
 
-  eleventyConfig.addCollection("docs", function(collectionApi) {
-    return collectionApi.getFilteredByGlob("src/docs/**/*.md");
-  });
-
   eleventyConfig.addFilter("dateFormat", (dateObj) => {
     return new Date(dateObj).toLocaleDateString('en-US', {
       year: 'numeric', month: 'long', day: 'numeric'

diff --git a/Lql/LqlWebsite-Eleventy/package.json b/Lql/LqlWebsite-Eleventy/package.json
index 68f6775a..2f7b55d0 100644
@@ -5,7 +5,7 @@
   "type": "module",
   "scripts": {
     "dev": "eleventy --serve",
-    "build": "eleventy"
+    "build": "rm -rf _site && eleventy"
   },
   "devDependencies": {
     "@11ty/eleventy": "^3.1.2",

diff --git a/Lql/LqlWebsite-Eleventy/src/_data/navigation.json b/Lql/LqlWebsite-Eleventy/src/_data/navigation.json
index c753526a..a860d3d1 100644
@@ -1,71 +1,18 @@
 {
   "main": [
-    { "text": "Docs", "url": "/docs/" },
+    { "text": "Docs", "url": "/#docs" },
+    { "text": "Syntax", "url": "/#syntax" },
+    { "text": "Migrations", "url": "/#migrations" },
+    { "text": "Tooling", "url": "/#tooling" },
     { "text": "Playground", "url": "/playground/" },
     { "text": "GitHub", "url": "https://github.com/Nimblesite/DataProvider", "external": true }
   ],
-  "docs": [
-    { "title": "Getting Started", "items": [
-      { "text": "Introduction", "url": "/docs/" },
-      { "text": "Quick Start", "url": "/docs/quick-start/" },
-      { "text": "Installation", "url": "/docs/installation/" } ] },
-    { "title": "Language", "items": [
-      { "text": "Syntax Overview", "url": "/docs/syntax/" },
-      { "text": "Pipeline Operators", "url": "/docs/pipelines/" },
-      { "text": "Lambda Expressions", "url": "/docs/lambdas/" },
-      { "text": "Joins", "url": "/docs/joins/" },
-      { "text": "Aggregation", "url": "/docs/aggregation/" },
-      { "text": "Let Bindings", "url": "/docs/let-bindings/" } ] },
-    { "title": "Tooling", "items": [
-      { "text": "VS Code Extension", "url": "/docs/vscode/" },
-      { "text": "AI-Powered Completions", "url": "/docs/ai-integration/" },
-      { "text": "Database Configuration", "url": "/docs/database-config/" },
-      { "text": "F# Type Provider", "url": "/docs/fsharp-type-provider/" },
-      { "text": "Language Server", "url": "/docs/language-server/" } ] },
-    { "title": "Reference", "items": [
-      { "text": "SQL Dialects", "url": "/docs/sql-dialects/" },
-      { "text": "Nimblesite.DataProvider.Core", "url": "https://dataprovider.dev", "external": true } ] }
-  ],
   "footer": [
-    { "title": "Documentation", "items": [
-      { "text": "Quick Start", "url": "/docs/quick-start/" },
-      { "text": "Syntax Overview", "url": "/docs/syntax/" },
-      { "text": "Playground", "url": "/playground/" } ] },
-    { "title": "Ecosystem", "items": [
-      { "text": "Nimblesite.DataProvider.Core", "url": "https://dataprovider.dev" },
-      { "text": "GitHub", "url": "https://github.com/Nimblesite/DataProvider" },
-      { "text": "NuGet", "url": "https://www.nuget.org/packages/Nimblesite.Lql.Core" } ] },
-    { "title": "More", "items": [
-      { "text": "VS Code Extension", "url": "/docs/vscode/" },
-      { "text": "AI Completions", "url": "/docs/ai-integration/" },
-      { "text": "F# Type Provider", "url": "/docs/fsharp-type-provider/" } ] }
+    { "text": "Docs", "url": "/#docs" },
+    { "text": "Examples", "url": "/#examples" },
+    { "text": "Migrations", "url": "/#migrations" },
+    { "text": "Playground", "url": "/playground/" },
+    { "text": "DataProvider", "url": "https://dataprovider.dev", "external": true },
+    { "text": "GitHub", "url": "https://github.com/Nimblesite/DataProvider", "external": true }
   ]
 }

diff --git a/Lql/LqlWebsite-Eleventy/src/_includes/layouts/base.njk b/Lql/LqlWebsite-Eleventy/src/_includes/layouts/base.njk
index a752a7cd..c15ec573 100644
@@ -1,104 +1,73 @@
(base layout rewritten: head tags and header/nav/footer markup simplified; the body still renders {% block content %}{{ content | safe }}{% endblock %})

diff --git a/Lql/LqlWebsite-Eleventy/src/_includes/layouts/docs.njk b/Lql/LqlWebsite-Eleventy/src/_includes/layouts/docs.njk
deleted file mode 100644
index 6547d765..00000000
@@ -1,43 +0,0 @@
(docs layout removed; it set layout: layouts/base.njk and wrapped docs pages in a sidebar)
diff --git a/Lql/LqlWebsite-Eleventy/src/assets/css/styles.css b/Lql/LqlWebsite-Eleventy/src/assets/css/styles.css index d4c5b05b..f1ab758e 100644 --- a/Lql/LqlWebsite-Eleventy/src/assets/css/styles.css +++ b/Lql/LqlWebsite-Eleventy/src/assets/css/styles.css @@ -1,352 +1,219 @@ -/* LQL - Design System CSS - Dark-first, high contrast, minimal, reusable classes */ - :root { - /* Core Colors from LQL Design System */ - --volcanic: #FF4500; - --forest: #228B22; - --obsidian: #1C1C1C; - --amber: #FFA500; - --violet: #8A2BE2; - --charcoal: #36454F; - --ivory: #FFFFF0; - --dark-bg: #0F0F0F; - --darker-bg: #0A0A0A; - --card-bg: #1A1A1A; - --border: #2A2A2A; - - /* Semantic mappings (dark theme default) */ - --bg-primary: var(--dark-bg); - --bg-secondary: var(--darker-bg); - --bg-tertiary: var(--obsidian); - --text-primary: var(--ivory); - --text-secondary: #B0B0B0; - --text-muted: #808080; - --border-color: var(--border); - --code-bg: var(--darker-bg); - --accent: var(--volcanic); - --accent-hover: var(--amber); - - /* Typography */ - --font-sans: 'Inter', -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif; - --font-mono: 'JetBrains Mono', 'Fira Code', Consolas, monospace; - - /* Type Scale */ - --text-xs: 0.75rem; - --text-sm: 0.875rem; - --text-base: 1rem; - --text-lg: 1.125rem; - --text-xl: 1.25rem; - --text-2xl: 1.5rem; - --text-3xl: 1.875rem; - --text-4xl: 2.25rem; - --text-5xl: 3rem; - - /* Spacing */ - --space-1: 0.25rem; - --space-2: 0.5rem; - --space-3: 0.75rem; - --space-4: 1rem; - --space-6: 1.5rem; - --space-8: 2rem; - --space-10: 2.5rem; - --space-12: 3rem; - --space-16: 4rem; - --space-20: 5rem; - - /* Layout */ - --max-width: 1200px; - --header-height: 64px; - --sidebar-width: 260px; - --radius: 8px; - --transition: 200ms ease; -} - -/* Reset */ -*, *::before, *::after { box-sizing: border-box; margin: 0; padding: 0; } - -html { - scroll-behavior: smooth; - scroll-padding-top: calc(var(--header-height) + var(--space-4)); - overflow-x: hidden; -} - + 
--volcanic: #FF4500; + --forest: #228B22; + --obsidian: #1C1C1C; + --amber: #FFA500; + --violet: #8A2BE2; + --charcoal: #36454F; + --ivory: #FFFFF0; + --dark-bg: #0F0F0F; + --darker-bg: #0A0A0A; + --card-bg: #1A1A1A; + --border: #2A2A2A; + --text-primary: #FFFFF0; + --text-secondary: #B0B0B0; + --text-muted: #808080; +} + +* { margin: 0; padding: 0; box-sizing: border-box; } +html { scroll-behavior: smooth; scroll-padding-top: 96px; } body { - font-family: var(--font-sans); - font-size: var(--text-base); - line-height: 1.6; - color: var(--text-primary); - background: var(--bg-primary); - -webkit-font-smoothing: antialiased; - overflow-x: hidden; - width: 100%; - max-width: 100vw; + font-family: 'Inter', sans-serif; + background: var(--dark-bg); + color: var(--text-primary); + line-height: 1.6; + overflow-x: hidden; } -/* Skip link */ .skip-link { - position: absolute; - top: -100%; - left: var(--space-4); - padding: var(--space-2) var(--space-4); - background: var(--volcanic); - color: white; - border-radius: var(--radius); - z-index: 1000; -} -.skip-link:focus { top: var(--space-4); } + position: absolute; + top: -100px; + left: 24px; + padding: 8px 16px; + background: var(--volcanic); + color: white; + border-radius: 8px; + z-index: 1000; +} +.skip-link:focus { top: 16px; } -/* Container */ .container { - width: 100%; - max-width: var(--max-width); - margin: 0 auto; - padding: 0 var(--space-4); + max-width: 1200px; + margin: 0 auto; + padding: 0 24px; } -/* Typography */ -h1, h2, h3, h4, h5, h6 { - font-weight: 600; - line-height: 1.25; - color: var(--text-primary); -} -h1 { font-size: var(--text-4xl); font-weight: 700; } -h2 { font-size: var(--text-3xl); } -h3 { font-size: var(--text-2xl); } -h4 { font-size: var(--text-xl); } - -p { margin-bottom: var(--space-4); color: var(--text-secondary); } - -a { color: var(--volcanic); text-decoration: none; transition: color var(--transition); } -a:hover { color: var(--amber); } - -ul, ol { margin-bottom: var(--space-4); 
padding-left: var(--space-6); } -li { margin-bottom: var(--space-2); color: var(--text-secondary); } - -/* Code */ -code { - font-family: var(--font-mono); - font-size: 0.9em; - padding: 0.2em 0.4em; - background: var(--code-bg); - border: 1px solid var(--border-color); - border-radius: 4px; - color: var(--amber); +header { + background: var(--darker-bg); + border-bottom: 1px solid var(--border); + padding: 16px 0; + position: sticky; + top: 0; + z-index: 100; + backdrop-filter: blur(10px); } -pre { - font-family: var(--font-mono); - font-size: var(--text-sm); - line-height: 1.7; - padding: var(--space-4); - background: var(--code-bg); - border: 1px solid var(--border-color); - border-radius: var(--radius); - overflow-x: auto; - margin-bottom: var(--space-4); -} -pre code { padding: 0; background: none; border: none; color: inherit; } - -/* Header */ -.header { - position: sticky; - top: 0; - height: var(--header-height); - background: var(--bg-secondary); - border-bottom: 1px solid var(--border-color); - z-index: 100; - backdrop-filter: blur(10px); -} - -.nav { - display: flex; - align-items: center; - justify-content: space-between; - height: 100%; +.header-content { + display: flex; + align-items: center; + justify-content: space-between; + gap: 24px; } .logo { - display: flex; - align-items: center; - gap: var(--space-2); + display: flex; + align-items: center; + gap: 12px; + text-decoration: none; } -.logo img { height: 32px; } -.logo:hover { opacity: 0.9; } +.logo img { width: 40px; height: 40px; } .logo-text { - font-size: var(--text-xl); - font-weight: 800; - color: var(--amber); + font-size: 24px; + font-weight: 800; + color: var(--amber); } -.nav-links { - display: flex; - align-items: center; - gap: var(--space-6); - list-style: none; - margin: 0; - padding: 0; +nav ul { + display: flex; + list-style: none; + gap: 32px; } - -.nav-link { - font-weight: 500; - color: var(--text-secondary); - transition: color var(--transition); +nav a { + color:
var(--text-secondary); + text-decoration: none; + font-weight: 500; + transition: color 0.2s; } -.nav-link:hover, .nav-link.active { color: var(--volcanic); } +nav a:hover { color: var(--volcanic); } -/* Site Toggle */ .site-toggle { - display: flex; - background: var(--bg-primary); - border-radius: var(--radius); - padding: 2px; - border: 1px solid var(--border-color); + display: flex; + background: var(--darker-bg); + border-radius: 8px; + padding: 2px; + border: 1px solid var(--border); + margin-right: 24px; } .site-toggle-btn { - padding: var(--space-2) var(--space-4); - font-size: var(--text-sm); - font-weight: 500; - color: var(--text-secondary); - border-radius: 6px; - transition: all var(--transition); + padding: 8px 16px; + font-size: 14px; + font-weight: 500; + color: var(--text-secondary); + border-radius: 6px; + text-decoration: none; + transition: all 0.2s; } .site-toggle-btn:hover { color: var(--text-primary); } .site-toggle-btn.active { - background: var(--amber); - color: var(--obsidian); + background: var(--amber); + color: var(--obsidian); } -.nav-actions { display: flex; align-items: center; gap: var(--space-3); } - -/* Mobile menu toggle */ -.mobile-menu-toggle { - display: none; - flex-direction: column; - gap: 4px; - padding: var(--space-2); - background: transparent; - border: none; - cursor: pointer; -} -.mobile-menu-toggle span { - display: block; - width: 24px; - height: 2px; - background: var(--text-primary); - transition: all var(--transition); +.hero { + padding: 120px 0; + background: linear-gradient(135deg, var(--darker-bg) 0%, var(--obsidian) 100%); + position: relative; + overflow: hidden; +} +.hero::before { + content: ''; + position: absolute; + inset: 0; + background: radial-gradient(circle at 30% 20%, rgba(255, 69, 0, 0.1) 0%, transparent 50%), + radial-gradient(circle at 70% 80%, rgba(34, 139, 34, 0.1) 0%, transparent 50%); + pointer-events: none; +} +.hero-content { + text-align: center; + position: relative; + z-index: 2; +}
+.hero h1 { + font-size: 64px; + font-weight: 800; + margin-bottom: 24px; + background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); + -webkit-background-clip: text; + -webkit-text-fill-color: transparent; + background-clip: text; +} +.hero .subtitle { + font-size: 24px; + color: var(--text-secondary); + margin: 0 auto 48px; + max-width: 760px; +} + +.cta-buttons { + display: flex; + gap: 24px; + justify-content: center; + margin-bottom: 80px; } - -/* Buttons */ .btn { - display: inline-flex; - align-items: center; - justify-content: center; - gap: var(--space-2); - padding: var(--space-3) var(--space-6); - font-family: var(--font-sans); - font-size: var(--text-base); - font-weight: 600; - border-radius: var(--radius); - border: none; - cursor: pointer; - transition: all var(--transition); - text-decoration: none; + padding: 16px 32px; + border-radius: 8px; + font-weight: 600; + text-decoration: none; + transition: all 0.3s ease; + border: none; + cursor: pointer; + font-size: 16px; } - .btn-primary { - background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); - color: white; + background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); + color: white; } .btn-primary:hover { - transform: translateY(-2px); - box-shadow: 0 8px 25px rgba(255, 69, 0, 0.4); - color: white; + transform: translateY(-2px); + box-shadow: 0 12px 40px rgba(255, 69, 0, 0.4); } - .btn-secondary { - background: transparent; - color: var(--forest); - border: 2px solid var(--forest); + background: transparent; + color: var(--forest); + border: 2px solid var(--forest); } -.btn-secondary:hover { background: var(--forest); color: white; } - -.btn-large { padding: var(--space-4) var(--space-8); font-size: var(--text-lg); } - -/* Hero */ -.hero { - position: relative; - padding: var(--space-20) 0; - text-align: center; - background: linear-gradient(135deg, var(--darker-bg) 0%, var(--obsidian) 100%); - overflow: hidden; -} -.hero::before { - content:
''; - position: absolute; - inset: 0; - background: radial-gradient(circle at 30% 20%, rgba(255, 69, 0, 0.1) 0%, transparent 50%), - radial-gradient(circle at 70% 80%, rgba(34, 139, 34, 0.1) 0%, transparent 50%); - pointer-events: none; -} -.hero > * { position: relative; z-index: 1; } -.hero h1 { - font-size: var(--text-5xl); - font-weight: 800; - margin-bottom: var(--space-6); - background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); - -webkit-background-clip: text; - -webkit-text-fill-color: transparent; - background-clip: text; -} -.hero-subtitle { - font-size: var(--text-xl); - color: var(--text-secondary); - max-width: 600px; - margin: 0 auto var(--space-8); -} -.hero-buttons { display: flex; gap: var(--space-4); justify-content: center; flex-wrap: wrap; } -.hero-code { - max-width: 800px; - margin: var(--space-12) auto 0; - text-align: left; +.btn-secondary:hover { + background: var(--forest); + color: white; } -/* Code example styling */ -.code-window { - background: var(--card-bg); - border: 1px solid var(--border-color); - border-radius: 12px; - padding: var(--space-8); - overflow: hidden; +.hero-code, +.example-code { + background: var(--card-bg); + border: 1px solid var(--border); + border-radius: 12px; + padding: 32px; + max-width: 800px; + margin: 0 auto; + position: relative; } .code-header { - display: flex; - align-items: center; - gap: 8px; - margin-bottom: var(--space-6); + display: flex; + align-items: center; + gap: 8px; + margin-bottom: 24px; } .code-dot { - width: 12px; - height: 12px; - border-radius: 50%; + width: 12px; + height: 12px; + border-radius: 50%; } .code-dot:nth-child(1) { background: #FF5F57; } .code-dot:nth-child(2) { background: #FFBD2E; } .code-dot:nth-child(3) { background: #28CA42; } .code-title { - margin-left: var(--space-4); - color: var(--text-muted); - font-size: var(--text-sm); + margin-left: 16px; + color: var(--text-muted); + font-size: 14px; } .code-block { - font-family: var(--font-mono); -
font-size: var(--text-base); - line-height: 1.8; - color: var(--text-primary); - white-space: pre-wrap; + font-family: 'JetBrains Mono', monospace; + font-size: 16px; + line-height: 1.8; + color: var(--text-primary); + white-space: pre-wrap; } - -/* LQL syntax highlighting */ .keyword { color: var(--volcanic); } .operator { color: var(--forest); } .function { color: var(--amber); } @@ -354,437 +221,214 @@ pre code { padding: 0; background: none; border: none; color: inherit; } .comment { color: var(--text-muted); } .identifier { color: var(--text-primary); } -/* Feature cards */ -.features { padding: var(--space-16) 0; } -.features-alt { padding: var(--space-16) 0; background: var(--obsidian); } - +.features { + padding: 120px 0; + background: var(--obsidian); +} +.examples, +.docs-section, +.playground-strip { + padding: 120px 0; + background: var(--dark-bg); +} .section-header { - text-align: center; - margin-bottom: var(--space-12); + text-align: center; + margin-bottom: 80px; } .section-header h2 { - font-size: var(--text-4xl); - font-weight: 700; - margin-bottom: var(--space-4); + font-size: 48px; + font-weight: 700; + margin-bottom: 16px; + color: var(--text-primary); } .section-header p { - font-size: var(--text-lg); - max-width: 600px; - margin: 0 auto; + font-size: 20px; + color: var(--text-secondary); + max-width: 760px; + margin: 0 auto; } .features-grid { - display: grid; - grid-template-columns: repeat(auto-fit, minmax(280px, 1fr)); - gap: var(--space-6); -} - -.card { - padding: var(--space-6); - background: var(--card-bg); - border: 1px solid var(--border-color); - border-radius: 12px; - transition: all 0.3s ease; -} -.card:hover { - transform: translateY(-4px); - border-color: var(--volcanic); - box-shadow: 0 12px 40px rgba(0, 0, 0, 0.3); + display: grid; + grid-template-columns: repeat(auto-fit, minmax(350px, 1fr)); + gap: 32px; +} +.feature-card, +.doc-card, +.examples-section, +.input-section, +.output-section { + background: var(--card-bg); +
border: 1px solid var(--border); + border-radius: 12px; + padding: 32px; +} +.feature-card { + transition: all 0.3s ease; +} +.feature-card:hover { + transform: translateY(-4px); + border-color: var(--volcanic); + box-shadow: 0 12px 40px rgba(0, 0, 0, 0.3); +} +.feature-icon { + width: 48px; + height: 48px; + background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); + border-radius: 8px; + display: flex; + align-items: center; + justify-content: center; + margin-bottom: 24px; + font-size: 18px; + font-weight: 800; +} +.feature-card h3, +.doc-card h3 { + font-size: 24px; + font-weight: 600; + margin-bottom: 16px; + color: var(--text-primary); +} +.feature-card p, +.doc-card p { + color: var(--text-secondary); + line-height: 1.6; + margin-bottom: 16px; +} + +.examples-grid { + display: grid; + gap: 48px; } - -.card-icon { - width: 48px; - height: 48px; - display: flex; - align-items: center; - justify-content: center; - background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); - color: white; - border-radius: var(--radius); - margin-bottom: var(--space-4); - font-size: var(--text-xl); - font-weight: 700; -} -.card h3 { margin-bottom: var(--space-2); } -.card p { margin: 0; } -.card a { display: inline-block; margin-top: var(--space-3); } - -/* Examples section */ -.examples { padding: var(--space-16) 0; } -.examples-grid { display: grid; gap: var(--space-12); } - .example { - display: grid; - grid-template-columns: 1fr 1fr; - gap: var(--space-12); - align-items: center; + display: grid; + grid-template-columns: 1fr 1fr; + gap: 48px; + align-items: center; } .example:nth-child(even) { direction: rtl; } .example:nth-child(even) > * { direction: ltr; } - .example-content h3 { - font-size: var(--text-2xl); - font-weight: 700; - margin-bottom: var(--space-4); + font-size: 32px; + font-weight: 700; + margin-bottom: 16px; + color: var(--text-primary); } .example-content p { - font-size: var(--text-lg); - margin-bottom: var(--space-6); +
font-size: 18px; + color: var(--text-secondary); + margin-bottom: 24px; } .example-features { - list-style: none; - padding: 0; + list-style: none; } .example-features li { - display: flex; - align-items: center; - gap: var(--space-3); - margin-bottom: var(--space-3); + display: flex; + align-items: center; + gap: 12px; + margin-bottom: 12px; + color: var(--text-secondary); } .example-features li::before { - content: '\2192'; - color: var(--forest); - font-weight: bold; + content: '→'; + color: var(--forest); + font-weight: bold; } - -.example-code { - background: var(--card-bg); - border: 1px solid var(--border-color); - border-radius: 12px; - padding: var(--space-8); +.docs-grid { + display: grid; + gap: 32px; } - -/* F# section */ -.fsharp { padding: var(--space-16) 0; background: var(--obsidian); } -.fsharp-content { - display: grid; - grid-template-columns: 1fr 1fr; - gap: var(--space-12); - align-items: center; -} -.fsharp-info h3 { - font-size: var(--text-2xl); - font-weight: 700; - margin-bottom: var(--space-4); -} -.fsharp-info p { - font-size: var(--text-lg); - margin-bottom: var(--space-6); - line-height: 1.7; -} -.fsharp-features { - list-style: none; - padding: 0; -} -.fsharp-features li { - display: flex; - align-items: center; - gap: var(--space-3); - margin-bottom: var(--space-3); -} -.fsharp-features li::before { - content: '\2192'; - color: var(--violet); - font-weight: bold; -} -.fsharp-code { - background: var(--card-bg); - border: 1px solid var(--border-color); - border-radius: 12px; - padding: var(--space-8); +.doc-card { + overflow-x: auto; } - -/* Playground */ -.playground { padding: var(--space-16) 0; } -.playground-content { - display: grid; - grid-template-columns: 1fr 1fr; - gap: var(--space-8); - margin-bottom: var(--space-8); -} -.playground-panel { - background: var(--card-bg); - border: 1px solid var(--border-color); - border-radius: 12px; - padding: var(--space-6); -} -.playground-panel h3 { - font-size: var(--text-lg); -
font-weight: 600; - margin-bottom: var(--space-4); -} -.playground-controls { - display: flex; - gap: var(--space-4); - align-items: center; - margin-bottom: var(--space-4); -} -.dialect-selector { - background: var(--darker-bg); - border: 1px solid var(--border-color); - border-radius: 6px; - padding: var(--space-2) var(--space-3); - color: var(--text-primary); - font-size: var(--text-sm); -} -.dialect-selector:focus { outline: none; border-color: var(--volcanic); } -.convert-btn { - background: linear-gradient(135deg, var(--volcanic) 0%, var(--amber) 100%); - color: white; - border: none; - padding: var(--space-2) var(--space-4); - border-radius: 6px; - font-weight: 600; - cursor: pointer; - transition: all 0.3s ease; -} -.convert-btn:hover { - transform: translateY(-1px); - box-shadow: 0 6px 20px rgba(255, 69, 0, 0.3); -} -.convert-btn:disabled { opacity: 0.6; cursor: not-allowed; transform: none; } -.lql-input { - width: 100%; - height: 300px; - background: var(--darker-bg); - border: 1px solid var(--border-color); - border-radius: var(--radius); - padding: var(--space-4); - font-family: var(--font-mono); - font-size: var(--text-sm); - color: var(--text-primary); - resize: vertical; -} -.lql-input:focus { outline: none; border-color: var(--volcanic); } -.sql-output { - background: var(--darker-bg); - border: 1px solid var(--border-color); - border-radius: var(--radius); - padding: var(--space-4); - font-family: var(--font-mono); - font-size: var(--text-sm); - color: var(--text-primary); - min-height: 300px; - white-space: pre-wrap; - overflow-y: auto; -} -.error-message { - background: rgba(255, 69, 0, 0.1); - border: 1px solid var(--volcanic); - border-radius: 6px; - padding: var(--space-3); - color: var(--volcanic); - font-size: var(--text-sm); - margin-top: var(--space-4); -} -.example-buttons { - display: flex; - gap: var(--space-3); - flex-wrap: wrap; -} -.example-btn { - background: transparent; - color: var(--forest); - border: 1px solid var(--forest); -
padding: var(--space-2) var(--space-4); - border-radius: 6px; - font-size: var(--text-sm); - cursor: pointer; - transition: all var(--transition); -} -.example-btn:hover { background: var(--forest); color: white; } - -/* Docs layout */ -.docs-layout { - display: grid; - grid-template-columns: var(--sidebar-width) 1fr; - min-height: calc(100vh - var(--header-height)); +pre { + background: var(--darker-bg); + border: 1px solid var(--border); + border-radius: 8px; + padding: 16px; + margin: 16px 0 24px; + overflow-x: auto; } - -.sidebar { - position: sticky; - top: var(--header-height); - height: calc(100vh - var(--header-height)); - overflow-y: auto; - padding: var(--space-6); - background: var(--bg-secondary); - border-right: 1px solid var(--border-color); +code { + font-family: 'JetBrains Mono', monospace; + color: var(--amber); + font-size: 14px; } - -.sidebar-section { margin-bottom: var(--space-6); } -.sidebar-section h4 { - font-size: var(--text-sm); - font-weight: 600; - text-transform: uppercase; - letter-spacing: 0.05em; - color: var(--text-muted); - margin-bottom: var(--space-3); -} -.sidebar-section ul { list-style: none; padding: 0; margin: 0; } -.sidebar-section li { margin: 0; } -.sidebar-section a { - display: block; - padding: var(--space-2) var(--space-3); - color: var(--text-secondary); - border-radius: 4px; - transition: all var(--transition); -} -.sidebar-section a:hover, .sidebar-section a.active { - background: var(--bg-tertiary); - color: var(--volcanic); +pre code { + color: var(--text-primary); + line-height: 1.7; } - -.docs-content { - padding: var(--space-8); - max-width: 900px; -} -.docs-content h1 { margin-bottom: var(--space-6); } -.docs-content h2 { - margin-top: var(--space-10); - margin-bottom: var(--space-4); - padding-bottom: var(--space-2); - border-bottom: 1px solid var(--border-color); -} -.docs-content h3 { margin-top: var(--space-8); margin-bottom: var(--space-3); } - -/* Tables */ table { - width: 100%; - border-collapse:
collapse; - margin-bottom: var(--space-6); - background: var(--card-bg); - border-radius: var(--radius); - overflow: hidden; -} -th, td { - padding: var(--space-3) var(--space-4); - text-align: left; - border-bottom: 1px solid var(--border-color); + width: 100%; + border-collapse: collapse; + margin: 16px 0 24px; + overflow: hidden; + border-radius: 8px; +} +th, +td { + text-align: left; + padding: 12px 16px; + border-bottom: 1px solid var(--border); + color: var(--text-secondary); + vertical-align: top; } th { - background: var(--obsidian); - color: var(--text-primary); - font-weight: 600; - text-transform: uppercase; - font-size: var(--text-sm); - letter-spacing: 0.05em; -} -td code { - background: var(--bg-tertiary); - padding: var(--space-1) var(--space-2); - border-radius: 4px; - font-size: var(--text-sm); + background: var(--obsidian); + color: var(--text-primary); + font-weight: 700; } -/* Syntax highlighting (Eleventy plugin) */ -.token.comment { color: #6B7280; } -.token.keyword { color: var(--volcanic); } -.token.string { color: var(--violet); } -.token.function, .token.class-name { color: var(--amber); } -.token.number { color: var(--amber); } -.token.operator { color: var(--forest); } - -/* Footer */ -.footer { - padding: var(--space-16) 0 var(--space-8); - background: var(--bg-secondary); - border-top: 1px solid var(--border-color); -} -.footer-grid { - display: grid; - grid-template-columns: repeat(auto-fit, minmax(180px, 1fr)); - gap: var(--space-8); - margin-bottom: var(--space-8); -} -.footer-section h3 { - font-size: var(--text-sm); - font-weight: 600; - text-transform: uppercase; - letter-spacing: 0.05em; - margin-bottom: var(--space-4); -} -.footer-section ul { list-style: none; padding: 0; margin: 0; } -.footer-section li { margin-bottom: var(--space-2); } -.footer-section a { color: var(--text-secondary); font-size: var(--text-sm); } -.footer-section a:hover { color: var(--volcanic); } +.text-center { text-align: center; } -.footer-bottom { -
padding-top: var(--space-8); - border-top: 1px solid var(--border-color); - text-align: center; +footer { + background: var(--darker-bg); + border-top: 1px solid var(--border); + padding: 48px 0; + text-align: center; +} +.footer-content { color: var(--text-muted); } +.footer-links { + display: flex; + justify-content: center; + gap: 32px; + margin-bottom: 24px; + flex-wrap: wrap; } -.footer-bottom p { - font-size: var(--text-sm); - color: var(--text-muted); - margin-bottom: var(--space-2); +.footer-links a { + color: var(--text-secondary); + text-decoration: none; + transition: color 0.2s; } +.footer-links a:hover { color: var(--volcanic); } -/* Responsive */ -@media (max-width: 1024px) { - .docs-layout { grid-template-columns: 1fr; } - .sidebar { - display: none; - position: fixed; - top: var(--header-height); - left: 0; - width: 100%; - height: calc(100vh - var(--header-height)); - z-index: 50; - } - .sidebar.open { display: block; } +@media (max-width: 980px) { + .header-content { flex-wrap: wrap; } + nav ul { gap: 18px; flex-wrap: wrap; } + .site-toggle { margin-right: 0; } } @media (max-width: 768px) { - :root { - --text-5xl: 2.25rem; - --text-4xl: 1.875rem; - --text-3xl: 1.5rem; - } - - .container { padding: 0 var(--space-3); } - - .nav-links { - display: none; - position: fixed; - top: var(--header-height); - left: 0; - width: 100%; - padding: var(--space-4); - background: var(--bg-secondary); - border-bottom: 1px solid var(--border-color); - flex-direction: column; - gap: var(--space-2); - } - .nav-links.open { display: flex; } - - .mobile-menu-toggle { display: flex; } - - .nav-actions .btn { display: none; } - - .hero { padding: var(--space-12) 0; } - .hero h1 { font-size: var(--text-4xl); } - .hero-subtitle { font-size: var(--text-base); } - .hero-buttons { flex-direction: column; align-items: center; width: 100%; } - .hero-buttons .btn { width: 100%; max-width: 280px; } - .hero-code { margin: var(--space-8) auto 0; } - - .example {
grid-template-columns: 1fr; } - .example:nth-child(even) { direction: ltr; } - - .fsharp-content { grid-template-columns: 1fr; } - - .playground-content { grid-template-columns: 1fr; } - - .features-grid { grid-template-columns: 1fr; } - - .card { padding: var(--space-4); } - .docs-content { padding: var(--space-4); } - pre { font-size: var(--text-xs); padding: var(--space-3); } - table { display: block; overflow-x: auto; } - .footer-grid { gap: var(--space-6); } + .hero { padding: 80px 0; } + .hero h1 { font-size: 48px; } + .hero .subtitle { font-size: 20px; } + .cta-buttons { flex-direction: column; align-items: center; } + .section-header { margin-bottom: 48px; } + .section-header h2 { font-size: 36px; } + .features, + .examples, + .docs-section, + .playground-strip { padding: 72px 0; } + .features-grid { grid-template-columns: 1fr; } + .example { grid-template-columns: 1fr; } + .example { text-align: center; } + .example:nth-child(even) { direction: ltr; } + nav ul { display: none; } + .hero-code, + .example-code, + .feature-card, + .doc-card { padding: 24px; } + .code-block, + pre code { font-size: 13px; } } - -/* Utilities */ -.text-center { text-align: center; } -.mt-8 { margin-top: var(--space-8); } -.mb-8 { margin-bottom: var(--space-8); } diff --git a/Lql/LqlWebsite-Eleventy/src/assets/js/playground.js b/Lql/LqlWebsite-Eleventy/src/assets/js/playground.js deleted file mode 100644 index 65c2f148..00000000 --- a/Lql/LqlWebsite-Eleventy/src/assets/js/playground.js +++ /dev/null @@ -1,80 +0,0 @@ -(function() { - 'use strict'; - - const examples = { - simple: 'users |> select(users.id, users.name, users.email)', - join: 'users\n|> join(orders, on = users.id = orders.user_id)\n|> select(users.name, orders.total, orders.status)', - filter: 'employees\n|> select(employees.id, employees.name, employees.salary)\n|> filter(fn(row) => row.employees.salary > 50000 and row.employees.department = \'Engineering\')', - aggregate: 'orders\n|> group_by(orders.user_id)\n|>
select(\n orders.user_id,\n count(*) as order_count,\n sum(orders.total) as total_amount,\n avg(orders.total) as avg_amount\n)\n|> having(fn(group) => count(*) > 2)\n|> order_by(total_amount desc)', - complex: '-- Complex analytics query\nlet joined =\n users\n |> join(orders, on = users.id = orders.user_id)\n |> filter(fn(row) => row.orders.status = \'completed\')\n\njoined\n|> group_by(users.id)\n|> select(\n users.name,\n count(*) as total_orders,\n sum(orders.total) as revenue,\n avg(orders.total) as avg_order_value\n)\n|> filter(fn(row) => row.revenue > 1000)\n|> order_by(revenue desc)\n|> limit(10)' - }; - - const lqlInput = document.getElementById('lql-input'); - const sqlOutput = document.getElementById('sql-output'); - const errorMessage = document.getElementById('error-message'); - const convertBtn = document.getElementById('convert-btn'); - const dialectSelector = document.getElementById('dialect-selector'); - const outputTitle = document.getElementById('output-title'); - - // Load default example - lqlInput.value = examples.simple; - - // Update output title when dialect changes - dialectSelector.addEventListener('change', function() { - outputTitle.textContent = this.value === 'SqlServer' ?
'SQL Server Output' : 'PostgreSQL Output'; - }); - - // Convert button - calls the Blazor WASM transpiler via JS interop - convertBtn.addEventListener('click', async function() { - const lql = lqlInput.value.trim(); - if (!lql) { - showError('Please enter some LQL code to convert.'); - return; - } - - convertBtn.disabled = true; - convertBtn.textContent = 'Converting...'; - errorMessage.style.display = 'none'; - sqlOutput.textContent = 'Converting...'; - - try { - // Call the Blazor WASM transpiler if available - if (window.lqlTranspile) { - const dialect = dialectSelector.value; - const result = await window.lqlTranspile(lql, dialect); - if (result.error) { - showError(result.error); - sqlOutput.textContent = ''; - } else { - sqlOutput.textContent = result.sql; - } - } else { - // Fallback: show a message that the transpiler is loading or unavailable - sqlOutput.textContent = 'The LQL transpiler is loading. Please wait a moment and try again.\n\nIf this persists, the Blazor WASM runtime may not be available.'; - } - } catch (err) { - showError('An unexpected error occurred: ' + err.message); - sqlOutput.textContent = ''; - } finally { - convertBtn.disabled = false; - convertBtn.textContent = 'Convert to SQL'; - } - }); - - // Example buttons - document.querySelectorAll('.example-btn[data-example]').forEach(function(btn) { - btn.addEventListener('click', function() { - const key = this.getAttribute('data-example'); - if (examples[key]) { - lqlInput.value = examples[key]; - errorMessage.style.display = 'none'; - sqlOutput.textContent = "Click 'Convert to SQL' to see the result."; - } - }); - }); - - function showError(msg) { - errorMessage.textContent = msg; - errorMessage.style.display = 'block'; - } -})(); diff --git a/Lql/LqlWebsite-Eleventy/src/docs/aggregation.md b/Lql/LqlWebsite-Eleventy/src/docs/aggregation.md deleted file mode 100644 index cafed0ef..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/aggregation.md +++ /dev/null @@ -1,73 +0,0 @@ ---- -layout:
layouts/docs.njk -title: Aggregation -description: Group by, aggregate functions, and having clauses in LQL. ---- - -LQL provides full aggregation support with group by, aggregate functions, and having clauses. - -## Aggregate Functions - -| Function | Description | -|----------|-------------| -| `count(*)` | Count all rows | -| `sum(column)` | Sum of values | -| `avg(column)` | Average of values | -| `min(column)` | Minimum value | -| `max(column)` | Maximum value | - -## Basic Aggregation - -``` -orders -|> group_by(orders.status) -|> select( - orders.status, - count(*) as order_count -) -``` - -## Multiple Group Columns - -``` -orders -|> group_by(orders.user_id, orders.status) -|> select( - orders.user_id, - orders.status, - count(*) as order_count, - sum(orders.total) as total_amount -) -``` - -## Having Clause - -Filter groups after aggregation using lambda expressions: - -``` -orders -|> group_by(orders.user_id) -|> having(fn(group) => count(*) > 2) -|> select( - orders.user_id, - count(*) as order_count, - sum(orders.total) as total_amount, - avg(orders.total) as avg_amount -) -``` - -## Complete Analytics Query - -``` -orders -|> group_by(orders.user_id, orders.status) -|> select( - orders.user_id, - orders.status, - count(*) as order_count, - sum(orders.total) as total_amount, - avg(orders.total) as avg_amount -) -|> having(fn(group) => count(*) > 2) -|> order_by(total_amount desc) -``` diff --git a/Lql/LqlWebsite-Eleventy/src/docs/ai-integration.md b/Lql/LqlWebsite-Eleventy/src/docs/ai-integration.md deleted file mode 100644 index 8c65ad0b..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/ai-integration.md +++ /dev/null @@ -1,233 +0,0 @@ ---- -layout: layouts/docs.njk -title: AI-Powered Completions -description: Integrate local or cloud AI models with the LQL Language Server for intelligent query completions. ---- - -The LQL Language Server has built-in support for AI-powered code completions.
Connect a local model via Ollama or use a cloud provider to get intelligent, context-aware query suggestions alongside the standard schema and keyword completions. - -## How It Works - -```mermaid -graph TD - A[VS Code Editor] <-->|completions| B[lql-lsp] - B --> C["Schema\ncolumns, tables\npriority 0–4"] - B --> D["Keywords\nfunctions, operators\npriority 1–3"] - B --> E["AI Model\nasync with timeout\npriority 6"] -``` - -On every completion request, the LSP runs three sources **in parallel**: - -1. **Schema completions** (priority 0-4) - Table names, column names from your database -2. **Keyword completions** (priority 1-3) - Pipeline operations, functions, keywords -3. **AI completions** (priority 6) - Intelligent suggestions from a language model - -All results are merged and sorted by priority. Schema and keyword completions always appear first; AI suggestions supplement them at the bottom. If the AI model is slow or unavailable, you still get instant schema and keyword completions. - -## Timeout Enforcement - -AI completions are wrapped in a configurable timeout (default: 2000ms). If the model doesn't respond in time, the LSP silently drops the AI results and returns only schema/keyword completions. This guarantees the editor never feels sluggish, regardless of AI model latency.
- -## What the AI Model Receives - -Every completion request sends the AI model rich context about your current editing state: - -| Context | Description | -|---------|-------------| -| **Full document** | The complete `.lql` file text | -| **Cursor position** | Line and column number | -| **Line prefix** | Text from the start of the line to the cursor | -| **Word prefix** | The partial word being typed | -| **File URI** | Path to the current file | -| **Table names** | All tables from the database schema | -| **Schema description** | Compact schema: `users(id uuid PK NOT NULL, name text, email text)` | - -The schema description gives the model full knowledge of your database structure, so it can suggest syntactically valid and schema-aware queries. - -## Setup with Ollama (Recommended) - -[Ollama](https://ollama.com) runs language models locally on your machine. No API keys, no cloud, no data leaves your laptop. - -### 1. Install Ollama - -Download and install from [ollama.com](https://ollama.com). - -### 2. Pull a Code Model - -```bash -ollama pull qwen2.5-coder:1.5b -``` - -### 3. Configure VS Code - -Add to your `settings.json`: - -```json -{ - "lql.aiProvider": { - "provider": "ollama", - "endpoint": "http://localhost:11434/api/generate", - "model": "qwen2.5-coder:1.5b", - "enabled": true - } -} -``` - -### 4. Start Writing LQL - -Open any `.lql` file and start typing. AI suggestions appear in the completion list alongside schema and keyword completions, marked as **Snippet** items.
- -## Recommended Models - -| Model | Parameters | Speed | Quality | Best For | -|-------|-----------|-------|---------|----------| -| `qwen2.5-coder:1.5b` | 1.5B | Fast | Good | Daily use, quick responses | -| `deepseek-coder:1.3b` | 1.3B | Fast | Good | Lightweight alternative | -| `codellama:7b` | 7B | Moderate | Better | Complex queries, more context | -| `qwen2.5-coder:7b` | 7B | Moderate | Better | Higher quality suggestions | - -For the best experience, start with `qwen2.5-coder:1.5b` - it provides good suggestions with minimal latency. - -## Provider Configuration - -The AI provider is configured via `initializationOptions` during the LSP handshake, which VS Code passes from your settings: - -```json -{ - "lql.aiProvider": { - "provider": "ollama", - "endpoint": "http://localhost:11434/api/generate", - "model": "qwen2.5-coder:1.5b", - "apiKey": "", - "timeoutMs": 2000, - "enabled": true - } -} -``` - -### Configuration Fields - -| Field | Type | Default | Description | -|-------|------|---------|-------------| -| `provider` | string | *required* | Provider type: `ollama`, `openai`, `anthropic`, `custom` | -| `endpoint` | string | *required* | Full URL of the API endpoint | -| `model` | string | `"default"` | Model identifier (provider-specific) | -| `apiKey` | string | `null` | API key for cloud providers | -| `timeoutMs` | number | `2000` | Maximum time to wait for AI response (ms) | -| `enabled` | boolean | `true` | Enable/disable AI completions | - -## Supported Providers - -### Ollama (Local) - -Runs entirely on your machine. The LSP calls the Ollama `/api/generate` endpoint and injects the [LQL language reference](https://github.com/Nimblesite/DataProvider/blob/main/Lql/lql-lsp-rust/crates/lql-reference.md) as system context, giving the model knowledge of LQL syntax.
- -```json -{ - "provider": "ollama", - "endpoint": "http://localhost:11434/api/generate", - "model": "qwen2.5-coder:1.5b" -} -``` - -The Ollama provider: -- Sends the full document, cursor position, and schema as a structured prompt -- Uses low temperature (0.1) for deterministic, focused completions -- Limits response to 256 tokens for fast turnaround -- Parses the model's JSON array response into completion items -- Handles markdown code fence wrapping in model responses - -### OpenAI / Anthropic / Custom (Cloud) - -Configure any OpenAI-compatible or custom endpoint: - -```json -{ - "provider": "openai", - "endpoint": "https://api.openai.com/v1/completions", - "model": "gpt-4", - "apiKey": "sk-..." -} -``` - -Cloud providers require an API key. The same context (document, cursor, schema) is sent to the model. - -### Custom Providers - -Set `provider` to `"custom"` and point `endpoint` to any API that accepts the same prompt format. The LSP logs the configuration on startup so you can verify it's active. - -## How AI Completions Merge - -The completion pipeline works as follows: - -1. **Schema + keyword completions** are computed synchronously (instant) -2. **AI completions** are requested asynchronously with a timeout -3. If AI responds within the timeout, results are appended to the list -4. If AI times out, only schema + keyword results are returned -5. All items are sorted by `sort_priority` before sending to the editor - -``` -Priority 0: Column completions (users.id, users.name) -Priority 1: Pipeline operations (select, filter, join) -Priority 2: Functions (count, sum, avg, concat) -Priority 3: Keywords (let, fn, as, and, or) -Priority 4: Table names (users, orders, products) -Priority 5: Let bindings (active_users, joined) -Priority 6: AI suggestions (context-aware snippets) -``` - -This means AI suggestions never push schema completions out of view - they always appear at the bottom of the list as supplementary suggestions. 
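The merge-and-sort behaviour described above can be sketched in a few lines (hypothetical names; the real implementation is asynchronous Rust inside the LSP):

```python
def merge_completions(schema_items, fetch_ai_items, timeout_ms=2000):
    """Merge instant schema/keyword completions with optional AI results."""
    items = list(schema_items)      # computed synchronously, always present
    try:
        items.extend(fetch_ai_items(timeout_ms))  # may raise TimeoutError
    except TimeoutError:
        pass  # AI timed out: return schema + keyword results only
    # Lower sort_priority sorts first, so AI items (priority 6) land last.
    return sorted(items, key=lambda item: item["sort_priority"])

schema = [
    {"label": "users.id", "sort_priority": 0},
    {"label": "select", "sort_priority": 1},
]
ai = lambda _timeout_ms: [{"label": "ai_suggest_filter", "sort_priority": 6}]
merged = merge_completions(schema, ai)
```

Because the AI call is wrapped in a timeout and its failures are swallowed, a slow or offline model never delays or hides the schema and keyword completions.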
- -## Disabling AI - -To disable AI completions without removing the configuration: - -```json -{ - "lql.aiProvider": { - "provider": "ollama", - "endpoint": "http://localhost:11434/api/generate", - "model": "qwen2.5-coder:1.5b", - "enabled": false - } -} -``` - -Or simply remove the `lql.aiProvider` section from your settings. - -## Troubleshooting - -### No AI completions appearing - -1. Check the **LQL Language Server** output channel (`View > Output > LQL Language Server`) for provider activation messages -2. Verify Ollama is running: `curl http://localhost:11434/api/tags` -3. Verify the model is pulled: `ollama list` -4. Check `enabled` is not set to `false` - -### AI completions are slow - -1. Try a smaller model (`qwen2.5-coder:1.5b` instead of `codellama:7b`) -2. Increase `timeoutMs` if you prefer waiting for better results -3. Ensure Ollama has enough RAM (1.5B models need ~2GB, 7B models need ~8GB) - -### AI suggestions are irrelevant - -1. Ensure your [database is connected](/docs/database-config/) - schema context dramatically improves AI suggestions -2. Try a different model - `qwen2.5-coder` tends to produce better LQL-specific completions -3. The LQL reference document is automatically injected as system context for Ollama - -### Verifying the pipeline works - -Use the built-in test provider to confirm AI completions flow end-to-end: - -```json -{ - "lql.aiProvider": { - "provider": "test", - "endpoint": "http://localhost", - "enabled": true - } -} -``` - -This returns deterministic completions (like `ai_suggest_filter`, `ai_suggest_join`) without any external service, proving the full pipeline works. 
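The Ollama provider described on this page parses the model's JSON array response and tolerates markdown code-fence wrapping. That cleanup step can be sketched as follows (a simplified illustration, not the provider's actual Rust code):

```python
import json

def parse_model_response(text):
    """Parse a model's completion response, tolerating ```-fenced output."""
    cleaned = text.strip()
    if cleaned.startswith("```"):
        # Drop the opening fence (with optional language tag) and, if
        # present, the closing fence on the last line.
        lines = cleaned.splitlines()
        if lines[-1].strip() == "```":
            lines = lines[1:-1]
        else:
            lines = lines[1:]
        cleaned = "\n".join(lines)
    suggestions = json.loads(cleaned)
    return [s for s in suggestions if isinstance(s, str)]

raw = "```json\n[\"|> filter(fn(row) => row.users.active)\", \"|> limit(10)\"]\n```"
items = parse_model_response(raw)
```

Small local models frequently wrap JSON in fences despite being asked not to, so stripping them before parsing noticeably improves the hit rate.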
diff --git a/Lql/LqlWebsite-Eleventy/src/docs/database-config.md b/Lql/LqlWebsite-Eleventy/src/docs/database-config.md deleted file mode 100644 index 538c92f6..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/database-config.md +++ /dev/null @@ -1,190 +0,0 @@ ---- -layout: layouts/docs.njk -title: Database Configuration -description: Connect the LQL Language Server to your database for schema-aware completions and hover. ---- - -The LQL Language Server can connect to your PostgreSQL database to provide schema-aware features like column completions, table hover, and qualified column hover. - -## Why Connect a Database? - -Without a database connection, the LSP still provides: -- Keyword and function completions -- Pipeline operation suggestions -- Parse error diagnostics -- Hover documentation for LQL constructs -- Document formatting and symbols - -With a database connection, you additionally get: -- **Column completions** - Type `users.` and see all columns with types -- **Table name completions** - See all tables with column counts -- **Table hover** - Hover over a table name to see its full schema -- **Column hover** - Hover over `users.email` to see type, nullability, and PK status - -## Connection Methods - -The LSP resolves the database connection in this priority order: - -### 1. VS Code Settings (recommended) - -Add a connection string to your VS Code `settings.json`: - -```json -{ - "lql.connectionString": "host=localhost dbname=myapp user=postgres password=secret" -} -``` - -This is passed to the LSP via `initializationOptions.connectionString`. - -### 2. Environment Variable: LQL_CONNECTION_STRING - -```bash -export LQL_CONNECTION_STRING="host=localhost dbname=myapp user=postgres password=secret" -``` - -### 3. 
Environment Variable: DATABASE_URL - -```bash -export DATABASE_URL="postgres://postgres:secret@localhost/myapp" -``` - -## Supported Connection Formats - -The LSP accepts multiple PostgreSQL connection string formats and normalizes them automatically. - -### libpq Format - -The native PostgreSQL format: - -``` -host=localhost dbname=myapp user=postgres password=secret -``` - -With port: - -``` -host=localhost port=5433 dbname=myapp user=postgres password=secret -``` - -### Npgsql Format (.NET style) - -Semicolon-delimited key=value pairs. These are automatically converted to libpq format: - -``` -Host=localhost;Database=myapp;Username=postgres;Password=secret -``` - -Mapping: -- `Host` -> `host` -- `Database` -> `dbname` -- `Username` -> `user` -- `Password` -> `password` -- `Port` -> `port` - -### URI Format - -PostgreSQL connection URI: - -``` -postgres://postgres:secret@localhost/myapp -postgresql://postgres:secret@localhost:5433/myapp -``` - -## Schema Introspection - -On startup (and when the connection is available), the LSP queries `information_schema.columns` and `information_schema.key_column_usage` to discover: - -| Metadata | Source | -|----------|--------| -| Table names | `information_schema.columns` | -| Column names | `information_schema.columns` | -| Column types | `data_type` column | -| Nullability | `is_nullable` column | -| Primary keys | `information_schema.key_column_usage` | - -The schema is cached in memory for fast lookups. Connection timeout is 10 seconds, query timeout is 30 seconds. - -## Graceful Degradation - -If the database is unreachable or the connection string is invalid: - -- The LSP logs the error and continues without schema -- All non-schema features remain fully functional -- No error is shown to the user (check the **LQL Language Server** output channel for diagnostics) - -This means you can use the extension without any database - you just won't get table/column completions. 
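The Npgsql-to-libpq mapping listed above is mechanical enough to sketch (illustrative only; the real normalization lives in the Rust LSP):

```python
NPGSQL_TO_LIBPQ = {
    "host": "host",
    "database": "dbname",
    "username": "user",
    "password": "password",
    "port": "port",
}

def npgsql_to_libpq(conn_str):
    """Convert 'Host=...;Database=...' style strings to libpq key=value form."""
    parts = []
    for pair in filter(None, conn_str.split(";")):
        key, _, value = pair.partition("=")
        libpq_key = NPGSQL_TO_LIBPQ.get(key.strip().lower())
        if libpq_key:  # unknown keys are silently dropped in this sketch
            parts.append(f"{libpq_key}={value.strip()}")
    return " ".join(parts)
```

Matching keys case-insensitively matters here, since .NET connection strings conventionally capitalize them (`Host`, `Database`) while libpq expects lowercase.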
- -## Schema-Aware Features in Detail - -### Column Completions - -When you type a table name followed by `.`, the LSP shows all columns for that table: - -``` -users. -``` - -Completion list shows: -``` -id uuid (PK) NOT NULL -name text NOT NULL -email text -status text -``` - -### Table Completions - -Table names appear in the completion list with metadata: - -``` -users (4 columns: id, name, email, status) -orders (6 columns: id, user_id, total, status, ...) -``` - -### Table Hover - -Hovering over a table name shows the full schema: - -``` -Table: users - -| Column | Type | PK | Nullable | -|--------|------|----|----------| -| id | uuid | Y | N | -| name | text | | N | -| email | text | | Y | -| status | text | | Y | -``` - -### Qualified Column Hover - -Hovering over `users.email` shows: - -``` -Column: users.email -Type: text -Nullable: yes -Primary Key: no -``` - -## Troubleshooting - -### No schema completions - -1. Check the **LQL Language Server** output channel (`View > Output > LQL Language Server`) -2. Verify your connection string is correct -3. Ensure PostgreSQL is running and accessible -4. Check firewall/network rules - -### Connection string not picked up - -1. VS Code settings take priority over environment variables -2. Restart VS Code after changing environment variables -3. Try the libpq format if other formats don't work - -### Schema is stale - -The schema is fetched once on startup. Restart the language server to refresh: -1. Open the command palette (`Ctrl+Shift+P`) -2. Run **Developer: Reload Window** diff --git a/Lql/LqlWebsite-Eleventy/src/docs/fsharp-type-provider.md b/Lql/LqlWebsite-Eleventy/src/docs/fsharp-type-provider.md deleted file mode 100644 index d144fc05..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/fsharp-type-provider.md +++ /dev/null @@ -1,81 +0,0 @@ ---- -layout: layouts/docs.njk -title: F# Type Provider -description: Compile-time validated LQL queries in F# with the LQL Type Provider. 
---- - -The LQL Type Provider brings compile-time type checking to your LQL queries in F#. Write queries with IntelliSense support, catch errors before runtime, and enjoy seamless integration with your F# codebase. - -## Installation - -```xml - -``` - -## Basic Usage - -```fsharp -open Lql - -// Define types with validated LQL - errors caught at COMPILE TIME -type GetUsers = LqlCommand<"Users |> select(Users.Id, Users.Name, Users.Email)"> -type ActiveUsers = LqlCommand<"Users |> filter(fn(row) => row.Status = 'active') |> select(*)"> - -// Access generated SQL and original query -let sql = GetUsers.Sql // Generated SQL string -let query = GetUsers.Query // Original LQL string -``` - -## What Gets Validated - -The type provider validates your LQL at compile time and generates two properties: -- `Query` - The original LQL query string -- `Sql` - The generated SQL (SQLite dialect) - -## Query Examples - -```fsharp -// Select with columns -type SelectColumns = LqlCommand<"Users |> select(Users.Id, Users.Name, Users.Email)"> - -// Filtering with AND/OR -type FilterComplex = LqlCommand<"Users |> filter(fn(row) => row.Users.Age > 18 and row.Users.Status = 'active') |> select(*)"> - -// Joins -type JoinQuery = LqlCommand<"Users |> join(Orders, on = Users.Id = Orders.UserId) |> select(Users.Name, Orders.Total)"> -type LeftJoin = LqlCommand<"Users |> left_join(Orders, on = Users.Id = Orders.UserId) |> select(*)"> - -// Aggregations with GROUP BY and HAVING -type GroupBy = LqlCommand<"Orders |> group_by(Orders.UserId) |> select(Orders.UserId, count(*) as order_count)"> -type Having = LqlCommand<"Orders |> group_by(Orders.UserId) |> having(fn(g) => count(*) > 5) |> select(Orders.UserId, count(*) as cnt)"> - -// Order, limit, offset -type Pagination = LqlCommand<"Users |> order_by(Users.Name asc) |> limit(10) |> offset(20) |> select(*)"> - -// Arithmetic expressions -type Calculated = LqlCommand<"Products |> select(Products.Price * Products.Quantity as total)"> -``` - -## 
Compile-Time Error Example - -Invalid LQL causes a build error with line/column position: - -```fsharp -// This FAILS to compile with: "Invalid LQL syntax at line 1, column 15" -type BadQuery = LqlCommand<"Users |> selectt(*)"> // typo: 'selectt' -``` - -## Executing Queries - -```fsharp -open Microsoft.Data.Sqlite - -let executeQuery() = - use conn = new SqliteConnection("Data Source=mydb.db") - conn.Open() - - // SQL is validated at compile time, safe to execute - use cmd = new SqliteCommand(GetUsers.Sql, conn) - use reader = cmd.ExecuteReader() - // ... process results -``` diff --git a/Lql/LqlWebsite-Eleventy/src/docs/index.md b/Lql/LqlWebsite-Eleventy/src/docs/index.md deleted file mode 100644 index d209b2e5..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/index.md +++ /dev/null @@ -1,61 +0,0 @@ ---- -layout: layouts/docs.njk -title: Introduction -description: Lambda Query Language (LQL) - a functional pipeline-style DSL that transpiles to SQL. ---- - -A functional pipeline-style DSL that transpiles to SQL. Write database logic once, run it anywhere. - -## The Problem - -SQL dialects differ. PostgreSQL, SQLite, and SQL Server each have their own quirks. This creates problems: - -- **Migrations** - Schema changes need different SQL for each database -- **Business Logic** - Triggers, stored procedures, and constraints vary by vendor -- **Sync Logic** - Offline-first apps need identical logic on client (SQLite) and server (Postgres) -- **Testing** - Running tests against SQLite while production uses Postgres - -## The Solution - -LQL is a single query language that transpiles to any SQL dialect. Write once, deploy everywhere. - -``` -Users -|> filter(fn(row) => row.Age > 18 and row.Status = 'active') -|> join(Orders, on = Users.Id = Orders.UserId) -|> group_by(Users.Id, Users.Name) -|> select(Users.Name, sum(Orders.Total) as TotalSpent) -|> order_by(TotalSpent desc) -|> limit(10) -``` - -This transpiles to correct SQL for PostgreSQL, SQLite, or SQL Server. 
- -## Use Cases - -### Cross-Database Migrations -Define schema changes in LQL. Migration.CLI generates the right SQL for your target database. - -### Cross-Database Business Logic with Triggers -Write triggers and constraints in LQL. Deploy the same logic to any database. - -### Offline-First Sync -The sync framework uses LQL for conflict resolution. Same logic runs on mobile (SQLite) and server (Postgres). - -### Integration Testing -Test against SQLite locally, deploy to Postgres in production. Same queries, same results. - -## Pipeline Operations - -| Operation | Description | -|-----------|-------------| -| `select(cols...)` | Choose columns | -| `filter(fn(row) => ...)` | Filter rows | -| `join(table, on = ...)` | Join tables | -| `left_join(table, on = ...)` | Left join | -| `group_by(cols...)` | Group rows | -| `having(fn(row) => ...)` | Filter groups | -| `order_by(col [asc/desc])` | Sort results | -| `limit(n)` / `offset(n)` | Pagination | -| `distinct()` | Unique rows | -| `union(query)` | Combine queries | diff --git a/Lql/LqlWebsite-Eleventy/src/docs/installation.md b/Lql/LqlWebsite-Eleventy/src/docs/installation.md deleted file mode 100644 index 32803222..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/installation.md +++ /dev/null @@ -1,51 +0,0 @@ ---- -layout: layouts/docs.njk -title: Installation -description: Install LQL packages and tools for your project.
---- - -## NuGet Packages - -LQL provides dialect-specific packages for each target database: - -```xml - - - - - - - - -``` - -## CLI Tool - -Install the LQL CLI for command-line transpilation: - -```bash -dotnet tool install -g LqlCli.SQLite -``` - -## F# Type Provider - -For compile-time validated LQL queries in F#: - -```xml - -``` - -## VS Code Extension - -Search for **LQL** in VS Code Extensions marketplace for: - -- Syntax highlighting -- IntelliSense completions -- Real-time diagnostics -- Hover documentation -- Document formatting - -## Requirements - -- .NET 9.0 or later -- One of the supported databases: SQLite, PostgreSQL, or SQL Server diff --git a/Lql/LqlWebsite-Eleventy/src/docs/joins.md b/Lql/LqlWebsite-Eleventy/src/docs/joins.md deleted file mode 100644 index 640cd84f..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/joins.md +++ /dev/null @@ -1,63 +0,0 @@ ---- -layout: layouts/docs.njk -title: Joins -description: Combining data from multiple tables with LQL join operations. ---- - -LQL supports multiple join types for combining data from different tables. 
- -## Inner Join - -Returns only rows that have matching values in both tables: - -``` -users -|> join(orders, on = users.id = orders.user_id) -|> select(users.name, orders.total, orders.status) -``` - -## Left Join - -Returns all rows from the left table, with matching rows from the right table (or NULL): - -``` -users -|> left_join(orders, on = users.id = orders.user_id) -|> select(users.name, orders.total) -``` - -## Multiple Joins - -Chain joins to combine more than two tables: - -``` -users -|> join(orders, on = users.id = orders.user_id) -|> join(products, on = orders.product_id = products.id) -|> select(users.name, products.name, orders.quantity) -``` - -## Join with Filter - -Combine joins with filtering: - -``` -users -|> join(orders, on = users.id = orders.user_id) -|> filter(fn(row) => row.orders.status = 'completed') -|> select(users.name, orders.total) -``` - -## Join with Aggregation - -``` -users -|> join(orders, on = users.id = orders.user_id) -|> group_by(users.id, users.name) -|> select( - users.name, - count(*) as total_orders, - sum(orders.total) as revenue -) -|> order_by(revenue desc) -``` diff --git a/Lql/LqlWebsite-Eleventy/src/docs/lambdas.md b/Lql/LqlWebsite-Eleventy/src/docs/lambdas.md deleted file mode 100644 index 630be60f..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/lambdas.md +++ /dev/null @@ -1,82 +0,0 @@ ---- -layout: layouts/docs.njk -title: Lambda Expressions -description: Using lambda expressions for filtering and data transformation in LQL. ---- - -Lambda expressions are the functional core of LQL. They provide type-safe, composable predicates for filtering and transforming data. - -## Syntax - -``` -fn(parameter) => expression -``` - -The parameter represents a row of data. 
Access columns using `parameter.table.column`: - -``` -fn(row) => row.users.age > 18 -``` - -## In filter - -The most common use is filtering rows: - -``` -users |> filter(fn(row) => row.users.active = true) -``` - -## Compound Expressions - -Combine conditions with `and` and `or`: - -``` -employees |> filter(fn(row) => - row.employees.salary > 50000 and - row.employees.salary < 100000 -) -``` - -``` -users |> filter(fn(row) => - row.users.role = 'admin' or - row.users.role = 'superadmin' -) -``` - -## In having - -Lambdas also work with `having` to filter groups: - -``` -orders -|> group_by(orders.user_id) -|> having(fn(group) => count(*) > 5) -|> select(orders.user_id, count(*) as order_count) -``` - -## Comparison Operators - -| Operator | Meaning | -|----------|---------| -| `=` | Equal | -| `!=` | Not equal | -| `>` | Greater than | -| `<` | Less than | -| `>=` | Greater than or equal | -| `<=` | Less than or equal | - -## String Comparisons - -``` -users |> filter(fn(row) => row.users.name = 'Alice') -users |> filter(fn(row) => row.users.status != 'inactive') -``` - -## Arithmetic in Lambdas - -``` -products |> filter(fn(row) => - row.products.price * row.products.quantity > 1000 -) -``` diff --git a/Lql/LqlWebsite-Eleventy/src/docs/language-server.md b/Lql/LqlWebsite-Eleventy/src/docs/language-server.md deleted file mode 100644 index a8cf5d92..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/language-server.md +++ /dev/null @@ -1,226 +0,0 @@ ---- -layout: layouts/docs.njk -title: Language Server -description: The LQL Language Server (lql-lsp) - a native Rust LSP for LQL development. ---- - -The LQL Language Server (`lql-lsp`) is a native Rust implementation that provides IDE features for `.lql` files. It communicates via the Language Server Protocol (LSP) over JSON-RPC on stdin/stdout. 
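Per the LSP base protocol, each JSON-RPC message on stdin/stdout is framed with a `Content-Length` header. A minimal sketch of that framing (the header format is from the LSP specification; the payload is illustrative):

```python
import json

def frame_lsp_message(payload):
    """Frame a JSON-RPC payload with the LSP base-protocol header."""
    body = json.dumps(payload).encode("utf-8")
    # Content-Length counts the bytes of the body, not its characters.
    header = f"Content-Length: {len(body)}\r\n\r\n".encode("ascii")
    return header + body

msg = frame_lsp_message({"jsonrpc": "2.0", "id": 1, "method": "initialize"})
```

A reader on the other end splits on the blank `\r\n\r\n` line, reads exactly `Content-Length` bytes, and parses them as JSON — which is what tower-lsp handles for the server.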
- -Key capabilities: schema-aware completions via [Database Configuration](/docs/database-config/), intelligent suggestions via [AI-Powered Completions](/docs/ai-integration/), real-time diagnostics, hover documentation, and formatting. - -## Architecture - -```mermaid -graph TD - A[VS Code Extension] <-->|JSON-RPC / stdio| B[lql-lsp - Rust] - B --> C[lql-parser - ANTLR] - B --> D[lql-analyzer] - B --> E[PostgreSQL Schema Cache] - B --> F[AI Model - Ollama / Cloud] -``` - -Built with: -- **tower-lsp** - LSP protocol framework (JSON-RPC, message framing) -- **antlr-rust** - ANTLR4 grammar-based parser with error recovery -- **tokio** - Async runtime for concurrent schema fetching and AI calls -- **tokio-postgres** - PostgreSQL client for schema introspection -- **reqwest** - HTTP client for AI provider communication - -## LSP Capabilities - -The server registers these capabilities on initialization: - -| Capability | Description | -|------------|-------------| -| `textDocumentSync: Full` | Full document synced on every change | -| `completionProvider` | Triggered by: `.` `\|` `>` `(` ` ` | -| `hoverProvider` | Hover info for keywords, tables, columns | -| `documentSymbolProvider` | `let` bindings shown in outline | -| `documentFormattingProvider` | Full-document formatting | - -## Completion Engine - -Completions are organized into priority layers. 
Lower numbers appear first: - -| Priority | Category | Count | Requires DB | -|----------|----------|-------|-------------| -| 0 | Column completions (`table.col`) | Per-table | Yes | -| 1 | Pipeline operations | 14 | No | -| 2 | Functions (aggregate, string, math, date) | 40+ | No | -| 3 | Keywords | 30+ | No | -| 4 | Table names | Per-schema | Yes | -| 5 | Variable bindings (`let` names) | Per-document | No | -| 6 | AI completions | Variable | Optional | - -### Context Detection - -The completion engine detects context to filter suggestions: - -- **After `|>`** - Shows pipeline operations -- **After `table.`** - Shows columns for that table -- **In argument list** - Shows functions, columns, keywords -- **In lambda body** - Shows row field access patterns -- **Word prefix** - Filters all completions by typed prefix - -### Trigger Characters - -Completions auto-trigger on: `.` (column access), `|` and `>` (pipe), `(` (function args), ` ` (space after pipe). - -### AI Completion Pipeline - -When an [AI provider is configured](/docs/ai-integration/), the LSP merges AI-generated completions with schema and keyword results on every request: - -1. Schema + keyword completions are computed synchronously (instant) -2. AI completions are requested asynchronously via HTTP (e.g., Ollama `/api/generate`) -3. A configurable timeout (default 2000ms) wraps the AI call -4. If AI responds in time, results are appended at priority 6 -5. If AI times out, only schema + keyword results are returned - no latency penalty - -The AI model receives full context: document text, cursor position, line prefix, word prefix, and a compact schema description (e.g., `users(id uuid PK NOT NULL, name text, email text)`). This enables schema-aware suggestions even from small local models. - -See [AI-Powered Completions](/docs/ai-integration/) for setup instructions and provider configuration. 
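The context-detection rules above amount to a small classifier over the text before the cursor. A simplified sketch (the category names are illustrative, not the analyzer's real types):

```python
import re

def detect_context(line_prefix):
    """Classify the cursor context from the text before the cursor."""
    text = line_prefix.rstrip()
    if text.endswith("|>"):
        return "pipeline_operation"   # suggest select, filter, join, ...
    if re.search(r"\b\w+\.$", line_prefix):
        return "column_access"        # suggest columns of that table
    if text.count("(") > text.count(")"):
        return "argument_list"        # suggest functions, columns, keywords
    return "general"
```

The real engine layers more signals on top (lambda bodies, word-prefix filtering), but checking the tail of the line prefix in priority order is the core idea.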
- -## Diagnostics - -Four categories of diagnostics, published on every document change: - -### Parse Errors (ERROR) -From the ANTLR parser. Syntax errors with exact line/column positions: -``` --- Error: mismatched input 'selectt' expecting... -users |> selectt(users.id) - ^^^^^^^^ -``` - -### Bracket Validation (ERROR) -Document-level parenthesis matching: -``` --- Error: unclosed parenthesis -users |> select(users.id - ^ -``` - -### Pipe Spacing (WARNING) -The `|>` operator should be surrounded by spaces: -``` --- Warning: pipe operator should be surrounded by spaces -users|>select(*) - ^^ -``` - -### Unknown Functions (INFO) -Functions not in the 82-entry known function list: -``` --- Info: unknown function 'foobar' -users |> foobar(users.id) - ^^^^^^ -``` - -## Hover Information - -The hover database contains 50+ entries covering all LQL constructs. - -### Keyword Hover -Hovering over `select`, `filter`, `join`, etc. shows descriptions with usage patterns. - -### Schema-Aware Hover -With a [database connection](/docs/database-config/): - -- **Table name hover** - Shows all columns with types, PK, and nullable indicators -- **Qualified column hover** (`users.email`) - Shows column type, nullability, primary key status - -## Document Symbols - -Extracts `let` bindings as `SymbolKind::Variable` for the VS Code outline and breadcrumb views: - -``` -let active_users = users |> filter(...) -> Symbol: active_users -let orders_2024 = orders |> filter(...) 
-> Symbol: orders_2024 -``` - -## Formatting - -The formatter applies consistent indentation rules: - -- Pipeline continuations (`|>`) get 4-space indent -- Nested parentheses increase indent level -- Closing `)` decreases indent level -- Lines are trimmed of trailing whitespace -- Comments and blank lines are preserved - -Before: -``` -users -|> filter(fn(row) => row.users.active) -|> select( -users.id, -users.name -) -``` - -After: -``` -users - |> filter(fn(row) => row.users.active) - |> select( - users.id, - users.name - ) -``` - -## Initialization Options - -The server accepts configuration via `initializationOptions` during the LSP `initialize` handshake: - -```json -{ - "connectionString": "host=localhost dbname=myapp user=postgres", - "aiProvider": { - "provider": "ollama", - "endpoint": "http://localhost:11434", - "model": "qwen2.5-coder:1.5b", - "apiKey": "", - "timeoutMs": 2000, - "enabled": true - } -} -``` - -See [Database Configuration](/docs/database-config/) and [VS Code Extension - AI Configuration](/docs/vscode/#ai-configuration) for details. - -## Crate Structure - -| Crate | Purpose | -|-------|---------| -| `lql-parser` | ANTLR grammar, lexer, parser, parse tree, error recovery | -| `lql-analyzer` | Completions, diagnostics, hover database, symbols, schema cache | -| `lql-lsp` | LSP server binary, tower-lsp integration, AI providers, DB client | - -## Building from Source - -```bash -cd Lql/lql-lsp-rust -cargo build --release -``` - -The binary is at `target/release/lql-lsp`. 
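The formatter's indentation rules described earlier on this page (4-space pipeline continuations, parentheses adjusting the indent level, comments and blank lines preserved) can be approximated in a few lines — a sketch only, not the Rust formatter:

```python
def format_lql(src):
    """Re-indent an LQL pipeline: |> lines get 4 spaces, parens nest by 4."""
    out, depth, in_pipeline = [], 0, False
    for raw in src.splitlines():
        line = raw.strip()                 # also trims trailing whitespace
        if not line or line.startswith("--"):
            out.append(line)               # keep comments and blank lines
            continue
        if line.startswith("|>"):
            in_pipeline = True
        # Closing parens at the start of a line de-indent before printing.
        closers = len(line) - len(line.lstrip(")"))
        depth = max(depth - closers, 0)
        indent = 4 * depth + (4 if in_pipeline else 0)
        out.append(" " * indent + line)
        depth += line.count("(") - (line.count(")") - closers)
    return "\n".join(out)
```

Running this on the "before" example from the Formatting section reproduces the "after" layout shown there.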
- -### Running Tests - -```bash -cargo test --workspace -``` - -### With Coverage - -```bash -./test-coverage.sh -``` - -Individual crate coverage: - -```bash -cargo tarpaulin --packages lql-parser --engine llvm --exclude-files "*/generated/*" -cargo tarpaulin --packages lql-analyzer --engine llvm -cargo tarpaulin --packages lql-lsp --engine llvm -``` diff --git a/Lql/LqlWebsite-Eleventy/src/docs/let-bindings.md b/Lql/LqlWebsite-Eleventy/src/docs/let-bindings.md deleted file mode 100644 index 5cbe3288..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/let-bindings.md +++ /dev/null @@ -1,59 +0,0 @@ ---- -layout: layouts/docs.njk -title: Let Bindings -description: Storing intermediate results with let bindings in LQL. ---- - -Let bindings allow you to name intermediate query results and reuse them, making complex queries more readable and composable. - -## Basic Syntax - -``` -let name = expression -``` - -## Simple Example - -``` -let active_users = users |> filter(fn(row) => row.users.status = 'active') - -active_users |> select(active_users.name, active_users.email) -``` - -## Building Complex Queries - -Let bindings shine when building multi-step analytics: - -``` --- Step 1: Join and filter -let joined = - users - |> join(orders, on = users.id = orders.user_id) - |> filter(fn(row) => row.orders.status = 'completed') - --- Step 2: Aggregate -joined -|> group_by(users.id) -|> select( - users.name, - count(*) as total_orders, - sum(orders.total) as revenue, - avg(orders.total) as avg_order_value -) -|> filter(fn(row) => row.revenue > 1000) -|> order_by(revenue desc) -|> limit(10) -``` - -## Reusability - -Define a filtered dataset once and use it in multiple contexts: - -``` -let engineering = employees - |> filter(fn(row) => row.employees.department = 'Engineering') - --- Use for different analyses -engineering |> select(engineering.name, engineering.salary) -engineering |> group_by(engineering.level) |> select(engineering.level, avg(engineering.salary) as 
avg_salary) -``` diff --git a/Lql/LqlWebsite-Eleventy/src/docs/pipelines.md b/Lql/LqlWebsite-Eleventy/src/docs/pipelines.md deleted file mode 100644 index e0f8a102..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/pipelines.md +++ /dev/null @@ -1,142 +0,0 @@ ---- -layout: layouts/docs.njk -title: Pipeline Operators -description: Deep dive into LQL pipeline operations - select, filter, join, group_by, and more. ---- - -Pipeline operators are the core of LQL. Each operation takes the result of the previous step and transforms it. - -## select - -Choose which columns to include in the output: - -``` -users |> select(users.id, users.name, users.email) -``` - -Select all columns: - -``` -users |> select(*) -``` - -With computed columns: - -``` -products |> select( - products.name, - products.price * products.quantity as total_value, - round(products.price / 2, 2) as half_price -) -``` - -## filter - -Filter rows using lambda expressions: - -``` -users |> filter(fn(row) => row.users.age > 18) -``` - -Combine conditions with `and` / `or`: - -``` -employees |> filter(fn(row) => - row.employees.salary > 50000 and - row.employees.department = 'Engineering' -) -``` - -## join - -Inner join two tables: - -``` -users |> join(orders, on = users.id = orders.user_id) -``` - -## left_join - -Left join preserving all rows from the left table: - -``` -users |> left_join(orders, on = users.id = orders.user_id) -``` - -## group_by - -Group rows by one or more columns: - -``` -orders |> group_by(orders.status) -``` - -Multiple grouping columns: - -``` -orders |> group_by(orders.user_id, orders.status) -``` - -## having - -Filter groups after aggregation: - -``` -orders -|> group_by(orders.user_id) -|> having(fn(group) => count(*) > 5) -|> select(orders.user_id, count(*) as order_count) -``` - -## order_by - -Sort results ascending or descending: - -``` -users |> order_by(users.name asc) -users |> order_by(users.created_at desc) -``` - -## limit / offset - -Pagination: - -``` -users |> 
limit(10) -users |> limit(10) |> offset(20) -``` - -## distinct - -Remove duplicate rows: - -``` -orders |> select(orders.status) |> distinct() -``` - -## union - -Combine results from two queries: - -``` -active_users |> union(inactive_users) -``` - -## Chaining Operations - -The real power comes from chaining multiple operations: - -``` -users -|> join(orders, on = users.id = orders.user_id) -|> filter(fn(row) => row.orders.status = 'completed') -|> group_by(users.id, users.name) -|> select( - users.name, - count(*) as total_orders, - sum(orders.total) as revenue -) -|> having(fn(group) => count(*) > 2) -|> order_by(revenue desc) -|> limit(10) -``` diff --git a/Lql/LqlWebsite-Eleventy/src/docs/quick-start.md b/Lql/LqlWebsite-Eleventy/src/docs/quick-start.md deleted file mode 100644 index d2a2d283..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/quick-start.md +++ /dev/null @@ -1,57 +0,0 @@ ---- -layout: layouts/docs.njk -title: Quick Start -description: Get up and running with LQL in minutes. 
---- - -## Install - -### NuGet Packages - -```xml - - - -``` - -### CLI Tool - -```bash -dotnet tool install -g LqlCli.SQLite -``` - -## Your First Query - -Write your first LQL query: - -``` -users |> select(users.id, users.name, users.email) -``` - -This transpiles to: - -```sql -SELECT users.id, users.name, users.email FROM users -``` - -## Programmatic Usage - -```csharp -using Lql; -using Lql.SQLite; - -var lql = "Users |> filter(fn(row) => row.Age > 21) |> select(Name, Email)"; -var sql = LqlCodeParser.Parse(lql).ToSql(new SQLiteContext()); -``` - -## CLI Usage - -```bash -lql --input query.lql --output query.sql -``` - -## Next Steps - -- [Syntax Overview](/docs/syntax/) - Learn the full language syntax -- [Pipeline Operators](/docs/pipelines/) - Deep dive into pipeline operations -- [Playground](/playground/) - Try LQL interactively in your browser diff --git a/Lql/LqlWebsite-Eleventy/src/docs/sql-dialects.md b/Lql/LqlWebsite-Eleventy/src/docs/sql-dialects.md deleted file mode 100644 index 5d4665ea..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/sql-dialects.md +++ /dev/null @@ -1,82 +0,0 @@ ---- -layout: layouts/docs.njk -title: SQL Dialects -description: How LQL transpiles to PostgreSQL, SQL Server, and SQLite. ---- - -LQL is database platform independent. The same query transpiles to correct SQL for each target database. 
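One concrete dialect difference is pagination: SQL Server renders `TOP` where PostgreSQL and SQLite render `LIMIT`. A toy illustration of per-dialect clause rendering (a hypothetical helper, not the Lql library's API):

```python
def render_select(columns, table, limit=None, dialect="postgres"):
    """Render a simple SELECT, handling the LIMIT/TOP dialect difference."""
    cols = ", ".join(columns)
    if limit is not None and dialect == "sqlserver":
        # SQL Server puts the row cap up front as TOP n.
        return f"SELECT TOP {limit} {cols} FROM {table}"
    sql = f"SELECT {cols} FROM {table}"
    if limit is not None:
        sql += f" LIMIT {limit}"           # postgres and sqlite
    return sql
```

A transpiler generalizes this idea: the pipeline is parsed once into a dialect-neutral statement, and each target supplies its own clause renderers.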
- -## Supported Dialects - -| Dialect | Package | Status | -|---------|---------|--------| -| PostgreSQL | `Lql.Postgres` | Full support | -| SQL Server | `Lql.SqlServer` | Full support | -| SQLite | `Lql.SQLite` | Full support | - -## Example - -This LQL query: - -``` -users -|> filter(fn(row) => row.users.age > 18) -|> select(users.name, users.email) -|> order_by(users.name asc) -|> limit(10) -``` - -### PostgreSQL Output - -```sql -SELECT users.name, users.email -FROM users -WHERE users.age > 18 -ORDER BY users.name ASC -LIMIT 10 -``` - -### SQL Server Output - -```sql -SELECT TOP 10 users.name, users.email -FROM users -WHERE users.age > 18 -ORDER BY users.name ASC -``` - -### SQLite Output - -```sql -SELECT users.name, users.email -FROM users -WHERE users.age > 18 -ORDER BY users.name ASC -LIMIT 10 -``` - -## Dialect Differences Handled by LQL - -LQL abstracts away common dialect differences: - -- **LIMIT/TOP** - PostgreSQL and SQLite use `LIMIT`, SQL Server uses `TOP` -- **String concatenation** - `||` vs `+` -- **Boolean literals** - `TRUE`/`FALSE` vs `1`/`0` -- **ILIKE** - PostgreSQL-specific case-insensitive LIKE - -## Programmatic Dialect Selection - -```csharp -using Lql; -using Lql.Postgres; -using Lql.SqlServer; -using Lql.SQLite; - -var lql = "Users |> select(Users.Name)"; -var statement = LqlStatementConverter.ToStatement(lql); - -// Generate for each dialect -var postgres = statement.ToPostgreSql(); -var sqlServer = statement.ToSqlServer(); -var sqlite = statement.ToSQLite(); -``` diff --git a/Lql/LqlWebsite-Eleventy/src/docs/syntax.md b/Lql/LqlWebsite-Eleventy/src/docs/syntax.md deleted file mode 100644 index 8ceb978a..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/syntax.md +++ /dev/null @@ -1,96 +0,0 @@ ---- -layout: layouts/docs.njk -title: Syntax Overview -description: Complete overview of LQL syntax and language features. 
---- - -LQL uses a functional pipeline syntax where data flows through a series of transformations using the pipeline operator `|>`. - -## Basic Structure - -Every LQL query starts with a table reference and pipes data through operations: - -``` -table_name |> operation1(...) |> operation2(...) -``` - -## Table References - -Simply name the table to start a query: - -``` -users -employees -orders -``` - -## Pipeline Operator - -The `|>` operator passes the result of the left side to the right side: - -``` -users |> select(users.id, users.name) -``` - -## Column References - -Columns are referenced with the `table.column` syntax: - -``` -users.id -users.name -orders.total -``` - -## Lambda Expressions - -Lambdas use the `fn(param) => expression` syntax: - -``` -filter(fn(row) => row.users.age > 18) -filter(fn(row) => row.users.status = 'active' and row.users.age > 21) -``` - -## Let Bindings - -Store intermediate results with `let`: - -``` -let active_users = users |> filter(fn(row) => row.users.status = 'active') - -active_users |> select(active_users.name, active_users.email) -``` - -## Aliases - -Use `as` to rename columns in output: - -``` -users |> select( - users.name, - users.salary * 12 as annual_salary -) -``` - -## Comments - -Single-line comments start with `--`: - -``` --- Get all active users -users |> filter(fn(row) => row.users.active) -``` - -## Operators - -### Comparison -`=`, `>`, `<`, `>=`, `<=`, `!=` - -### Logical -`and`, `or` - -### Arithmetic -`+`, `-`, `*`, `/` - -### Sorting -`asc`, `desc` diff --git a/Lql/LqlWebsite-Eleventy/src/docs/vscode.md b/Lql/LqlWebsite-Eleventy/src/docs/vscode.md deleted file mode 100644 index fe043c9d..00000000 --- a/Lql/LqlWebsite-Eleventy/src/docs/vscode.md +++ /dev/null @@ -1,233 +0,0 @@ ---- -layout: layouts/docs.njk -title: VS Code Extension -description: LQL VS Code extension with syntax highlighting, IntelliSense, diagnostics, and AI completions. 
---- - -The LQL VS Code extension provides a rich editing experience for `.lql` files, powered by a native Rust Language Server. - -## Installation - -Search for **LQL** in the VS Code Extensions marketplace and click Install. The extension automatically downloads the correct LSP binary for your platform on first activation. - -Supported platforms: -- Linux x64 -- macOS x64 (Intel) -- macOS ARM64 (Apple Silicon) -- Windows x64 - -## Features - -### Syntax Highlighting - -Full TextMate grammar with semantic colorization for all LQL constructs: - -| Token | Color | Example | -|-------|-------|---------| -| Keywords | Orange-red | `let`, `fn`, `as`, `asc`, `desc` | -| Pipeline operator | Forest green | `\|>` | -| Lambda operator | Violet | `=>` | -| Query functions | Forest green | `select`, `filter`, `join` | -| Aggregate functions | Violet | `count`, `sum`, `avg` | -| String literals | Lime green | `'completed'` | -| Comments | Dark slate | `-- comment` | -| Table/column names | White/green | `users.id` | - -The extension includes a dedicated **LQL Dark** color theme optimized for LQL syntax. - -### IntelliSense Completions - -Context-aware completions triggered automatically as you type. Completions are organized by priority: - -**1. Column completions** - Type `table.` to see all columns from that table (requires [database connection](/docs/database-config/)): -- Shows column type, primary key indicator, and nullability -- Example: `users.` suggests `id (uuid PK NOT NULL)`, `name (text)`, `email (text)` - -**2. Pipeline operations** - Suggested after `|>`: -- `select`, `filter`, `join`, `left_join`, `right_join`, `cross_join` -- `group_by`, `order_by`, `having`, `limit`, `offset` -- `union`, `union_all`, `insert`, `distinct` - -**3. 
Functions** - Suggested in expression contexts: -- **Aggregate**: `count`, `sum`, `avg`, `min`, `max`, `first`, `last`, `row_number`, `rank` -- **String**: `concat`, `substring`, `length`, `trim`, `upper`, `lower`, `replace` -- **Math**: `round`, `floor`, `ceil`, `abs`, `sqrt`, `power`, `mod` -- **Date/Time**: `now`, `today`, `year`, `month`, `day`, `extract`, `date_trunc` -- **Conditional**: `coalesce`, `nullif`, `isnull`, `isnotnull` - -**4. Keywords** - `let`, `fn`, `as`, `and`, `or`, `not`, `distinct`, `null`, `case`, `when`, etc. - -**5. Table names** - From your database schema, showing column count and first 5 columns - -**6. Variable bindings** - `let` bindings from the current document - -**7. AI completions** - Optional intelligent suggestions from an AI model (see [AI Configuration](#ai-configuration)) - -### Snippets - -23 built-in snippets with tab stops for fast query authoring: - -| Prefix | Description | -|--------|-------------| -| `select` | Basic select all | -| `selectc` | Select specific columns | -| `selectf` | Select with filter | -| `filterand` | Filter with AND | -| `filteror` | Filter with OR | -| `join` | Inner join | -| `leftjoin` | Left join | -| `groupby` | Group by with count | -| `groupbyhaving` | Group by with having | -| `orderby` | Order by ascending | -| `limit` | Limit results | -| `limitoffset` | Pagination (limit + offset) | -| `distinct` | Select distinct | -| `union` | Union queries | -| `let` | Let binding | -| `case` | Case expression | -| `fn` | Lambda function | -| `pipeline` | Full pipeline example | - -### Real-Time Diagnostics - -Errors and warnings appear as you type with squiggly underlines: - -**Errors** (red): -- ANTLR parse errors with line/column position -- Unmatched closing parenthesis -- Unclosed parenthesis at end of file - -**Warnings** (yellow): -- Pipe operator `|>` not surrounded by spaces - -**Information** (blue): -- Unknown function names (not in the built-in function list) - -### Hover Documentation 
- -Hover over any LQL keyword, function, or operator to see: -- Description and usage -- Syntax signature - -With a [database connection](/docs/database-config/), hover also shows: -- **Table hover**: All columns with types, PK/nullable indicators -- **Column hover** (e.g., `users.email`): Column type, nullability, primary key status - -### Document Symbols - -The outline view shows all `let` bindings in your file, enabling quick navigation with `Ctrl+Shift+O`. - -### Document Formatting - -Format your entire document with `Shift+Alt+F` or right-click and select **Format LQL Document**: -- Consistent 4-space indentation for pipeline continuations -- Proper indentation inside parentheses -- Trimmed whitespace -- Preserved comments and blank lines - -### Commands - -| Command | Description | -|---------|-------------| -| **Format LQL Document** | Format the current `.lql` file | -| **Validate LQL Document** | Trigger validation diagnostics | -| **Show Compiled SQL** | Show the transpiled SQL output | - -Commands are available in the command palette (`Ctrl+Shift+P`) and the editor context menu when editing `.lql` files. - -## Extension Settings - -| Setting | Type | Default | Description | -|---------|------|---------|-------------| -| `lql.languageServer.enabled` | boolean | `true` | Enable/disable the language server | -| `lql.languageServer.trace` | enum | `off` | LSP trace level: `off`, `messages`, `verbose` | -| `lql.validation.enabled` | boolean | `true` | Enable/disable real-time validation | -| `lql.formatting.enabled` | boolean | `true` | Enable/disable document formatting | - -## AI Configuration - -The extension supports optional AI-powered completions via local or remote models. AI completions are merged with schema and keyword completions - they supplement, never replace. - -### Setup with Ollama (recommended) - -1. Install [Ollama](https://ollama.com) -2. Pull a code model: - ```bash - ollama pull qwen2.5-coder:1.5b - ``` -3. 
Add to your VS Code `settings.json`: - ```json - { - "lql.aiProvider": { - "provider": "ollama", - "endpoint": "http://localhost:11434", - "model": "qwen2.5-coder:1.5b", - "enabled": true - } - } - ``` - -### Recommended Models - -| Model | Size | Speed | Quality | -|-------|------|-------|---------| -| `qwen2.5-coder:1.5b` | 1.5B | Fast | Good | -| `deepseek-coder:1.3b` | 1.3B | Fast | Good | -| `codellama:7b` | 7B | Slower | Better | - -### AI Provider Settings - -```json -{ - "lql.aiProvider": { - "provider": "ollama", - "endpoint": "http://localhost:11434", - "model": "qwen2.5-coder:1.5b", - "apiKey": "", - "timeoutMs": 2000, - "enabled": true - } -} -``` - -| Field | Description | -|-------|-------------| -| `provider` | Provider type: `ollama`, `openai`, `anthropic`, `custom` | -| `endpoint` | API endpoint URL | -| `model` | Model identifier (optional, provider-specific) | -| `apiKey` | API key (optional, for cloud providers) | -| `timeoutMs` | Timeout in milliseconds (default: 2000) | -| `enabled` | Enable/disable AI completions | - -### What the AI Sees - -The AI model receives full context for accurate suggestions: -- The complete document text -- Cursor position (line and column) -- Current line prefix and word prefix -- File URI -- Database schema (table names, column names, types) - -Responses that exceed the timeout are silently dropped - you always get fast keyword and schema completions regardless of AI latency. - -## Language Features - -### Comment Support -- Line comments: `-- comment` -- Block comments: `/* comment */` - -### Bracket Matching -Auto-closing and matching for `()`, `[]`, `{}`, `''`, `""` - -### Folding -Region-based folding with `-- #region` and `-- #endregion` markers. - -## LSP Binary - -The extension bundles a native Rust language server (`lql-lsp`). On first activation, it searches for the binary in this order: - -1. Bundled `bin/lql-lsp` in the extension directory -2. 
Local development build (`target/release/lql-lsp` or `target/debug/lql-lsp`) -3. Previously cached binary in VS Code global storage -4. Downloads from [GitHub Releases](https://github.com/Nimblesite/DataProvider/releases) matching the extension version -5. Falls back to `lql-lsp` on your system PATH diff --git a/Lql/LqlWebsite-Eleventy/src/index.njk b/Lql/LqlWebsite-Eleventy/src/index.njk index 773ee215..d494c62f 100644 --- a/Lql/LqlWebsite-Eleventy/src/index.njk +++ b/Lql/LqlWebsite-Eleventy/src/index.njk @@ -1,257 +1,350 @@ --- layout: layouts/base.njk -title: "Lambda Query Language - Functional Data Querying" -description: "Functional programming meets data querying. Write elegant, composable queries with pipeline operators and lambda expressions." +title: "Lambda Query Language (LQL) - Functional Data Querying" +description: "LQL documentation, syntax, migrations, tooling, and transpilation examples for the Lambda Query Language." ---
-
-

Lambda Query Language

-

- Functional programming meets data querying. Write elegant, composable queries with the power of lambda expressions and pipeline operators. -

- +
+
+

Lambda Query Language

+

A functional pipeline language for database queries, migration logic, RLS predicates, and portable SQL generation.

-
-
-
-
-
-
- complex_analytics.lql -
-
-- Join users + orders, filter only completed orders -let joined = + + +
+
+
+
+
+ tenant_orders.lql +
+
-- Join, filter, aggregate, and emit dialect-specific SQL +let completed_orders = users |> join(orders, on = users.id = orders.user_id) |> filter(fn(row) => row.orders.status = 'completed') --- Aggregate and analyze -joined -|> group_by(users.id) +completed_orders +|> group_by(users.id, users.name) |> select( users.name, - count(*) as total_orders, + count(*) as order_count, sum(orders.total) as revenue -)
-
+) +|> order_by(revenue desc)
+
+
-
-
-
-
-

Why LQL?

-

Functional programming principles applied to data querying for cleaner, more maintainable code.

-
- -
-
-
λ
-

Functional First

-

Built on pure functional programming principles. Immutable data transformations, lambda expressions, and composable operations make your queries predictable and testable.

-
- -
-
|>
-

Pipeline Operators

-

Chain operations naturally with pipeline operators. Data flows from left to right, making complex transformations easy to read and understand.

-
- -
-
TS
-

Type Safe

-

Strong typing ensures your queries are correct at compile time. No more runtime surprises from typos or schema mismatches.

-
- -
-
fn
-

Composable

-

Build complex queries from simple, reusable components. Define once, use everywhere with let bindings and function composition.

-
- -
-
SQL
-

SQL Compatible

-

Compiles to optimized SQL for your target database. Get the performance of SQL with the elegance of functional programming.

-
+
+
+
+

What LQL Is For

+

Use one expression language for query pipelines, database-portable SQL, and migration-time predicates.

+
-
-
DX
-

Developer Focused

-

Designed by developers, for developers. Excellent tooling support with VS Code extension, LSP, and clear error messages.

-
+
+
+
|>
+

Pipeline Queries

+

Start from a table or binding and pass rows through select, filter, join, grouping, ordering, pagination, and set operations.

+
+
+
SQL
+

Dialect Output

+

The same LQL statement emits PostgreSQL, SQL Server, or SQLite SQL through the dialect packages.

+
+
+
RLS
+

Migrations

+

Migration YAML can use LQL for RLS policies and PostgreSQL function bodies instead of embedding raw platform SQL.

+
+
+
LSP
+

Editor Tooling

+

The Rust language server powers diagnostics, formatting, schema-aware completions, hovers, and optional AI suggestions.

+
+
+
F#
+

Type Provider

+

F# projects can validate LQL at compile time and expose generated SQL through the type provider.

+
+
+
DX
+

Transpiler Demo

+

The Blazor page is kept as a focused playground for running client-side transpilation only.

+
+
-
-
-
-

See LQL in Action

-

Real examples showing the power and elegance of functional data querying.

-
- -
-
-
-

Simple Selection

-

Clean, readable syntax for basic data selection. No verbose SELECT statements or complex syntax.

-
    -
  • Pipeline operator for natural flow
  • -
  • Clear column specification
  • -
  • Type-safe field access
  • -
+
+
+

Examples

+

The core language is small: compose table pipelines, lambda predicates, and expression columns.

-
-
users |> select( + +
+
+
+

Simple Selection

+

Select explicit columns from a source table. Fully qualify columns when that keeps intent clear.

+
    +
  • Table-first query shape
  • +
  • Column projection with `select`
  • +
  • Portable output SQL
  • +
+
+
+
users |> select( users.id, users.name, users.email )
-
-
+
+
-
-
-

Advanced Filtering

-

Lambda expressions provide powerful, type-safe filtering with full access to row data.

-
    -
  • Lambda function syntax
  • -
  • Logical operators (and, or)
  • -
  • Range filtering
  • -
-
-
-
employees -|> select( - employees.id, - employees.name, - employees.salary -) +
+
+

Lambda Filtering

+

Use `fn(row) => ...` predicates for filtering. The row parameter exposes fields through table-qualified paths.

+
    +
  • Comparison operators: `=`, `!=`, `>`, `<`, `>=`, `<=`
  • +
  • Logical operators: `and`, `or`, `not`
  • +
  • Arithmetic and function calls inside predicates
  • +
+
+
+
employees |> filter(fn(row) => - row.employees.salary > 50000 and - row.employees.salary < 100000 -)
-
-
- -
-
-

Arithmetic & Functions

-

Rich expression support with mathematical operations and built-in functions for complex calculations.

-
    -
  • Mathematical expressions
  • -
  • Column aliases with 'as'
  • -
  • Function composition
  • -
-
-
-
products -|> select( - products.name, - products.price * products.quantity as total_value, - products.price + 10 as price_plus_ten, - round(products.price / 2, 2) as half_price + row.employees.salary > 50000 and + row.employees.department = 'Engineering' ) -|> filter(fn(row) => - row.products.price > 0 -)
-
-
+|> select(employees.id, employees.name)
+
+
-
-
-

Aggregation & Grouping

-

Powerful aggregation functions with group by operations and having clauses for complex analytics.

-
    -
  • Group by multiple columns
  • -
  • Aggregate functions (count, sum, avg)
  • -
  • Having clause with lambda filters
  • -
+
+
+

Joins And Aggregation

+

Join tables, group rows, compute aggregate columns, filter groups, and order the result.

+
    +
  • `join`, `left_join`, `right_join`, `full_join`, `cross_join`
  • +
  • `count`, `sum`, `avg`, `min`, `max`
  • +
  • `having` filters grouped rows
  • +
+
+
+
users +|> join(orders, on = users.id = orders.user_id) +|> group_by(users.id, users.name) +|> having(fn(group) => count(*) > 2) +|> select(users.name, sum(orders.total) as revenue)
+
+
-
-
orders -|> group_by(orders.user_id, orders.status) -|> select( - orders.user_id, - orders.status, - count(*) as order_count, - sum(orders.total) as total_amount, - avg(orders.total) as avg_amount -) -|> having(fn(group) => count(*) > 2) -|> order_by(total_amount desc)
-
-
-
-
-
-
-

F# Type Provider

-

Use LQL with full type safety in F# through our native type provider.

-
- -
-
-

Type-Safe LQL in F#

-

The LQL Type Provider brings compile-time type checking to your LQL queries in F#. Write queries with IntelliSense support, catch errors before runtime, and enjoy seamless integration with your F# codebase.

-
    -
  • Compile-time query validation
  • -
  • Full IntelliSense support for table and column names
  • -
  • Automatic SQL generation for PostgreSQL and SQL Server
  • -
  • Strongly-typed result sets
  • -
-
-
-
-
-
-
- Program.fs +
+
+
+

LQL Documentation

+

Everything needed to install, write, transpile, use in migrations, and wire into editor tooling.

-
open Lql -// Define types with validated LQL -type GetUsers = - LqlCommand<"Users |> select(*)"> +
+
+

Install

+

Add the core parser plus the dialect package that will emit SQL for your target database.

+
dotnet add package Nimblesite.Lql.Core
+dotnet add package Nimblesite.Lql.Postgres
+dotnet add package Nimblesite.Lql.SqlServer
+dotnet add package Nimblesite.Lql.SQLite
+ + + + + + + + +
PackageUse
Nimblesite.Lql.CoreParser, AST, statement conversion.
Nimblesite.Lql.PostgresToPostgreSql() extension.
Nimblesite.Lql.SqlServerToSqlServer() extension.
Nimblesite.Lql.SQLiteToSQLite() extension.
+
-// Access generated SQL -let sql = GetUsers.Sql -let query = GetUsers.Query
-
-
-
-
+
+

Programmatic API

+

Parse once, then call the dialect extension. Errors are returned as `Result<T,E>` values.

+
using Nimblesite.Lql.Core;
+using Nimblesite.Lql.Postgres;
+using Nimblesite.Sql.Model;
+using Outcome;
 
-
-
-
-

Get Started in Minutes

-
+var lql = "Users |> filter(fn(row) => row.Users.Age > 21) |> select(Users.Name)"; +var statementResult = LqlStatementConverter.ToStatement(lql); -
-{% highlight "bash" %} -# Install the LQL NuGet package -dotnet add package Lql +if (statementResult is Result<LqlStatement, SqlError>.Ok<LqlStatement, SqlError> ok) +{ + Result<string, SqlError> sql = ok.Value.ToPostgreSql(); +}
+
-# Write your first LQL query -users |> select(users.id, users.name, users.email) +
+

Syntax

+

A query starts with a table or binding and flows left-to-right through operations.

+ + + + + + + + + + +
ConstructShape
Tableusers
Pipelineusers |> select(users.id)
Columnorders.total
Lambdafn(row) => row.users.active = true
Aliasusers.name as display_name
Let bindinglet active = users |> filter(...)
+
let active_users =
+    users |> filter(fn(row) => row.users.status = 'active')
 
-# Transpiles to SQL:
-# SELECT users.id, users.name, users.email FROM users
-{% endhighlight %}
+active_users
+|> select(active_users.id, active_users.name)
+|> order_by(active_users.name asc)
+|> limit(50)
+
+ +
+

Pipeline Operations

+ + + + + + + + + + + + + + +
OperationPurpose
select(cols...)Project columns or computed expressions.
filter(fn(row) => ...)Filter rows before grouping.
join(table, on = ...)Inner join to another source.
left_join(table, on = ...)Preserve rows from the left source.
group_by(cols...)Group result rows.
having(fn(group) => ...)Filter grouped rows.
order_by(col asc|desc)Sort the result.
limit(n) / offset(n)Page the result.
union(query) / union_all(query)Combine compatible result sets.
insert(target)Insert query output into another table.
+
+ +
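The clause ordering implied by the pipeline operations table can be sketched as a tiny emitter. This is a hypothetical illustration of how pipeline steps map onto SQL clause positions, not the real `Nimblesite.Lql.Core` transpiler (names and signatures here are invented):

```python
def emit(table, select=None, where=None, order_by=None, limit=None):
    # Emitted clauses must appear in SQL order (SELECT, FROM, WHERE,
    # ORDER BY, LIMIT) regardless of the order pipeline steps were written.
    parts = [f"SELECT {', '.join(select) if select else '*'}", f"FROM {table}"]
    if where:
        parts.append(f"WHERE {where}")
    if order_by:
        parts.append(f"ORDER BY {order_by}")
    if limit is not None:
        parts.append(f"LIMIT {limit}")
    return " ".join(parts)

sql = emit("users", select=["users.id"], where="users.age > 18", limit=50)
assert sql == "SELECT users.id FROM users WHERE users.age > 18 LIMIT 50"
```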
+

Expressions And Functions

+

Expressions can compare values, combine predicates, compute columns, and call mapped SQL functions.

+ + + + + + + + + + +
GroupExamples
Comparison=, !=, >, <, >=, <=
Logicaland, or, not
Math+, -, *, /, round, abs
Aggregatecount, sum, avg, min, max
Windowrow_number, rank, dense_rank, lag, lead
Sessioncurrent_setting('app.tenant_id')::uuid for PostgreSQL migration/RLS shapes.
+
+ +
+

SQL Dialects

+

LQL keeps query intent stable while the dialect package handles SQL differences such as `LIMIT`, `TOP`, booleans, and function names.

+
users
+|> filter(fn(row) => row.users.age > 18)
+|> select(users.name, users.email)
+|> order_by(users.name asc)
+|> limit(10)
+ + + + + + + +
DialectGenerated paging shape
PostgreSQLORDER BY users.name ASC LIMIT 10
SQLiteORDER BY users.name ASC LIMIT 10
SQL ServerSELECT TOP 10 ... ORDER BY users.name ASC
+
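The PostgreSQL/SQLite paging shape shown in the table can be sanity-checked with Python's built-in `sqlite3` module. This is purely illustrative and not part of the LQL toolchain; the table and data are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT, age INTEGER)")
conn.executemany(
    "INSERT INTO users VALUES (?, ?, ?)",
    [(f"u{i}", f"u{i}@x.com", 15 + i) for i in range(20)],
)

# The SQL shape emitted for the LQL pipeline above (LIMIT dialect form):
rows = conn.execute(
    "SELECT users.name, users.email FROM users "
    "WHERE users.age > 18 ORDER BY users.name ASC LIMIT 10"
).fetchall()
assert len(rows) == 10  # paging applied after filtering and ordering
```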
+ +
+

Migrations, RLS, And `bodyLql`

+

DataProvider migrations use LQL where schema logic must be portable or generated consistently. Use `usingLql` and `withCheckLql` for RLS policies, and `bodyLql` for PostgreSQL function bodies that should be transpiled instead of handwritten.

+
tables:
+  - name: documents
+    columns:
+      - name: id
+        type: uuid
+        primaryKey: true
+      - name: tenant_id
+        type: uuid
+      - name: owner_id
+        type: uuid
+    rls:
+      enabled: true
+      policies:
+        - name: tenant_documents
+          command: all
+          usingLql: tenant_id = current_setting('app.tenant_id')::uuid
+          withCheckLql: owner_id = current_setting('app.user_id')::uuid
+
+functions:
+  - name: current_tenant_id
+    schema: app
+    returns: uuid
+    language: sql
+    bodyLql: current_setting('app.tenant_id')::uuid
+

For PostgreSQL, `current_setting('key')` is rewritten to the null-tolerant `current_setting('key', true)` form before SQL is emitted. `body` and `bodyLql` are mutually exclusive.

+
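The null-tolerant `current_setting` rewrite described above can be approximated with a regex. The real transpiler operates on the parsed AST, so this is only an illustrative sketch of the transformation, adding the `missing_ok` argument to single-argument calls:

```python
import re

def rewrite_current_setting(sql: str) -> str:
    # Only single-argument calls match; already-rewritten two-argument
    # calls like current_setting('k', true) are left untouched.
    return re.sub(r"current_setting\('([^']+)'\)",
                  r"current_setting('\1', true)", sql)

assert rewrite_current_setting(
    "tenant_id = current_setting('app.tenant_id')::uuid"
) == "tenant_id = current_setting('app.tenant_id', true)::uuid"
```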
+ +
+

VS Code, LSP, Database Schema, And AI

+

The VS Code extension runs the Rust `lql-lsp` language server for `.lql` files.

+ + + + + + + + +
FeatureBehavior
DiagnosticsANTLR parse errors, unmatched parentheses, pipe spacing warnings, and unknown function hints.
CompletionsPipeline operations, keywords, functions, `let` bindings, table names, and columns.
Database configSet `lql.database.connectionString`, `LQL_CONNECTION_STRING`, or `DATABASE_URL` for schema-aware completions and hovers.
AI completionsOptional model completions are merged after schema and keyword results and bounded by timeout.
+
{
+  "lql.database.connectionString": "Host=localhost;Database=app;Username=postgres;Password=secret",
+  "lql.ai.provider": "ollama",
+  "lql.ai.endpoint": "http://localhost:11434/api/generate",
+  "lql.ai.model": "qwen2.5-coder:1.5b"
+}
+
+ +
+

F# Type Provider

+

The type provider validates LQL at compile time and exposes the original query plus generated SQLite SQL.

+
<PackageReference Include="Nimblesite.Lql.TypeProvider.FSharp" Version="*" />
+
open Nimblesite.Lql.Core.TypeProvider
+
+type ActiveUsers =
+    LqlCommand<"Users |> filter(fn(row) => row.Users.Status = 'active') |> select(*)">
+
+let query = ActiveUsers.Query
+let sql = ActiveUsers.Sql
+
+ + -
- Try the Playground +
+
+
+

Run The Transpiler

+

The Blazor app is now scoped to the interactive transpilation demo. Use it to paste LQL and inspect PostgreSQL or SQL Server output.

+
+
-
diff --git a/Lql/LqlWebsite-Eleventy/src/playground.njk b/Lql/LqlWebsite-Eleventy/src/playground.njk deleted file mode 100644 index de35aaa7..00000000 --- a/Lql/LqlWebsite-Eleventy/src/playground.njk +++ /dev/null @@ -1,53 +0,0 @@ ---- -layout: layouts/base.njk -title: "LQL Playground - Interactive Transpiler" -description: "Try Lambda Query Language and see how it transpiles to PostgreSQL or SQL Server in real time." ---- - -
-
-
-

LQL Playground

-

Try Lambda Query Language and see how it transpiles to PostgreSQL or SQL Server

-
- -
-
-
-

LQL Input

-
- - -
- -
- -
-

PostgreSQL Output

-
Enter LQL code and click 'Convert to SQL' to see the result.
- -
-
- -
-

Example Queries

-

Click any example to load it into the editor:

-
- - - - - -
-
-
-
-
- - diff --git a/Lql/Nimblesite.Lql.Core/Parsing/LqlToAstVisitor.cs b/Lql/Nimblesite.Lql.Core/Parsing/LqlToAstVisitor.cs index 05bf08f7..d79f85f1 100644 --- a/Lql/Nimblesite.Lql.Core/Parsing/LqlToAstVisitor.cs +++ b/Lql/Nimblesite.Lql.Core/Parsing/LqlToAstVisitor.cs @@ -433,6 +433,11 @@ expr as ParserRuleContext return $"{left} IS {(isNot ? "NOT " : string.Empty)}NULL"; } + if (comparison.inExpr() != null) + { + return ProcessInExpressionToSql(comparison.inExpr(), lambdaScope); + } + // No fallback - fail hard if comparison type is not handled throw new SqlErrorException( CreateSqlErrorStatic( @@ -442,6 +447,49 @@ expr as ParserRuleContext ); } + private static string ProcessInExpressionToSql( + LqlParser.InExprContext inExpr, + HashSet? lambdaScope + ) + { + // Implements [LQL-PREDICATE-IN-LIST]. + if (inExpr.argList() == null) + { + throw new SqlErrorException( + CreateSqlErrorStatic("IN subqueries are not supported in this context", inExpr) + ); + } + + var left = ProcessInLeftExpressionToSql(inExpr, lambdaScope); + var values = inExpr.argList().arg().Select(arg => ProcessFnCallArgToSql(arg, lambdaScope)); + return $"{left} IN ({string.Join(", ", values)})"; + } + + private static string ProcessInLeftExpressionToSql( + LqlParser.InExprContext inExpr, + HashSet? lambdaScope + ) + { + if (inExpr.qualifiedIdent() != null) + { + return ProcessQualifiedIdentifierToSql(inExpr.qualifiedIdent(), lambdaScope); + } + + if (inExpr.IDENT() != null) + { + return inExpr.IDENT().GetText(); + } + + if (inExpr.PARAMETER() != null) + { + return inExpr.PARAMETER().GetText(); + } + + throw new SqlErrorException( + CreateSqlErrorStatic("Unsupported IN left-hand expression", inExpr) + ); + } + /// /// Processes an arithmetic expression to SQL text, respecting lambda variable scope. 
/// diff --git a/Lql/Nimblesite.Lql.Tests/LqlErrorHandlingTests.cs b/Lql/Nimblesite.Lql.Tests/LqlErrorHandlingTests.cs index 17456c0b..9bd76647 100644 --- a/Lql/Nimblesite.Lql.Tests/LqlErrorHandlingTests.cs +++ b/Lql/Nimblesite.Lql.Tests/LqlErrorHandlingTests.cs @@ -65,6 +65,26 @@ public void InvalidSyntax_ShouldReturnError() Assert.True(failure.Value.Position.Column >= 0); } + [Fact] + public void InvalidCharacter_ShouldReturnLexerError() + { + // Arrange + const string lqlCode = """ + users |> select(users.id) # + """; + + // Act + var result = LqlStatementConverter.ToStatement(lqlCode); + + // Assert + Assert.IsType.Error>(result); + var failure = (Result.Error)result; + Assert.Contains("Syntax error", failure.Value.Message, StringComparison.Ordinal); + Assert.NotNull(failure.Value.Position); + Assert.True(failure.Value.Position!.Line > 0); + Assert.True(failure.Value.Position.Column >= 0); + } + [Fact] public void MissingPipeOperator_ShouldReturnError() { diff --git a/Lql/Nimblesite.Lql.TypeProvider.FSharp.Tests.Data/Nimblesite.Lql.TypeProvider.FSharp.Tests.Data.csproj b/Lql/Nimblesite.Lql.TypeProvider.FSharp.Tests.Data/Nimblesite.Lql.TypeProvider.FSharp.Tests.Data.csproj index 86d41b41..0772c5d8 100644 --- a/Lql/Nimblesite.Lql.TypeProvider.FSharp.Tests.Data/Nimblesite.Lql.TypeProvider.FSharp.Tests.Data.csproj +++ b/Lql/Nimblesite.Lql.TypeProvider.FSharp.Tests.Data/Nimblesite.Lql.TypeProvider.FSharp.Tests.Data.csproj @@ -26,6 +26,24 @@ + + $(MSBuildThisFileDirectory)../../Migration/DataProviderMigrate/bin/$(Configuration)/net9.0/DataProviderMigrate.dll + $(MSBuildThisFileDirectory)../../DataProvider/DataProvider/bin/$(Configuration)/net10.0/DataProvider.dll + + + + + + + @@ -37,7 +55,7 @@ + + $(MSBuildThisFileDirectory)../../Migration/DataProviderMigrate/bin/$(Configuration)/net9.0/DataProviderMigrate.dll + + + + + + @@ -47,7 +59,7 @@ - +
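The `[LQL-PREDICATE-IN-LIST]` support added to `LqlToAstVisitor` renders the left-hand expression and joins the argument list into a single SQL `IN` clause. A minimal Python mirror of that string construction (the real implementation walks ANTLR parse contexts; the function name here is illustrative):

```python
def in_expression_to_sql(left: str, values: list[str]) -> str:
    # Mirrors the C# expression: $"{left} IN ({string.Join(", ", values)})".
    # An empty argument list is rejected, matching the subquery guard above.
    if not values:
        raise ValueError("IN requires a non-empty argument list")
    return f"{left} IN ({', '.join(values)})"

assert in_expression_to_sql("users.status", ["'active'", "'pending'"]) == \
    "users.status IN ('active', 'pending')"
```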
diff --git a/Lql/Nimblesite.Lql.Website/Components/Layout/MainLayout.razor b/Lql/Nimblesite.Lql.Website/Components/Layout/MainLayout.razor index 0e8827ae..f26d33df 100644 --- a/Lql/Nimblesite.Lql.Website/Components/Layout/MainLayout.razor +++ b/Lql/Nimblesite.Lql.Website/Components/Layout/MainLayout.razor @@ -4,20 +4,17 @@
\ No newline at end of file +
diff --git a/Lql/Nimblesite.Lql.Website/Components/Pages/Home.razor b/Lql/Nimblesite.Lql.Website/Components/Pages/Home.razor index ff95f9a1..0b455d40 100644 --- a/Lql/Nimblesite.Lql.Website/Components/Pages/Home.razor +++ b/Lql/Nimblesite.Lql.Website/Components/Pages/Home.razor @@ -1,246 +1,19 @@ @page "/" -@using Microsoft.AspNetCore.Components.Web @using Nimblesite.Lql.Core @using Nimblesite.Lql.Postgres @using Nimblesite.Lql.SqlServer -@using Outcome @using Nimblesite.Sql.Model +@using Outcome -Lambda Query Language (LQL) - Functional Data Querying - -
-
-
-

Lambda Query Language

-

Functional programming meets data querying. Write elegant, composable queries with the power of lambda expressions and pipeline operators.

- - - -
-
-
-
-
- complex_analytics.lql -
-
-- Join users + orders, filter only completed orders -let joined = - users - |> join(orders, on = users.id = orders.user_id) - |> filter(fn(row) => row.orders.status = 'completed') - --- Aggregate and analyze -joined -|> group_by(users.id) -|> select( - users.name, - count(*) as total_orders, - sum(orders.total) as revenue -)
-
-
-
-
- -
-
-
-

Why LQL?

-

Functional programming principles applied to data querying for cleaner, more maintainable code.

-
- -
-
-
λ
-

Functional First

-

Built on pure functional programming principles. Immutable data transformations, lambda expressions, and composable operations make your queries predictable and testable.

-
- -
-
|>
-

Pipeline Operators

-

Chain operations naturally with pipeline operators. Data flows from left to right, making complex transformations easy to read and understand.

-
- -
-
-

Type Safe

-

Strong typing ensures your queries are correct at compile time. No more runtime surprises from typos or schema mismatches.

-
- -
-
🔧
-

Composable

-

Build complex queries from simple, reusable components. Define once, use everywhere with let bindings and function composition.

-
- -
-
📊
-

SQL Compatible

-

Compiles to optimized SQL for your target database. Get the performance of SQL with the elegance of functional programming.

-
- -
-
🎯
-

Developer Focused

-

Designed by developers, for developers. Excellent tooling support, clear error messages, and intuitive syntax.

-
-
-
-
- -
-
-
-

See LQL in Action

-

Real examples showing the power and elegance of functional data querying.

-
- -
-
-
-

Simple Selection

-

Clean, readable syntax for basic data selection. No verbose SELECT statements or complex syntax.

-
    -
  • Pipeline operator for natural flow
  • -
  • Clear column specification
  • -
  • Type-safe field access
  • -
-
-
-
users |> select( - users.id, - users.name, - users.email -)
-
-
- -
-
-

Advanced Filtering

-

Lambda expressions provide powerful, type-safe filtering with full access to row data.

-
    -
  • Lambda function syntax
  • -
  • Logical operators (and, or)
  • -
  • Range filtering
  • -
-
-
-
employees -|> select( - employees.id, - employees.name, - employees.salary -) -|> filter(fn(row) => - row.employees.salary > 50000 and - row.employees.salary < 100000 -)
-
-
- -
-
-

Arithmetic & Functions

-

Rich expression support with mathematical operations and built-in functions for complex calculations.

-
    -
  • Mathematical expressions
  • -
  • Column aliases with 'as'
  • -
  • Function composition
  • -
-
-
-
products -|> select( - products.name, - products.price * products.quantity as total_value, - products.price + 10 as price_plus_ten, - round(products.price / 2, 2) as half_price -) -|> filter(fn(row) => - row.products.price > 0 -)
-
-
- -
-
-

Aggregation & Grouping

-

Powerful aggregation functions with group by operations and having clauses for complex analytics.

  • Group by multiple columns
  • Aggregate functions (count, sum, avg)
  • Having clause with lambda filters
orders
|> group_by(orders.user_id, orders.status)
|> select(
    orders.user_id,
    orders.status,
    count(*) as order_count,
    sum(orders.total) as total_amount,
    avg(orders.total) as avg_amount
)
|> having(fn(group) => count(*) > 2)
|> order_by(total_amount desc)

F# Type Provider


Use LQL with full type safety in F# through our native type provider.


Type-Safe LQL in F#


The LQL Type Provider brings compile-time type checking to your LQL queries in F#. Write queries with IntelliSense support, catch errors before runtime, and enjoy seamless integration with your F# codebase.

  • Compile-time query validation
  • Full IntelliSense support for table and column names
  • Automatic SQL generation for PostgreSQL and SQL Server
  • Strongly-typed result sets
Program.fs
open Lql

// Define types with validated LQL
type GetUsers =
    LqlCommand<"Users |> select(*)">

// Access generated SQL
let sql = GetUsers.Sql
let query = GetUsers.Query
LQL Transpiler Playground

LQL Playground


Try Lambda Query Language and see how it transpiles to PostgreSQL or SQL Server


LQL Transpiler Playground


Paste LQL and inspect the generated PostgreSQL or SQL Server output.
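Under the hood, the playground drives the transpiler through the calls visible in this diff (`LqlStatementConverter.ToStatement`, then `ToPostgreSql()` or `ToSqlServer()` on the parsed statement). In isolation the flow looks roughly like this sketch; result-type generic parameters are omitted here and the snippet is not a complete program:

```csharp
// Sketch of the conversion flow the playground page uses.
// Assumes the Lql package types referenced elsewhere in this diff.
var statementResult = LqlStatementConverter.ToStatement("users |> select(users.id)");

if (statementResult is Result.Ok ok)
{
    // Choose the dialect the same way the page's switch expression does:
    // ok.Value.ToPostgreSql() or ok.Value.ToSqlServer().
    var sqlResult = ok.Value.ToPostgreSql();
}
```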


LQL Input

@@ -250,21 +23,11 @@
@@ -281,80 +44,68 @@ users |> select(users.id, users.name, users.email)">

Example Queries


Click any example to load it into the editor:


Click any example to load it into the editor.

-
-
-@code {
-    private string lqlInput = "";
+    private const string SimpleSelectExample = "users |> select(users.id, users.name, users.email)";
+
+    private const string JoinExample =
+        "users\n"
+        + "|> join(orders, on = users.id = orders.user_id)\n"
+        + "|> select(users.name, orders.total, orders.status)";
+
+    private const string FilterExample =
+        "employees\n"
+        + "|> select(employees.id, employees.name, employees.salary)\n"
+        + "|> filter(fn(row) => row.employees.salary > 50000 and row.employees.department = 'Engineering')";
+
+    private const string AggregateExample =
+        "orders\n"
+        + "|> group_by(orders.user_id)\n"
+        + "|> select(\n"
+        + "    orders.user_id,\n"
+        + "    count(*) as order_count,\n"
+        + "    sum(orders.total) as total_amount,\n"
+        + "    avg(orders.total) as avg_amount\n"
+        + ")\n"
+        + "|> having(fn(group) => count(*) > 2)\n"
+        + "|> order_by(total_amount desc)";
+
+    private const string ComplexExample =
+        "let joined =\n"
+        + "    users\n"
+        + "    |> join(orders, on = users.id = orders.user_id)\n"
+        + "    |> filter(fn(row) => row.orders.status = 'completed')\n"
+        + "\n"
+        + "joined\n"
+        + "|> group_by(users.id)\n"
+        + "|> select(\n"
+        + "    users.name,\n"
+        + "    count(*) as total_orders,\n"
+        + "    sum(orders.total) as revenue,\n"
+        + "    avg(orders.total) as avg_order_value\n"
+        + ")\n"
+        + "|> filter(fn(row) => row.revenue > 1000)\n"
+        + "|> order_by(revenue desc)\n"
+        + "|> limit(10)";
+
+    private string lqlInput = SimpleSelectExample;
     private string sqlOutput = "Enter LQL code and click 'Convert to SQL' to see the result.";
     private string errorMessage = "";
     private string selectedDialect = "PostgreSQL";
-    private bool isConverting = false;
+    private bool isConverting;
-
-    // Example queries
-    private readonly string simpleSelectExample = "users |> select(users.id, users.name, users.email)";
-
-    private readonly string joinExample = @"users
-|> join(orders, on = users.id = orders.user_id)
-|> select(users.name, orders.total, orders.status)";
-
-    private readonly string filterExample = @"employees
-|> select(employees.id, employees.name, employees.salary)
-|> filter(fn(row) => row.employees.salary > 50000 and row.employees.department = 'Engineering')";
-
-    private readonly string aggregateExample = @"orders
-|> group_by(orders.user_id)
-|> select(
-    orders.user_id,
-    count(*) as order_count,
-    sum(orders.total) as total_amount,
-    avg(orders.total) as avg_amount
-)
-|> having(fn(group) => count(*) > 2)
-|> order_by(total_amount desc)";
-
-    private readonly string complexExample = @"-- Complex analytics query
-let joined =
-    users
-    |> join(orders, on = users.id = orders.user_id)
-    |> filter(fn(row) => row.orders.status = 'completed')
-
-joined
-|> group_by(users.id)
-|> select(
-    users.name,
-    count(*) as total_orders,
-    sum(orders.total) as revenue,
-    avg(orders.total) as avg_order_value
-)
-|> filter(fn(row) => row.revenue > 1000)
-|> order_by(revenue desc)
-|> limit(10)";
-
-    private async Task ConvertLql()
+    private void ConvertLql()
     {
         if (string.IsNullOrWhiteSpace(lqlInput))
         {
@@ -366,51 +117,46 @@ joined
         isConverting = true;
         errorMessage = "";
         sqlOutput = "Converting...";
-
-        try
-        {
-            await Task.Delay(100); // Small delay to show loading state
-
-            // Parse the LQL code
-            var statementResult = LqlStatementConverter.ToStatement(lqlInput);
-            if (statementResult is Result.Error parseFailure)
-            {
-                errorMessage = parseFailure.Value.DetailedMessage ?? parseFailure.Value.Message;
-                sqlOutput = "";
-                return;
-            }
-
-            var statement = ((Result.Ok)statementResult).Value;
+        var statementResult = LqlStatementConverter.ToStatement(lqlInput);
+        if (statementResult is Result.Error parseFailure)
+        {
+            errorMessage = parseFailure.Value.DetailedMessage ?? parseFailure.Value.Message;
+            sqlOutput = "";
+            isConverting = false;
+            return;
+        }

-            // Convert to the selected SQL dialect
-            Result sqlResult = selectedDialect switch
-            {
-                "PostgreSQL" => statement.ToPostgreSql(),
-                "SqlServer" => statement.ToSqlServer(),
-                _ => new Result.Error(new SqlError($"Unsupported dialect: {selectedDialect}"))
-            };
+        if (statementResult is not Result.Ok statementOk)
+        {
+            errorMessage = "LQL parser returned an unknown result.";
+            sqlOutput = "";
+            isConverting = false;
+            return;
+        }

-            if (sqlResult is Result.Error sqlFailure)
-            {
-                errorMessage = sqlFailure.Value.DetailedMessage ?? sqlFailure.Value.Message;
-                sqlOutput = "";
-                return;
-            }
+        var sqlResult = selectedDialect switch
+        {
+            "PostgreSQL" => statementOk.Value.ToPostgreSql(),
+            "SqlServer" => statementOk.Value.ToSqlServer(),
+            _ => new Result.Error(new SqlError($"Unsupported dialect: {selectedDialect}")),
+        };

-            var sql = ((Result.Ok)sqlResult).Value;
-            sqlOutput = sql;
-            errorMessage = "";
-        }
-        catch (Exception ex)
+        if (sqlResult is Result.Error sqlFailure)
         {
-            errorMessage = $"An unexpected error occurred: {ex}";
+            errorMessage = sqlFailure.Value.DetailedMessage ?? sqlFailure.Value.Message;
             sqlOutput = "";
+            isConverting = false;
+            return;
         }
-        finally
+
+        if (sqlResult is Result.Ok sqlOk)
         {
-            isConverting = false;
+            sqlOutput = sqlOk.Value;
+            errorMessage = "";
         }
+
+        isConverting = false;
     }

     private void LoadExample(string example)
@@ -419,11 +165,4 @@ joined
         errorMessage = "";
         sqlOutput = "Click 'Convert to SQL' to see the result.";
     }
-
-    protected override async Task OnInitializedAsync()
-    {
-        // Load a simple example by default
-        lqlInput = simpleSelectExample;
-        await ConvertLql();
-    }
-}
\ No newline at end of file
+}
diff --git a/Lql/Nimblesite.Lql.Website/wwwroot/index.html b/Lql/Nimblesite.Lql.Website/wwwroot/index.html
index ef606325..b666746b 100644
--- a/Lql/Nimblesite.Lql.Website/wwwroot/index.html
+++ b/Lql/Nimblesite.Lql.Website/wwwroot/index.html
@@ -4,7 +4,7 @@
-    Lambda Query Language (LQL) - Functional Data Querying
+    LQL Transpiler Playground
@@ -12,10 +12,10 @@
-    Loading...
+    Loading LQL transpiler...
@@ -23,4 +23,4 @@ - \ No newline at end of file + diff --git a/Migration/Nimblesite.DataProvider.Migration.Core/SchemaBuilder.cs b/Migration/Nimblesite.DataProvider.Migration.Core/SchemaBuilder.cs index 5fd4ec4a..58186f4a 100644 --- a/Migration/Nimblesite.DataProvider.Migration.Core/SchemaBuilder.cs +++ b/Migration/Nimblesite.DataProvider.Migration.Core/SchemaBuilder.cs @@ -358,6 +358,7 @@ public sealed class ColumnBuilder private bool _isComputedPersisted; private string? _collation; private string? _checkConstraint; + private string? _checkConstraintName; private string? _comment; internal bool IsPrimaryKey { get; private set; } @@ -456,6 +457,17 @@ public ColumnBuilder Check(string expression) return this; } + /// + /// Add named check constraint to column. + /// Implements [MIG-PG-NAMED-COLUMN-CHECK-CONSTRAINT]. + /// + public ColumnBuilder Check(string name, string expression) + { + _checkConstraintName = name; + _checkConstraint = expression; + return this; + } + /// /// Add comment to column. /// @@ -480,6 +492,7 @@ internal ColumnDefinition Build() => IsComputedPersisted = _isComputedPersisted, Collation = _collation, CheckConstraint = _checkConstraint, + CheckConstraintName = _checkConstraintName, Comment = _comment, }; } diff --git a/Migration/Nimblesite.DataProvider.Migration.Core/SchemaConstraintNames.cs b/Migration/Nimblesite.DataProvider.Migration.Core/SchemaConstraintNames.cs new file mode 100644 index 00000000..baba7e6b --- /dev/null +++ b/Migration/Nimblesite.DataProvider.Migration.Core/SchemaConstraintNames.cs @@ -0,0 +1,22 @@ +namespace Nimblesite.DataProvider.Migration.Core; + +/// +/// Stable constraint name helpers shared by diff and platform DDL generators. +/// +public static class SchemaConstraintNames +{ + /// + /// Resolve the database name for a column-level check constraint. + /// Implements [MIG-PG-NAMED-COLUMN-CHECK-CONSTRAINT]. 
+ /// + public static string ColumnCheck(string tableName, ColumnDefinition column) + { + var name = column.CheckConstraintName; + if (!string.IsNullOrWhiteSpace(name)) + { + return name; + } + + return $"{tableName}_{column.Name}_chk"; + } +} diff --git a/Migration/Nimblesite.DataProvider.Migration.Core/SchemaDefinition.cs b/Migration/Nimblesite.DataProvider.Migration.Core/SchemaDefinition.cs index 8c5be3e9..f26ac976 100644 --- a/Migration/Nimblesite.DataProvider.Migration.Core/SchemaDefinition.cs +++ b/Migration/Nimblesite.DataProvider.Migration.Core/SchemaDefinition.cs @@ -228,6 +228,12 @@ public sealed record ColumnDefinition /// Check constraint expression for this column only. public string? CheckConstraint { get; init; } + /// + /// Stable database constraint name for . + /// Implements [MIG-PG-NAMED-COLUMN-CHECK-CONSTRAINT]. + /// + public string? CheckConstraintName { get; init; } + /// Column comment/description for documentation. public string? Comment { get; init; } } diff --git a/Migration/Nimblesite.DataProvider.Migration.Core/SchemaDiff.cs b/Migration/Nimblesite.DataProvider.Migration.Core/SchemaDiff.cs index 458c3e03..df335875 100644 --- a/Migration/Nimblesite.DataProvider.Migration.Core/SchemaDiff.cs +++ b/Migration/Nimblesite.DataProvider.Migration.Core/SchemaDiff.cs @@ -122,6 +122,17 @@ public static OperationsResult Calculate( ); operations.AddRange(fkOps); + var uniqueOps = CalculateUniqueConstraintDiff( + currentTable, + desiredTable, + allowDestructive, + logger + ); + operations.AddRange(uniqueOps); + + var checkOps = CalculateCheckConstraintDiff(currentTable, desiredTable, logger); + operations.AddRange(checkOps); + rlsOperations.AddRange( CalculateRlsDiff(currentTable, desiredTable, allowDestructive, logger) ); @@ -429,4 +440,150 @@ private static IEnumerable CalculateForeignKeyDiff( } } } + + private static IEnumerable CalculateCheckConstraintDiff( + TableDefinition current, + TableDefinition desired, + ILogger? 
logger + ) + { + var currentNames = CheckConstraintNames(current) + .ToHashSet(StringComparer.OrdinalIgnoreCase); + + foreach (var desiredCheck in desired.CheckConstraints) + { + if (currentNames.Contains(desiredCheck.Name)) + { + continue; + } + + logger?.LogDebug( + "Check constraint {CheckName} on {Schema}.{Table} not found, will add", + desiredCheck.Name, + desired.Schema, + desired.Name + ); + yield return new AddCheckConstraintOperation( + desired.Schema, + desired.Name, + desiredCheck + ); + } + + foreach (var desiredColumn in desired.Columns) + { + if (desiredColumn.CheckConstraint is null) + { + continue; + } + + // Implements [MIG-PG-NAMED-COLUMN-CHECK-CONSTRAINT]. + var name = SchemaConstraintNames.ColumnCheck(desired.Name, desiredColumn); + if (currentNames.Contains(name)) + { + continue; + } + + logger?.LogDebug( + "Column check constraint {CheckName} on {Schema}.{Table} not found, will add", + name, + desired.Schema, + desired.Name + ); + yield return new AddCheckConstraintOperation( + desired.Schema, + desired.Name, + new CheckConstraintDefinition + { + Name = name, + Expression = desiredColumn.CheckConstraint, + } + ); + } + } + + private static IEnumerable CheckConstraintNames(TableDefinition table) + { + foreach (var tableCheck in table.CheckConstraints) + { + yield return tableCheck.Name; + } + + foreach (var column in table.Columns) + { + if (column.CheckConstraint is not null) + { + yield return SchemaConstraintNames.ColumnCheck(table.Name, column); + } + } + } + + private static IEnumerable CalculateUniqueConstraintDiff( + TableDefinition current, + TableDefinition desired, + bool allowDestructive, + ILogger? 
logger + ) + { + var currentNames = current + .UniqueConstraints.Select(uc => UniqueConstraintName(current.Name, uc)) + .ToHashSet(StringComparer.OrdinalIgnoreCase); + + foreach (var desiredUnique in desired.UniqueConstraints) + { + var name = UniqueConstraintName(desired.Name, desiredUnique); + if (currentNames.Contains(name)) + { + continue; + } + + // Implements [MIG-PG-UNIQUE-CONSTRAINT-INSPECTION]. + logger?.LogDebug( + "Unique constraint {UniqueName} on {Schema}.{Table} not found, will add", + name, + desired.Schema, + desired.Name + ); + yield return new AddUniqueConstraintOperation( + desired.Schema, + desired.Name, + desiredUnique + ); + } + + if (!allowDestructive) + { + yield break; + } + + var desiredNames = desired + .UniqueConstraints.Select(uc => UniqueConstraintName(desired.Name, uc)) + .ToHashSet(StringComparer.OrdinalIgnoreCase); + + foreach (var currentUnique in current.UniqueConstraints) + { + var name = UniqueConstraintName(current.Name, currentUnique); + if (desiredNames.Contains(name)) + { + continue; + } + + logger?.LogDebug("Unique constraint {UniqueName} will be dropped", name); + // Implements [MIG-PG-CONSTRAINT-BACKED-INDEX-DROP]. 
+ yield return new DropIndexOperation(current.Schema, current.Name, name); + } + } + + private static string UniqueConstraintName( + string tableName, + UniqueConstraintDefinition uniqueConstraint + ) + { + if (!string.IsNullOrWhiteSpace(uniqueConstraint.Name)) + { + return uniqueConstraint.Name; + } + + return $"UQ_{tableName}_{string.Join("_", uniqueConstraint.Columns)}"; + } } diff --git a/Migration/Nimblesite.DataProvider.Migration.Postgres/PostgresDdlGenerator.cs b/Migration/Nimblesite.DataProvider.Migration.Postgres/PostgresDdlGenerator.cs index 34b5f97b..25a96305 100644 --- a/Migration/Nimblesite.DataProvider.Migration.Postgres/PostgresDdlGenerator.cs +++ b/Migration/Nimblesite.DataProvider.Migration.Postgres/PostgresDdlGenerator.cs @@ -112,7 +112,7 @@ public static string Generate(SchemaOperation operation) => $"DROP TABLE IF EXISTS \"{op.Schema}\".\"{op.TableName}\" CASCADE", DropColumnOperation op => $"ALTER TABLE \"{op.Schema}\".\"{op.TableName}\" DROP COLUMN \"{op.ColumnName}\"", - DropIndexOperation op => $"DROP INDEX IF EXISTS \"{op.Schema}\".\"{op.IndexName}\"", + DropIndexOperation op => GenerateDropIndex(op), DropForeignKeyOperation op => $"ALTER TABLE \"{op.Schema}\".\"{op.TableName}\" DROP CONSTRAINT \"{op.ConstraintName}\"", DropFunctionOperation op => GenerateDropFunction(op), @@ -267,7 +267,7 @@ private static string GenerateCreateTable(TableDefinition table) foreach (var column in table.Columns) { - columnDefs.Add(GenerateColumnDef(column)); + columnDefs.Add(GenerateColumnDef(tableName, column)); } // Add primary key constraint @@ -337,7 +337,7 @@ private static string GenerateCreateTable(TableDefinition table) return sb.ToString(); } - private static string GenerateColumnDef(ColumnDefinition column) + private static string GenerateColumnDef(string tableName, ColumnDefinition column) { var sb = new StringBuilder(); sb.Append(CultureInfo.InvariantCulture, $"\"{column.Name}\" "); @@ -378,11 +378,16 @@ private static string 
GenerateColumnDef(ColumnDefinition column) if (column.CheckConstraint is not null) { + // Implements [MIG-PG-NAMED-COLUMN-CHECK-CONSTRAINT]. // Auto-quote the column's own name in its CHECK expression so // mixed-case columns survive without manual quoting in YAML. var ownNames = new HashSet(StringComparer.Ordinal) { column.Name }; var quotedExpr = QuoteIdentifiersInExpression(column.CheckConstraint, ownNames); - sb.Append(CultureInfo.InvariantCulture, $" CHECK ({quotedExpr})"); + var checkName = SchemaConstraintNames.ColumnCheck(tableName, column); + sb.Append( + CultureInfo.InvariantCulture, + $" CONSTRAINT \"{checkName}\" CHECK ({quotedExpr})" + ); } return sb.ToString(); @@ -390,7 +395,7 @@ private static string GenerateColumnDef(ColumnDefinition column) private static string GenerateAddColumn(AddColumnOperation op) { - var colDef = GenerateColumnDef(op.Column); + var colDef = GenerateColumnDef(op.TableName, op.Column); return $"ALTER TABLE \"{op.Schema}\".\"{op.TableName}\" ADD COLUMN {colDef}"; } @@ -407,6 +412,45 @@ private static string GenerateCreateIndex(CreateIndexOperation op) return $"CREATE {unique}INDEX IF NOT EXISTS \"{op.Index.Name}\" ON \"{op.Schema}\".\"{op.TableName}\" ({indexItems}){filter}"; } + private static string GenerateDropIndex(DropIndexOperation op) + { + var schema = SqlLiteral(op.Schema); + var table = SqlLiteral(op.TableName); + var index = SqlLiteral(op.IndexName); + + // Implements [MIG-PG-CONSTRAINT-BACKED-INDEX-DROP]. 
+ return $$""" + DO $migration_drop_index$ + DECLARE + constraint_name text; + BEGIN + SELECT c.conname + INTO constraint_name + FROM pg_constraint c + JOIN pg_class i ON i.oid = c.conindid + JOIN pg_namespace ni ON ni.oid = i.relnamespace + JOIN pg_class t ON t.oid = c.conrelid + JOIN pg_namespace nt ON nt.oid = t.relnamespace + WHERE ni.nspname = {{schema}} + AND i.relname = {{index}} + AND nt.nspname = {{schema}} + AND t.relname = {{table}} + AND c.contype IN ('p', 'u') + LIMIT 1; + + IF constraint_name IS NULL THEN + EXECUTE format('DROP INDEX IF EXISTS %I.%I', {{schema}}, {{index}}); + ELSE + EXECUTE format('ALTER TABLE %I.%I DROP CONSTRAINT %I', {{schema}}, {{table}}, constraint_name); + END IF; + END; + $migration_drop_index$; + """; + } + + private static string SqlLiteral(string value) => + $"'{value.Replace("'", "''", StringComparison.Ordinal)}'"; + private static string GenerateAddForeignKey(AddForeignKeyOperation op) { var fk = op.ForeignKey; diff --git a/Migration/Nimblesite.DataProvider.Migration.Postgres/PostgresSchemaInspector.cs b/Migration/Nimblesite.DataProvider.Migration.Postgres/PostgresSchemaInspector.cs index 12b1fdcb..38179cdb 100644 --- a/Migration/Nimblesite.DataProvider.Migration.Postgres/PostgresSchemaInspector.cs +++ b/Migration/Nimblesite.DataProvider.Migration.Postgres/PostgresSchemaInspector.cs @@ -99,6 +99,8 @@ public static TableResult InspectTable( var columns = new List(); var indexes = new List(); var foreignKeys = new List(); + var uniqueConstraints = new List(); + var checkConstraints = new List(); PrimaryKeyDefinition? primaryKey = null; // Get column info. 
Implements [MIG-TYPES-VECTOR] §5.4.4: LEFT JOIN @@ -204,6 +206,8 @@ ORDER BY kcu.ordinal_position }; } + InspectUniqueConstraints(connection, schemaName, tableName, uniqueConstraints); + // Get indexes (both column-based and expression indexes) using var idxCmd = connection.CreateCommand(); idxCmd.CommandText = """ @@ -221,9 +225,13 @@ FROM pg_class t JOIN pg_index ix ON t.oid = ix.indrelid JOIN pg_class i ON i.oid = ix.indexrelid JOIN pg_namespace n ON n.oid = t.relnamespace + LEFT JOIN pg_constraint owned_constraint + ON owned_constraint.conindid = ix.indexrelid + AND owned_constraint.contype IN ('p', 'u') WHERE n.nspname = @schema AND t.relname = @table AND NOT ix.indisprimary + AND owned_constraint.oid IS NULL ORDER BY i.relname """; idxCmd.Parameters.AddWithValue("@schema", schemaName); @@ -319,6 +327,8 @@ JOIN information_schema.referential_constraints rc } } + InspectCheckConstraints(connection, schemaName, tableName, columns, checkConstraints); + // [RLS-DIFF] read pg_policies + relrowsecurity into RowLevelSecurity. var rls = InspectRls(connection, schemaName, tableName); @@ -331,6 +341,8 @@ JOIN information_schema.referential_constraints rc Indexes = indexes.AsReadOnly(), ForeignKeys = foreignKeys.AsReadOnly(), PrimaryKey = primaryKey, + UniqueConstraints = uniqueConstraints.AsReadOnly(), + CheckConstraints = checkConstraints.AsReadOnly(), RowLevelSecurity = rls, } ); @@ -416,6 +428,121 @@ private static ForeignKeyAction ParseForeignKeyAction(string action) => _ => ForeignKeyAction.NoAction, }; + private static void InspectUniqueConstraints( + NpgsqlConnection connection, + string schemaName, + string tableName, + List uniqueConstraints + ) + { + // Implements [MIG-PG-UNIQUE-CONSTRAINT-INSPECTION]. 
+ using var uniqueCmd = connection.CreateCommand(); + uniqueCmd.CommandText = """ + SELECT + c.conname, + array_agg(a.attname ORDER BY keys.n) AS columns + FROM pg_constraint c + JOIN pg_class t ON t.oid = c.conrelid + JOIN pg_namespace n ON n.oid = t.relnamespace + JOIN unnest(c.conkey) WITH ORDINALITY AS keys(attnum, n) ON true + JOIN pg_attribute a ON a.attrelid = t.oid AND a.attnum = keys.attnum + WHERE n.nspname = @schema + AND t.relname = @table + AND c.contype = 'u' + GROUP BY c.conname + ORDER BY c.conname + """; + uniqueCmd.Parameters.AddWithValue("@schema", schemaName); + uniqueCmd.Parameters.AddWithValue("@table", tableName); + + using var reader = uniqueCmd.ExecuteReader(); + while (reader.Read()) + { + uniqueConstraints.Add( + new UniqueConstraintDefinition + { + Name = reader.GetString(0), + Columns = ((string[])reader.GetValue(1)).ToList().AsReadOnly(), + } + ); + } + } + + private static void InspectCheckConstraints( + NpgsqlConnection connection, + string schemaName, + string tableName, + List columns, + List tableChecks + ) + { + // Implements [MIG-PG-NAMED-COLUMN-CHECK-CONSTRAINT]. 
+ using var checkCmd = connection.CreateCommand(); + checkCmd.CommandText = """ + SELECT + c.conname, + pg_get_expr(c.conbin, c.conrelid, true), + COALESCE(array_length(c.conkey, 1), 0), + column_ref.attname + FROM pg_constraint c + JOIN pg_class t ON t.oid = c.conrelid + JOIN pg_namespace n ON n.oid = t.relnamespace + LEFT JOIN LATERAL ( + SELECT a.attname + FROM unnest(c.conkey) AS keys(attnum) + JOIN pg_attribute a ON a.attrelid = t.oid AND a.attnum = keys.attnum + ) AS column_ref ON COALESCE(array_length(c.conkey, 1), 0) = 1 + WHERE n.nspname = @schema + AND t.relname = @table + AND c.contype = 'c' + ORDER BY c.conname + """; + checkCmd.Parameters.AddWithValue("@schema", schemaName); + checkCmd.Parameters.AddWithValue("@table", tableName); + + using var reader = checkCmd.ExecuteReader(); + while (reader.Read()) + { + var name = reader.GetString(0); + var expression = reader.GetString(1); + var columnCount = reader.GetInt32(2); + + if (columnCount == 1 && !reader.IsDBNull(3)) + { + ApplyColumnCheckConstraint(columns, reader.GetString(3), name, expression); + } + else + { + tableChecks.Add( + new CheckConstraintDefinition { Name = name, Expression = expression } + ); + } + } + } + + private static void ApplyColumnCheckConstraint( + List columns, + string columnName, + string constraintName, + string expression + ) + { + for (var i = 0; i < columns.Count; i++) + { + if (!string.Equals(columns[i].Name, columnName, StringComparison.OrdinalIgnoreCase)) + { + continue; + } + + columns[i] = columns[i] with + { + CheckConstraint = expression, + CheckConstraintName = constraintName, + }; + return; + } + } + /// /// Parse expressions from a PostgreSQL index definition string. 
/// Example: "CREATE UNIQUE INDEX uq_name ON public.table USING btree (lower(name), suburb_id)" diff --git a/Migration/Nimblesite.DataProvider.Migration.Tests/PostgresColumnCheckConstraintTests.cs b/Migration/Nimblesite.DataProvider.Migration.Tests/PostgresColumnCheckConstraintTests.cs new file mode 100644 index 00000000..7c5b525a --- /dev/null +++ b/Migration/Nimblesite.DataProvider.Migration.Tests/PostgresColumnCheckConstraintTests.cs @@ -0,0 +1,154 @@ +namespace Nimblesite.DataProvider.Migration.Tests; + +[Collection(PostgresTestSuite.Name)] +public sealed class PostgresColumnCheckConstraintTests(PostgresContainerFixture fixture) +{ + private const string SchemaName = "public"; + private static readonly ILogger Logger = NullLogger.Instance; + + [Fact] + public async Task ApplyYaml_WhenColumnCheckConstraintHasName_CreatesNamedPostgresConstraints() + { + // Implements [MIG-PG-NAMED-COLUMN-CHECK-CONSTRAINT]. + var connection = await fixture + .CreateDatabaseAsync("column_check_constraints") + .ConfigureAwait(true); + + try + { + var schema = SchemaYamlSerializer.FromYaml( + """ + name: issue51 + tables: + - schema: public + name: bundles + columns: + - name: id + type: Uuid + isNullable: false + - name: tenant_id + type: Uuid + isNullable: false + - name: sha256 + type: Text + isNullable: false + checkConstraintName: bundles_sha256_fmt_chk + checkConstraint: sha256 ~ '^[0-9a-f]{64}$' + - name: size_bytes + type: BigInt + isNullable: false + checkConstraintName: bundles_size_bytes_nonneg_chk + checkConstraint: size_bytes >= 0 + primaryKey: + columns: + - id + uniqueConstraints: + - name: uq_bundles_tenant_sha + columns: + - tenant_id + - sha256 + """ + ); + + Apply(connection, Calculate(Inspect(connection), schema)); + + Assert.Equal( + [ + "bundles_sha256_fmt_chk", + "bundles_size_bytes_nonneg_chk", + "uq_bundles_tenant_sha", + ], + ConstraintNames(connection) + ); + + var inspected = Inspect(connection).Tables.Single(t => t.Name == "bundles"); + var sha256 = 
inspected.Columns.Single(c => c.Name == "sha256"); + var sizeBytes = inspected.Columns.Single(c => c.Name == "size_bytes"); + + Assert.Equal("bundles_sha256_fmt_chk", sha256.CheckConstraintName); + Assert.Equal("bundles_size_bytes_nonneg_chk", sizeBytes.CheckConstraintName); + Assert.DoesNotContain( + Calculate(Inspect(connection), schema), + operation => operation is AddCheckConstraintOperation + ); + } + finally + { + await connection.DisposeAsync().ConfigureAwait(true); + NpgsqlConnection.ClearPool(connection); + } + } + + private static SchemaDefinition Inspect(NpgsqlConnection connection) + { + var result = PostgresSchemaInspector.Inspect(connection, SchemaName, Logger); + if (result is SchemaResultOk ok) + { + return ok.Value; + } + + Assert.Fail("Expected PostgreSQL schema inspection to succeed."); + return Schema.Define("failed").Build(); + } + + private static IReadOnlyList Calculate( + SchemaDefinition current, + SchemaDefinition desired + ) + { + var result = SchemaDiff.Calculate(current, desired, logger: Logger); + if (result is OperationsResultOk ok) + { + return ok.Value; + } + + Assert.Fail("Expected PostgreSQL schema diff to succeed."); + return []; + } + + private static void Apply( + NpgsqlConnection connection, + IReadOnlyList operations + ) + { + var result = MigrationRunner.Apply( + connection, + operations, + PostgresDdlGenerator.Generate, + MigrationOptions.Default, + Logger + ); + var failure = result is MigrationApplyResultError error ? 
error.Value.ToString() : ""; + + Assert.True(result is MigrationApplyResultOk, $"Migration failed: {failure}"); + } + + private static string[] ConstraintNames(NpgsqlConnection connection) + { + using var command = connection.CreateCommand(); + command.CommandText = """ + SELECT c.conname + FROM pg_constraint c + JOIN pg_class t ON t.oid = c.conrelid + JOIN pg_namespace n ON n.oid = t.relnamespace + WHERE n.nspname = @schema + AND t.relname = 'bundles' + AND c.conname IN ( + 'bundles_sha256_fmt_chk', + 'bundles_size_bytes_nonneg_chk', + 'uq_bundles_tenant_sha' + ) + ORDER BY c.conname + """; + command.Parameters.AddWithValue("@schema", SchemaName); + + using var reader = command.ExecuteReader(); + var names = new List(); + while (reader.Read()) + { + names.Add(reader.GetString(0)); + } + + return names.ToArray(); + } +} diff --git a/Migration/Nimblesite.DataProvider.Migration.Tests/PostgresDropConstraintBackedIndexTests.cs b/Migration/Nimblesite.DataProvider.Migration.Tests/PostgresDropConstraintBackedIndexTests.cs new file mode 100644 index 00000000..0c0a0480 --- /dev/null +++ b/Migration/Nimblesite.DataProvider.Migration.Tests/PostgresDropConstraintBackedIndexTests.cs @@ -0,0 +1,200 @@ +namespace Nimblesite.DataProvider.Migration.Tests; + +[Collection(PostgresTestSuite.Name)] +public sealed class PostgresDropConstraintBackedIndexTests(PostgresContainerFixture fixture) +{ + private const string SchemaName = "public"; + private const string TableName = "api_keys"; + private const string ConstraintName = "api_keys_key_hash_uniq"; + private static readonly ILogger Logger = NullLogger.Instance; + + [Fact] + public async Task DropIndex_WhenIndexBacksUniqueConstraint_DropsConstraint() + { + // Implements [MIG-PG-CONSTRAINT-BACKED-INDEX-DROP]. 
+ var connection = await fixture + .CreateDatabaseAsync("drop_constraint_index") + .ConfigureAwait(true); + + try + { + Apply(connection, Calculate(Inspect(connection), SchemaWithUniqueConstraint())); + var upgrade = Calculate(Inspect(connection), SchemaWithoutUniqueConstraint(), true); + + Assert.Contains(upgrade, IsConstraintBackedIndexDrop); + + Apply(connection, upgrade, MigrationOptions.Destructive); + + Assert.False(ConstraintExists(connection)); + Assert.False(IndexExists(connection)); + } + finally + { + await connection.DisposeAsync().ConfigureAwait(true); + NpgsqlConnection.ClearPool(connection); + } + } + + [Fact] + public async Task Calculate_WhenUniqueConstraintSchemaConverged_DoesNotDropBackingIndex() + { + // Implements [MIG-PG-UNIQUE-CONSTRAINT-INSPECTION]. + var connection = await fixture + .CreateDatabaseAsync("converged_unique_constraint") + .ConfigureAwait(true); + + try + { + var desired = SchemaWithUniqueConstraint(); + + Apply(connection, Calculate(Inspect(connection), desired)); + + var inspected = Inspect(connection).Tables.Single(t => t.Name == TableName); + var converged = Calculate(Inspect(connection), desired, true); + + Assert.Contains( + inspected.UniqueConstraints, + constraint => + constraint.Name == ConstraintName + && constraint.Columns.SequenceEqual(["key_hash"]) + ); + Assert.DoesNotContain(converged, IsConstraintBackedIndexDrop); + Assert.DoesNotContain( + converged, + operation => operation is AddUniqueConstraintOperation + ); + + Apply(connection, converged, MigrationOptions.Destructive); + + Assert.True(ConstraintExists(connection)); + Assert.True(IndexExists(connection)); + } + finally + { + await connection.DisposeAsync().ConfigureAwait(true); + NpgsqlConnection.ClearPool(connection); + } + } + + private static SchemaDefinition SchemaWithUniqueConstraint() => + Schema + .Define("Issue49") + .Table( + SchemaName, + TableName, + t => + t.Column("id", PortableTypes.Uuid, c => c.PrimaryKey()) + .Column("key_hash", 
PortableTypes.VarChar(255), c => c.NotNull()) + .Unique(ConstraintName, "key_hash") + ) + .Build(); + + private static SchemaDefinition SchemaWithoutUniqueConstraint() => + Schema + .Define("Issue49") + .Table( + SchemaName, + TableName, + t => + t.Column("id", PortableTypes.Uuid, c => c.PrimaryKey()) + .Column("key_hash", PortableTypes.VarChar(255), c => c.NotNull()) + ) + .Build(); + + private static SchemaDefinition Inspect(NpgsqlConnection connection) + { + var result = PostgresSchemaInspector.Inspect(connection, SchemaName, Logger); + if (result is SchemaResultOk ok) + { + return ok.Value; + } + + Assert.Fail("Expected PostgreSQL schema inspection to succeed."); + return Schema.Define("failed").Build(); + } + + private static IReadOnlyList Calculate( + SchemaDefinition current, + SchemaDefinition desired, + bool allowDestructive = false + ) + { + var result = SchemaDiff.Calculate(current, desired, allowDestructive, Logger); + if (result is OperationsResultOk ok) + { + return ok.Value; + } + + Assert.Fail("Expected PostgreSQL schema diff to succeed."); + return []; + } + + private static void Apply( + NpgsqlConnection connection, + IReadOnlyList operations, + MigrationOptions? options = null + ) + { + var result = MigrationRunner.Apply( + connection, + operations, + PostgresDdlGenerator.Generate, + options ?? MigrationOptions.Default, + Logger + ); + var failure = result is MigrationApplyResultError error ? 
error.Value.ToString() : ""; + + Assert.True(result is MigrationApplyResultOk, $"Migration failed: {failure}"); + } + + private static bool IsConstraintBackedIndexDrop(SchemaOperation operation) => + operation + is DropIndexOperation + { + Schema: SchemaName, + TableName: TableName, + IndexName: ConstraintName, + }; + + private static bool ConstraintExists(NpgsqlConnection connection) + { + using var command = connection.CreateCommand(); + command.CommandText = """ + SELECT EXISTS ( + SELECT 1 + FROM pg_constraint c + JOIN pg_class t ON t.oid = c.conrelid + JOIN pg_namespace n ON n.oid = t.relnamespace + WHERE n.nspname = @schema + AND t.relname = @table + AND c.conname = @constraint + ) + """; + AddObjectParameters(command); + return command.ExecuteScalar() is bool exists && exists; + } + + private static bool IndexExists(NpgsqlConnection connection) + { + using var command = connection.CreateCommand(); + command.CommandText = """ + SELECT EXISTS ( + SELECT 1 + FROM pg_class i + JOIN pg_namespace n ON n.oid = i.relnamespace + WHERE n.nspname = @schema + AND i.relname = @constraint + AND i.relkind = 'i' + ) + """; + AddObjectParameters(command); + return command.ExecuteScalar() is bool exists && exists; + } + + private static void AddObjectParameters(NpgsqlCommand command) + { + command.Parameters.AddWithValue("@schema", SchemaName); + command.Parameters.AddWithValue("@table", TableName); + command.Parameters.AddWithValue("@constraint", ConstraintName); + } +} diff --git a/Migration/Nimblesite.DataProvider.Migration.Tests/PostgresFunctionBodyLqlTests.cs b/Migration/Nimblesite.DataProvider.Migration.Tests/PostgresFunctionBodyLqlTests.cs index f3b768d3..bb1d60f0 100644 --- a/Migration/Nimblesite.DataProvider.Migration.Tests/PostgresFunctionBodyLqlTests.cs +++ b/Migration/Nimblesite.DataProvider.Migration.Tests/PostgresFunctionBodyLqlTests.cs @@ -50,6 +50,22 @@ public void TranslatePostgresBody_ExistsPipeline_EmitsSelectExists() Assert.Contains("tenant_id = t", sql, 
StringComparison.Ordinal); } + [Fact] + public void TranslatePostgresBody_ExistsPipelineWithInList_EmitsInPredicate() + { + // Implements [LQL-PREDICATE-IN-LIST]. + var sql = Body( + """ + exists( + tenant_members + |> filter(fn(m) => m.user_id = u and m.tenant_id = t and m.role in ('owner', 'admin')) + ) + """ + ); + + Assert.Contains("role IN ('owner', 'admin')", sql, StringComparison.Ordinal); + } + [Fact] public void FromYaml_FunctionBodyLql_Deserializes() { diff --git a/Migration/Nimblesite.DataProvider.Migration.Tests/SchemaYamlSerializerTests.cs b/Migration/Nimblesite.DataProvider.Migration.Tests/SchemaYamlSerializerTests.cs index 57e3aa9c..d8762859 100644 --- a/Migration/Nimblesite.DataProvider.Migration.Tests/SchemaYamlSerializerTests.cs +++ b/Migration/Nimblesite.DataProvider.Migration.Tests/SchemaYamlSerializerTests.cs @@ -312,7 +312,7 @@ public void CheckConstraint_SerializesCorrectly() .Column( "Price", PortableTypes.Decimal(10, 2), - c => c.NotNull().Check("Price >= 0") + c => c.NotNull().Check("CK_Products_Price", "Price >= 0") ) .Column("Quantity", PortableTypes.Int, c => c.NotNull()) .Check("CK_Products_Quantity", "Quantity >= 0") @@ -328,6 +328,7 @@ public void CheckConstraint_SerializesCorrectly() var priceCol = table.Columns.First(c => c.Name == "Price"); Assert.Equal("Price >= 0", priceCol.CheckConstraint); + Assert.Equal("CK_Products_Price", priceCol.CheckConstraintName); Assert.Single(table.CheckConstraints); Assert.Equal("CK_Products_Quantity", table.CheckConstraints[0].Name); diff --git a/Reporting/Nimblesite.Reporting.React/wwwroot/js/vendor/react-dom.development.js b/Reporting/Nimblesite.Reporting.React/wwwroot/js/vendor/react-dom.development.js new file mode 100644 index 00000000..57a309ce --- /dev/null +++ b/Reporting/Nimblesite.Reporting.React/wwwroot/js/vendor/react-dom.development.js @@ -0,0 +1,29924 @@ +/** + * @license React + * react-dom.development.js + * + * Copyright (c) Facebook, Inc. and its affiliates. 
+ * + * This source code is licensed under the MIT license found in the + * LICENSE file in the root directory of this source tree. + */ +(function (global, factory) { + typeof exports === 'object' && typeof module !== 'undefined' ? factory(exports, require('react')) : + typeof define === 'function' && define.amd ? define(['exports', 'react'], factory) : + (global = global || self, factory(global.ReactDOM = {}, global.React)); +}(this, (function (exports, React) { 'use strict'; + + var ReactSharedInternals = React.__SECRET_INTERNALS_DO_NOT_USE_OR_YOU_WILL_BE_FIRED; + + var suppressWarning = false; + function setSuppressWarning(newSuppressWarning) { + { + suppressWarning = newSuppressWarning; + } + } // In DEV, calls to console.warn and console.error get replaced + // by calls to these methods by a Babel plugin. + // + // In PROD (or in packages without access to React internals), + // they are left as they are instead. + + function warn(format) { + { + if (!suppressWarning) { + for (var _len = arguments.length, args = new Array(_len > 1 ? _len - 1 : 0), _key = 1; _key < _len; _key++) { + args[_key - 1] = arguments[_key]; + } + + printWarning('warn', format, args); + } + } + } + function error(format) { + { + if (!suppressWarning) { + for (var _len2 = arguments.length, args = new Array(_len2 > 1 ? _len2 - 1 : 0), _key2 = 1; _key2 < _len2; _key2++) { + args[_key2 - 1] = arguments[_key2]; + } + + printWarning('error', format, args); + } + } + } + + function printWarning(level, format, args) { + // When changing this logic, you might want to also + // update consoleWithStackDev.www.js as well. 
+ { + var ReactDebugCurrentFrame = ReactSharedInternals.ReactDebugCurrentFrame; + var stack = ReactDebugCurrentFrame.getStackAddendum(); + + if (stack !== '') { + format += '%s'; + args = args.concat([stack]); + } // eslint-disable-next-line react-internal/safe-string-coercion + + + var argsWithFormat = args.map(function (item) { + return String(item); + }); // Careful: RN currently depends on this prefix + + argsWithFormat.unshift('Warning: ' + format); // We intentionally don't use spread (or .apply) directly because it + // breaks IE9: https://github.com/facebook/react/issues/13610 + // eslint-disable-next-line react-internal/no-production-logging + + Function.prototype.apply.call(console[level], console, argsWithFormat); + } + } + + var FunctionComponent = 0; + var ClassComponent = 1; + var IndeterminateComponent = 2; // Before we know whether it is function or class + + var HostRoot = 3; // Root of a host tree. Could be nested inside another node. + + var HostPortal = 4; // A subtree. Could be an entry point to a different renderer. + + var HostComponent = 5; + var HostText = 6; + var Fragment = 7; + var Mode = 8; + var ContextConsumer = 9; + var ContextProvider = 10; + var ForwardRef = 11; + var Profiler = 12; + var SuspenseComponent = 13; + var MemoComponent = 14; + var SimpleMemoComponent = 15; + var LazyComponent = 16; + var IncompleteClassComponent = 17; + var DehydratedFragment = 18; + var SuspenseListComponent = 19; + var ScopeComponent = 21; + var OffscreenComponent = 22; + var LegacyHiddenComponent = 23; + var CacheComponent = 24; + var TracingMarkerComponent = 25; + + // ----------------------------------------------------------------------------- + + var enableClientRenderFallbackOnTextMismatch = true; // TODO: Need to review this code one more time before landing + // the react-reconciler package. + + var enableNewReconciler = false; // Support legacy Primer support on internal FB www + + var enableLazyContextPropagation = false; // FB-only usage. 
The new API has different semantics. + + var enableLegacyHidden = false; // Enables unstable_avoidThisFallback feature in Fiber + + var enableSuspenseAvoidThisFallback = false; // Enables unstable_avoidThisFallback feature in Fizz + // React DOM Chopping Block + // + // Similar to main Chopping Block but only flags related to React DOM. These are + // grouped because we will likely batch all of them into a single major release. + // ----------------------------------------------------------------------------- + // Disable support for comment nodes as React DOM containers. Already disabled + // in open source, but www codebase still relies on it. Need to remove. + + var disableCommentsAsDOMContainers = true; // Disable javascript: URL strings in href for XSS protection. + // and client rendering, mostly to allow JSX attributes to apply to the custom + // element's object properties instead of only HTML attributes. + // https://github.com/facebook/react/issues/11347 + + var enableCustomElementPropertySupport = false; // Disables children for