
feat(cli): add GitHub repository package installation#238

Open
khaliqgant wants to merge 10 commits into main from
claude/github-package-download-nwPCq

Conversation

@khaliqgant
Collaborator

@khaliqgant khaliqgant commented Dec 22, 2025

User description

Adds ability to install packages directly from GitHub repositories
without requiring a prpm.json manifest. Auto-discovers AI tool
configurations by scanning for known file patterns.

Supported syntax:

  • prpm install owner/repo
  • prpm install github:owner/repo
  • prpm install owner/repo@tag
  • prpm install owner/repo:path/to/file

Features:

  • Downloads via GitHub tarball API (no git required)
  • Auto-discovers .cursor/rules, .claude/skills, CLAUDE.md, etc.
  • Tracks commit SHA in lockfile for reproducibility
  • Supports GITHUB_TOKEN for private repos
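A minimal sketch of how the supported spec forms could be parsed (hypothetical helper and field names; the actual parser in packages/cli/src/core/github.ts may differ):

```typescript
// Hypothetical parser for GitHub install specs; shapes are assumptions.
interface GitHubSpec {
  owner: string;
  repo: string;
  ref?: string;      // branch, tag, or commit from the "@ref" suffix
  filePath?: string; // specific file from the ":path" suffix
}

function parseGitHubSpec(input: string): GitHubSpec | null {
  // Strip recognized prefixes (github:, gh:, full github.com URL).
  const stripped = input
    .replace(/^github:/, '')
    .replace(/^gh:/, '')
    .replace(/^https?:\/\/github\.com\//, '');
  // owner/repo[@ref][:path]
  const match = stripped.match(/^([^/@:]+)\/([^/@:]+)(?:@([^:]+))?(?::(.+))?$/);
  if (!match) return null;
  const [, owner, repo, ref, filePath] = match;
  return { owner, repo, ref, filePath };
}
```

A registry package name like `lodash` contains no slash, so it falls through to `null` and is handled by the normal registry install path.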

CodeAnt-AI Description

Install AI-tool packages directly from GitHub repositories with caching and discovery

What Changed

  • New CLI support to install from explicit GitHub specs (github:, gh:, or full github.com URL), including @ref and :path variants; PRPM auto-discovers rules, skills, agents, and other AI-tool files inside repos and can install a specific file or all discovered packages.
  • Downloads use GitHub tarball API (no git needed) and follow a three-tier cache: local (~/.prpm/cache), shared registry cache, then GitHub; installs record the resolved commit SHA in the lockfile and skip reinstallation when the same commit is already present (use --force to override).
  • Registry gained a github-cache API: endpoints to GET/POST tarballs, view cache stats, and register discovered GitHub packages with content-hash deduplication; server-side cache enforces size/TTL and LRU eviction.
  • Lockfile and install output now include GitHub source metadata (owner, repo, ref, commit SHA, source path); documentation and CLI help updated with examples and private-repo token guidance.
  • Unit and integration tests added for GitHub spec parsing, tarball caching and registry cache routes, package registration, integrity hashing, and discovery behavior.
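The three-tier cache hierarchy described above can be sketched as follows (an illustrative model with in-memory stand-ins for the local cache, registry cache, and GitHub API; not the actual implementation):

```typescript
// Illustrative three-tier tarball lookup: local disk -> registry cache -> GitHub.
type CacheSource = 'local' | 'registry' | 'github';

interface TarballResult {
  data: Uint8Array;
  cacheSource: CacheSource; // surfaced in install output as "(cached)" etc.
}

async function fetchTarball(
  sha: string,
  local: Map<string, Uint8Array>,                 // stands in for ~/.prpm/cache
  registry: Map<string, Uint8Array>,              // stands in for the registry github-cache API
  github: (sha: string) => Promise<Uint8Array>,   // stands in for the GitHub tarball API
): Promise<TarballResult> {
  const hit = local.get(sha);
  if (hit) return { data: hit, cacheSource: 'local' };

  const remote = registry.get(sha);
  if (remote) {
    local.set(sha, remote); // backfill the local cache for next time
    return { data: remote, cacheSource: 'registry' };
  }

  const fresh = await github(sha);
  local.set(sha, fresh);
  return { data: fresh, cacheSource: 'github' };
}
```

Keying by commit SHA is what makes the lockfile's "skip reinstall at the same commit" check cheap: the same SHA always maps to the same tarball bytes.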

Impact

✅ Fewer GitHub API calls
✅ Shorter repeated installs from the same commit
✅ Reproducible GitHub-sourced installs (commit SHA recorded)

💡 Usage Guide

Checking Your Pull Request

Every time you make a pull request, our system automatically looks through it. We check for security issues, mistakes in how you're setting up your infrastructure, and common code problems. We do this to make sure your changes are solid and won't cause any trouble later.

Talking to CodeAnt AI

Got a question or need a hand with something in your pull request? You can easily get in touch with CodeAnt AI right here. Just type the following in a comment on your pull request, and replace "Your question here" with whatever you want to ask:

@codeant-ai ask: Your question here

This lets you have a chat with CodeAnt AI about your pull request, making it easier to understand and improve your code.

Example

@codeant-ai ask: Can you suggest a safer alternative to storing this secret?

Preserve Org Learnings with CodeAnt

You can record team preferences so CodeAnt AI applies them in future reviews. Reply directly to the specific CodeAnt AI suggestion (in the same thread) and replace "Your feedback here" with your input:

@codeant-ai: Your feedback here

This helps CodeAnt AI learn and adapt to your team's coding style and standards.

Example

@codeant-ai: Do not flag unused imports.

Retrigger review

Ask CodeAnt AI to review the PR again, by typing:

@codeant-ai: review

Check Your Repository Health

To analyze the health of your code repository, visit our dashboard at https://app.codeant.ai. This tool helps you identify potential issues and areas for improvement in your codebase, ensuring your repository maintains high standards of code health.

Additional changes from subsequent commits:
- Remove auto-detection of owner/repo pattern to avoid ambiguity
- Only github:, gh:, and https://github.com URLs trigger GitHub install
- Add package count summary grouped by format
- Prompt user to confirm when multiple packages found
- Show how to install specific files if user declines
- Cache downloaded tarballs in ~/.prpm/cache/github/{owner}/{repo}/{sha}.tar.gz
- Check lockfile for previously installed commit SHA
- Skip reinstall if commit SHA unchanged (use -y to force)
- Show "(cached)" indicator when serving from cache
- Add github-cache route in registry for shared tarball caching
- Implement 3-tier cache hierarchy: local -> registry -> GitHub
- Add cacheSource field to show where tarball was loaded from
- Auto-register discovered GitHub packages to PRPM registry
- Display cache source (local/registry/GitHub) in install output
The shared tarball cache already provides caching benefits for all users.
Auto-publishing to registry would incorrectly attribute packages to the
downloading user instead of the GitHub repository owner.
- Add /register endpoint for GitHub package registration
- Packages are attributed to GitHub owner, not downloading user
- Deduplication by content hash prevents duplicate storage
- Extract license info from repository
- Fire-and-forget registration doesn't block installs
Unit tests (CLI):
- generateRegistryPackageId: lowercase package ID generation
- computeIntegrity: SHA-256 hash computation for tarballs

Integration tests (Registry):
- GET /github-cache/:owner/:repo/:sha: cache hit/miss, LRU tracking
- POST /github-cache/:owner/:repo/:sha: store, skip if exists
- GET /github-cache/stats: cache statistics
- POST /github-cache/register: new package, duplicate detection, metadata storage
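The two unit-tested CLI helpers might look roughly like this (sketches only; the exact ID shape and integrity format are assumptions, not confirmed by the PR):

```typescript
import { createHash } from 'node:crypto';

// Sketch: lowercase registry package ID derived from GitHub coordinates.
// The "@owner/repo-name" shape is a hypothetical convention.
function generateRegistryPackageId(owner: string, repo: string, name: string): string {
  return `@${owner}/${repo}-${name}`.toLowerCase();
}

// Sketch: SHA-256 integrity hash for a downloaded tarball, in an
// npm-style "sha256-<base64>" form (format is an assumption).
function computeIntegrity(tarball: Buffer): string {
  return `sha256-${createHash('sha256').update(tarball).digest('base64')}`;
}
```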
@codeant-ai

codeant-ai bot commented Dec 22, 2025

CodeAnt AI is reviewing your PR.


Thanks for using CodeAnt! 🎉

We're free for open-source projects. If you're enjoying it, help us grow by sharing.


@codeant-ai codeant-ai bot added the size:XXL This PR changes 1000+ lines, ignoring generated files label Dec 22, 2025
@my-senior-dev-pr-review

my-senior-dev-pr-review bot commented Dec 22, 2025

🤖 My Senior Dev — Analysis Complete

👤 For @khaliqgant

📁 Expert in packages/ (190 edits) • ⚡ 20th PR this month

View your contributor analytics →


📊 12 files reviewed • 8 high risk • 5 need attention

🚨 High Risk:

  • packages/cli/src/commands/install.ts — Significant modifications to installation logic and user input handling; flagged for potential security vulnerabilities.

⚠️ Needs Attention:

  • packages/cli/src/__tests__/github.test.ts — New tests were added to ensure the reliability of GitHub package installation functionality, which are essential for both quality assurance and security verification.

🚀 Open Interactive Review →

The full interface unlocks features not available in GitHub:

  • 💬 AI Chat — Ask questions on any file, get context-aware answers
  • 🔍 Smart Hovers — See symbol definitions and usage without leaving the diff
  • 📚 Code Archeology — Understand how files evolved over time (/archeology)
  • 🎯 Learning Insights — See how this PR compares to similar changes

💬 Chat here: @my-senior-dev explain this change — or try @chaos-monkey @security-auditor @optimizer @skeptic @junior-dev

📖 View all 12 personas & slash commands

You can interact with me by mentioning @my-senior-dev in any comment:

In PR comments or on any line of code:

  • Ask questions about the code or PR
  • Request explanations of specific changes
  • Get suggestions for improvements

Slash commands:

  • /help — Show all available commands
  • /archeology — See the history and evolution of changed files
  • /profile — Performance analysis and suggestions
  • /expertise — Find who knows this code best
  • /personas — List all available AI personas

AI Personas (mention to get their perspective):

Persona Focus
@chaos-monkey 🐵 Edge cases & failure scenarios
@skeptic 🤨 Challenge assumptions
@optimizer Performance & efficiency
@security-auditor 🔒 Security vulnerabilities
@accessibility-advocate Inclusive design
@junior-dev 🌱 Simple explanations
@tech-debt-collector 💳 Code quality & shortcuts
@ux-champion 🎨 User experience
@devops-engineer 🚀 Deployment & scaling
@documentation-nazi 📚 Documentation gaps
@legacy-whisperer 🏛️ Working with existing code
@test-driven-purist Testing & TDD

For the best experience, view this PR on myseniordev.com — includes AI chat, file annotations, and interactive reviews.


@cubic-dev-ai cubic-dev-ai bot left a comment


3 issues found across 7 files

Prompt for AI agents (all issues)

Check if these issues are valid — if so, understand the root cause of each and fix them.


<file name="packages/cli/src/commands/install.ts">

<violation number="1" location="packages/cli/src/commands/install.ts:518">
P1: Path traversal vulnerability: `file.path` from `additionalFiles` is not validated before being used in `path.join`. A malicious repository could use paths like `../../../.bashrc` to write files outside the intended directory. Consider reusing the existing `hasUnsafePathPatterns` and `isPathSafe` functions that are already used for tarball extraction.</violation>
</file>

<file name="packages/cli/src/core/github.ts">

<violation number="1" location="packages/cli/src/core/github.ts:109">
P2: Duplicate pattern for `AGENTS.md` - the codex format pattern on this line will never match because an identical pattern for 'agents.md' format appears earlier in the array (line 94), and the code breaks after the first match. Consider using a different pattern like `/^codex\.md$/i` or removing one of the duplicates.</violation>
</file>

<file name="packages/registry/src/routes/github-cache.ts">

<violation number="1" location="packages/registry/src/routes/github-cache.ts:325">
P1: Missing TTL on stored package data. Unlike the tarball cache which uses `setex` with TTL and LRU eviction, the hash, metadata, and content keys in `/register` are stored with `set` and never expire. This could lead to unbounded Redis memory growth. Consider using `setex` with a TTL or implementing LRU eviction similar to the tarball cache.</violation>
</file>
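The containment check the first violation asks for could look like this (an illustrative equivalent of the `isPathSafe` helper the review references, not the project's actual implementation):

```typescript
import * as path from 'node:path';

// Sketch: a relative path is safe only if it resolves inside the base
// directory. Absolute paths and ".." escapes are rejected.
function isPathSafe(baseDir: string, relPath: string): boolean {
  if (path.isAbsolute(relPath)) return false;
  const resolvedBase = path.resolve(baseDir);
  const resolved = path.resolve(resolvedBase, relPath);
  const rel = path.relative(resolvedBase, resolved);
  // Empty means the base itself; a leading ".." means it escaped.
  return rel !== '' && !rel.startsWith('..') && !path.isAbsolute(rel);
}
```

Checking the *resolved* path (rather than scanning the string for "..") also catches sneakier forms like `a/../../b`.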

Reply to cubic to teach it or ask questions. Re-run a review with @cubic-dev-ai review this PR

if (pkg.additionalFiles) {
const pkgDir = path.dirname(destPath);
for (const file of pkg.additionalFiles) {
const fileDest = path.join(pkgDir, file.path);

@cubic-dev-ai cubic-dev-ai bot Dec 22, 2025


P1: Path traversal vulnerability: file.path from additionalFiles is not validated before being used in path.join. A malicious repository could use paths like ../../../.bashrc to write files outside the intended directory. Consider reusing the existing hasUnsafePathPatterns and isPathSafe functions that are already used for tarball extraction.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At packages/cli/src/commands/install.ts, line 518:

<comment>Path traversal vulnerability: `file.path` from `additionalFiles` is not validated before being used in `path.join`. A malicious repository could use paths like `../../../.bashrc` to write files outside the intended directory. Consider reusing the existing `hasUnsafePathPatterns` and `isPathSafe` functions that are already used for tarball extraction.</comment>

<file context>
@@ -256,6 +265,375 @@ function findMainFile(
+        if (pkg.additionalFiles) {
+          const pkgDir = path.dirname(destPath);
+          for (const file of pkg.additionalFiles) {
+            const fileDest = path.join(pkgDir, file.path);
+            await fs.mkdir(path.dirname(fileDest), { recursive: true });
+            await fs.writeFile(fileDest, file.content, 'utf-8');
</file context>
Suggested change
-            const fileDest = path.join(pkgDir, file.path);
+            if (hasUnsafePathPatterns(file.path) || !isPathSafe(pkgDir, file.path)) {
+              console.log(`  ${chalk.yellow('⚠️ Skipped unsafe path:')} ${file.path}`);
+              continue;
+            }
+            const fileDest = path.join(pkgDir, file.path);

Comment thread packages/cli/src/core/github.ts Outdated
}

// Store content hash -> package ID mapping for deduplication
await server.redis.set(hashKey, packageId);

@cubic-dev-ai cubic-dev-ai bot Dec 22, 2025


P1: Missing TTL on stored package data. Unlike the tarball cache which uses setex with TTL and LRU eviction, the hash, metadata, and content keys in /register are stored with set and never expire. This could lead to unbounded Redis memory growth. Consider using setex with a TTL or implementing LRU eviction similar to the tarball cache.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At packages/registry/src/routes/github-cache.ts, line 325:

<comment>Missing TTL on stored package data. Unlike the tarball cache which uses `setex` with TTL and LRU eviction, the hash, metadata, and content keys in `/register` are stored with `set` and never expire. This could lead to unbounded Redis memory growth. Consider using `setex` with a TTL or implementing LRU eviction similar to the tarball cache.</comment>

<file context>
@@ -0,0 +1,365 @@
+      }
+
+      // Store content hash -> package ID mapping for deduplication
+      await server.redis.set(hashKey, packageId);
+
+      // Store package metadata
</file context>

Comment on lines +400 to +401
try {
const content = await readFile(join(rootDir, spec.filePath), 'utf-8');


Suggestion: The spec.filePath value is joined directly with the extracted repo root and passed to readFile without any validation, so a user (or malicious command) can supply absolute or ..-containing paths (e.g. github:owner/repo:../../etc/passwd or github:owner/repo:/etc/passwd) that cause the CLI to read arbitrary local files outside the cloned repository and potentially leak their contents (e.g. via later registry upload); you should validate that the resolved path stays within the extracted repository directory before reading it. [security]

Severity Level: Critical 🚨

Suggested change
-  try {
-    const content = await readFile(join(rootDir, spec.filePath), 'utf-8');
+  // Resolve and validate that the requested path stays within the extracted repo
+  const requestedPath = join(rootDir, spec.filePath);
+  const relativeToRoot = relative(rootDir, requestedPath);
+  if (relativeToRoot.startsWith('..')) {
+    throw new Error(`Invalid file path (must be inside repository): ${spec.filePath}`);
+  }
+  try {
+    const content = await readFile(requestedPath, 'utf-8');
Why it matters? ⭐

The suggestion points out a real directory-traversal/absolute-path risk: join(rootDir, spec.filePath) will happily resolve "../something" or an absolute path and read files outside the extracted repo. The proposed guard using relative(rootDir, requestedPath) and rejecting paths that start with ".." (and implicitly handling absolute paths which produce a leading '..' when relative) fixes the vulnerability with minimal change. This is a security fix, not a cosmetic one, and directly relates to the code in the PR.

Prompt for AI Agent 🤖
This is a comment left during a code review.

**Path:** packages/cli/src/core/github.ts
**Line:** 400:401
**Comment:**
	*Security: The `spec.filePath` value is joined directly with the extracted repo root and passed to `readFile` without any validation, so a user (or malicious command) can supply absolute or `..`-containing paths (e.g. `github:owner/repo:../../etc/passwd` or `github:owner/repo:/etc/passwd`) that cause the CLI to read arbitrary local files outside the cloned repository and potentially leak their contents (e.g. via later registry upload); you should validate that the resolved path stays within the extracted repository directory before reading it.

Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.

*/

import { FastifyInstance, FastifyRequest, FastifyReply } from 'fastify';
import { config } from '../config.js';


Suggestion: The imported config value from ../config.js is never used in this module, which is dead code and may cause linting or build failures in setups that treat unused imports as errors; it should be removed. [code quality]

Severity Level: Minor ⚠️

Suggested change
-import { config } from '../config.js';
Why it matters? ⭐

The import of config is unused in this module (the file never references config). Removing the import is a small, correct cleanup that avoids linter warnings or build errors in strict configurations. This is a safe code-quality improvement.

Prompt for AI Agent 🤖
This is a comment left during a code review.

**Path:** packages/registry/src/routes/github-cache.ts
**Line:** 7:7
**Comment:**
	*Code Quality: The imported `config` value from `../config.js` is never used in this module, which is dead code and may cause linting or build failures in setups that treat unused imports as errors; it should be removed.

Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.

description?: string;
};

try {


Suggestion: The /register endpoint accepts arbitrary content strings and writes them directly to Redis without any size limit, which allows a client to send a single extremely large payload and consume excessive memory or Redis storage, enabling a denial-of-service; you should enforce a reasonable maximum content length and reject bodies that exceed it. [security]

Severity Level: Critical 🚨

Suggested change
     try {
+      // Protect Redis and the server from overly large package bodies
+      const MAX_PACKAGE_CONTENT_SIZE = 1024 * 1024; // 1MB
+      if (body.content.length > MAX_PACKAGE_CONTENT_SIZE) {
+        return reply.status(400).send({
+          error: 'Package content too large',
+        });
+      }
Why it matters? ⭐

This is a valid security/resource-protection concern: the register endpoint accepts arbitrary content and writes it into Redis with no size checks. A malicious client could send very large content and exhaust memory or Redis storage. Adding an explicit maximum content size (and validating before any Redis writes) is a real hardening change, not merely stylistic.

Prompt for AI Agent 🤖
This is a comment left during a code review.

**Path:** packages/registry/src/routes/github-cache.ts
**Line:** 303:303
**Comment:**
	*Security: The `/register` endpoint accepts arbitrary `content` strings and writes them directly to Redis without any size limit, which allows a client to send a single extremely large payload and consume excessive memory or Redis storage, enabling a denial-of-service; you should enforce a reasonable maximum content length and reject bodies that exceed it.

Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.

Comment on lines +324 to +348
// Store content hash -> package ID mapping for deduplication
await server.redis.set(hashKey, packageId);

// Store package metadata
const metadataKey = `github:package:${packageId}`;
const metadata = {
owner: body.owner,
repo: body.repo,
commitSha: body.commitSha,
name: body.name,
format: body.format,
subtype: body.subtype,
sourcePath: body.sourcePath,
contentHash: body.contentHash,
license: body.license,
description: body.description,
createdAt: new Date().toISOString(),
};

await server.redis.set(metadataKey, JSON.stringify(metadata));

// Store content (for future retrieval if needed)
const contentKey = `github:content:${packageId}`;
await server.redis.set(contentKey, body.content);



Suggestion: The deduplication mapping from content hash to package ID is written to Redis before the package metadata and content; if either of those later writes fails, you end up with a hash that points to a package ID for which no metadata/content exists, and future registrations with the same hash will be treated as duplicates but remain unusable. The mapping should only be stored after the metadata and content are successfully written so the system cannot enter this inconsistent state. [logic error]

Severity Level: Minor ⚠️

Suggested change
-      // Store content hash -> package ID mapping for deduplication
-      await server.redis.set(hashKey, packageId);
-
       // Store package metadata
       const metadataKey = `github:package:${packageId}`;
       const metadata = {
         owner: body.owner,
         repo: body.repo,
         commitSha: body.commitSha,
         name: body.name,
         format: body.format,
         subtype: body.subtype,
         sourcePath: body.sourcePath,
         contentHash: body.contentHash,
         license: body.license,
         description: body.description,
         createdAt: new Date().toISOString(),
       };
       await server.redis.set(metadataKey, JSON.stringify(metadata));

       // Store content (for future retrieval if needed)
       const contentKey = `github:content:${packageId}`;
       await server.redis.set(contentKey, body.content);
+
+      // Store content hash -> package ID mapping for deduplication (only after successful writes)
+      await server.redis.set(hashKey, packageId);
Why it matters? ⭐

The suggestion points out a real logic flaw in the current PR: the content-hash -> packageId mapping is written before metadata/content, which can leave Redis in an inconsistent state if subsequent writes fail. Reordering the writes (or using a Redis transaction/multi/exec) to persist metadata and content first, then set the hash mapping, prevents creating unreachable "duplicates". This is a correctness fix, not a cosmetic change.

Prompt for AI Agent 🤖
This is a comment left during a code review.

**Path:** packages/registry/src/routes/github-cache.ts
**Line:** 324:348
**Comment:**
	*Logic Error: The deduplication mapping from content hash to package ID is written to Redis before the package metadata and content; if either of those later writes fails, you end up with a hash that points to a package ID for which no metadata/content exists, and future registrations with the same hash will be treated as duplicates but remain unusable. The mapping should only be stored after the metadata and content are successfully written so the system cannot enter this inconsistent state.

Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.
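Combining this ordering fix with the earlier TTL finding, the register path could be sketched like this (the redis interface is a minimal stand-in and the one-week TTL is an assumed value, not from the PR):

```typescript
// Minimal stand-in for the subset of the redis client used here.
interface KV {
  setex(key: string, ttlSeconds: number, value: string): Promise<void>;
}

const PACKAGE_TTL_SECONDS = 7 * 24 * 3600; // assumed one-week TTL

// Persist metadata and content before the dedup mapping, so a failed
// write cannot leave a hash that points at a package with no stored data.
async function registerPackage(
  redis: KV,
  packageId: string,
  contentHash: string,
  metadataJson: string,
  content: string,
): Promise<void> {
  await redis.setex(`github:package:${packageId}`, PACKAGE_TTL_SECONDS, metadataJson);
  await redis.setex(`github:content:${packageId}`, PACKAGE_TTL_SECONDS, content);
  // Dedup mapping last: only reachable once metadata/content exist
  await redis.setex(`github:hash:${contentHash}`, PACKAGE_TTL_SECONDS, packageId);
}
```

A Redis MULTI/EXEC transaction would make the three writes atomic; the reordering alone at least guarantees the mapping never dangles.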

Comment on lines +407 to +411
if (cachedCommitSha && cachedCommitSha === result.commitSha && !options.force) {
console.log(`\n${chalk.green('✓')} Already up to date at commit ${chalk.dim(result.commitSha.slice(0, 7))}`);
console.log(` ${chalk.dim('Use -y or --force to reinstall')}`);
return;
}


Suggestion: When a repository is already up to date, the message instructs the user to "Use -y or --force to reinstall", but the CLI only defines -y, --yes (no --force flag), so following the guidance will cause an unknown-option error instead of working as described; the message should reference the actual --yes flag. [logic error]

Severity Level: Minor ⚠️

Suggested change
     if (cachedCommitSha && cachedCommitSha === result.commitSha && !options.force) {
       console.log(`\n${chalk.green('✓')} Already up to date at commit ${chalk.dim(result.commitSha.slice(0, 7))}`);
-      console.log(`  ${chalk.dim('Use -y or --force to reinstall')}`);
+      console.log(`  ${chalk.dim('Use -y or --yes to reinstall')}`);
       return;
     }
Why it matters? ⭐

The CLI defines '-y, --yes' for auto-confirming prompts and the action later uses options.yes to set force on installs; there is no global '--force' CLI option declared in this command. Instructing users to use '--force' is incorrect and will produce an unknown-option error. Updating the message to reference '-y' or '--yes' aligns documentation with actual flags.

Prompt for AI Agent 🤖
This is a comment left during a code review.

**Path:** packages/cli/src/commands/install.ts
**Line:** 407:411
**Comment:**
	*Logic Error: When a repository is already up to date, the message instructs the user to "Use -y or --force to reinstall", but the CLI only defines `-y, --yes` (no `--force` flag), so following the guidance will cause an unknown-option error instead of working as described; the message should reference the actual `--yes` flag.

Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.

Comment on lines +414 to +422
console.log(`\n${chalk.yellow('⚠️ No AI tool configurations found in this repository.')}`);
console.log(`\n Looking for:`);
console.log(` • ${chalk.dim('.cursor/rules/*.mdc')}`);
console.log(` • ${chalk.dim('.claude/skills/**/*.md')}`);
console.log(` • ${chalk.dim('CLAUDE.md, AGENTS.md')}`);
console.log(` • ${chalk.dim('.cursorrules, .windsurfrules')}`);
console.log(` • ${chalk.dim('and more...')}`);
console.log(`\n ${chalk.dim('Specify a file path:')} prpm install ${repoName}:path/to/file.md`);
return;


Suggestion: The help text shown when no GitHub packages are discovered suggests running prpm install owner/repo:path/to/file.md, but looksLikeGitHubSpec only recognizes inputs with prefixes like github: or https://github.com/, so this example will be parsed as a registry package and fail; the example should include an explicit GitHub prefix so it is actually handled by the GitHub installer. [logic error]

Severity Level: Minor ⚠️

Suggested change
     console.log(`\n${chalk.yellow('⚠️ No AI tool configurations found in this repository.')}`);
     console.log(`\n  Looking for:`);
     console.log(`  • ${chalk.dim('.cursor/rules/*.mdc')}`);
     console.log(`  • ${chalk.dim('.claude/skills/**/*.md')}`);
     console.log(`  • ${chalk.dim('CLAUDE.md, AGENTS.md')}`);
     console.log(`  • ${chalk.dim('.cursorrules, .windsurfrules')}`);
     console.log(`  • ${chalk.dim('and more...')}`);
-    console.log(`\n  ${chalk.dim('Specify a file path:')} prpm install ${repoName}:path/to/file.md`);
+    console.log(`\n  ${chalk.dim('Specify a file path:')} prpm install github:${repoName}:path/to/file.md`);
     return;
Why it matters? ⭐

The message suggests "prpm install owner/repo:path" but the code only detects GitHub specs when they match the patterns parsed by looksLikeGitHubSpec (the code imports looksLikeGitHubSpec and the actual detection earlier expects prefixes like 'github:' or full URLs). The example as printed will be parsed as a registry package and not handled by the GitHub installer, so the help text is misleading. Changing the printed example to include the explicit 'github:' prefix fixes user confusion.

Prompt for AI Agent 🤖
This is a comment left during a code review.

**Path:** packages/cli/src/commands/install.ts
**Line:** 414:422
**Comment:**
	*Logic Error: The help text shown when no GitHub packages are discovered suggests running `prpm install owner/repo:path/to/file.md`, but `looksLikeGitHubSpec` only recognizes inputs with prefixes like `github:` or `https://github.com/`, so this example will be parsed as a registry package and fail; the example should include an explicit GitHub prefix so it is actually handled by the GitHub installer.

Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.

@codeant-ai

codeant-ai bot commented Dec 22, 2025

CodeAnt AI finished reviewing your PR.

@codeant-ai

codeant-ai bot commented Dec 22, 2025

💡 Enhance Your PR Reviews

We noticed that 3 features are not configured for this repository. Enabling them can help improve your code quality and workflow:

🚦 Quality Gates

Status: Quality Gates are not enabled at the organization level
Learn more about Quality Gates

🎫 Jira Ticket Compliance

Status: Jira credentials file not found. Please configure Jira integration in your settings
Learn more about Jira Integration

⚙️ Custom Rules

Status: No custom rules configured. Add rules via organization settings or .codeant/review.json in your repository
Learn more about Custom Rules


Want to enable these features? Contact your organization admin or check our documentation for setup instructions.

@codeant-ai

codeant-ai bot commented Jan 10, 2026

CodeAnt AI is running Incremental review




@cubic-dev-ai cubic-dev-ai bot left a comment


2 issues found across 8 files (changes from recent commits).

Prompt for AI agents (all issues)

Check if these issues are valid — if so, understand the root cause of each and fix them.


<file name="public-documentation/cli/commands.mdx">

<violation number="1" location="public-documentation/cli/commands.mdx:413">
P2: Example comment is misleading: says 'Skip prompts when installing multiple packages' but the command only installs one repo. Either fix the comment or show an actual multi-package scenario.</violation>

<violation number="2" location="public-documentation/cli/commands.mdx:431">
P2: Documentation incorrectly lists Codex format discovering 'AGENTS.md'. According to the implementation, Codex only discovers 'codex.md' - AGENTS.md is handled by the 'Agents.md' format.</violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.

# Convert to a specific format
prpm install github:anthropics/skills --as cursor

# Skip prompts when installing multiple packages

@cubic-dev-ai cubic-dev-ai bot Jan 10, 2026


P2: Example comment is misleading: says 'Skip prompts when installing multiple packages' but the command only installs one repo. Either fix the comment or show an actual multi-package scenario.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At public-documentation/cli/commands.mdx, line 413:

<comment>Example comment is misleading: says 'Skip prompts when installing multiple packages' but the command only installs one repo. Either fix the comment or show an actual multi-package scenario.</comment>

<file context>
@@ -360,6 +360,126 @@ packages/
+# Convert to a specific format
+prpm install github:anthropics/skills --as cursor
+
+# Skip prompts when installing multiple packages
+prpm install github:owner/repo -y
+```
</file context>
Suggested change

```diff
-# Skip prompts when installing multiple packages
+# Skip prompts during installation
```

| Kiro | `.kiro/steering/*.md`, `.kiro/agents/*.json` |
| Agents.md | `AGENTS.md` |
| Aider | `.aider.conf.yml`, `CONVENTIONS.md` |
| Codex | `AGENTS.md`, `codex.md` |

@cubic-dev-ai cubic-dev-ai bot Jan 10, 2026


P2: Documentation incorrectly lists Codex format discovering 'AGENTS.md'. According to the implementation, Codex only discovers 'codex.md' - AGENTS.md is handled by the 'Agents.md' format.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At public-documentation/cli/commands.mdx, line 431:

<comment>Documentation incorrectly lists Codex format discovering 'AGENTS.md'. According to the implementation, Codex only discovers 'codex.md' - AGENTS.md is handled by the 'Agents.md' format.</comment>

<file context>
@@ -360,6 +360,126 @@ packages/
+| Kiro | `.kiro/steering/*.md`, `.kiro/agents/*.json` |
+| Agents.md | `AGENTS.md` |
+| Aider | `.aider.conf.yml`, `CONVENTIONS.md` |
+| Codex | `AGENTS.md`, `codex.md` |
+
+**Caching:**
</file context>
Suggested change

```diff
-| Codex | `AGENTS.md`, `codex.md` |
+| Codex | `codex.md` |
```


codeant-ai bot commented Jan 16, 2026

CodeAnt AI is running Incremental review



@codeant-ai codeant-ai bot added size:XXL This PR changes 1000+ lines, ignoring generated files and removed size:XXL This PR changes 1000+ lines, ignoring generated files labels Jan 16, 2026
```ts
        eager: options.eager,
      });
    }
  }
```


Suggestion: GitHub-installed packages recorded in the lockfile cannot be faithfully reinstalled via installFromLockfile, because handleInstall only treats explicit github:/gh:/URL specs as GitHub sources and ignores lockfile entries with githubSource, so lockfile-driven reinstalls fall back to registry resolution using the derived package name and will either fail or install the wrong package instead of re-downloading from GitHub with the stored commit SHA. [logic error]

Severity Level: Critical 🚨
- ❌ Reinstalling GitHub-sourced packages from prpm.lock fails
- ⚠️ Lockfile reproducibility guarantees broken for GitHub installs
- ⚠️ CLI install-from-lockfile path uses registry resolution instead
Suggested change

```ts
  }
} else {
  // If not an explicit GitHub spec, check lockfile for a GitHub-sourced entry
  const lockfileForGitHub = await readLockfile();
  if (lockfileForGitHub) {
    for (const [key, pkg] of Object.entries(lockfileForGitHub.packages)) {
      const { packageId: lockedPackageId } = parseLockfileKey(key);
      if (lockedPackageId === packageSpec && pkg.githubSource) {
        const githubSpecFromLock: GitHubSpec = {
          owner: pkg.githubSource.owner,
          repo: pkg.githubSource.repo,
          ref: pkg.githubSource.ref,
          filePath: pkg.githubSource.sourcePath,
        };
        return await handleGitHubInstall(githubSpecFromLock, {
          as: options.as,
          subtype: options.subtype,
          force: options.force,
          location: options.location,
          noAppend: options.noAppend,
          manifestFile: options.manifestFile,
          eager: options.eager,
        });
      }
    }
  }
```
Steps of Reproduction ✅
1. Start with a project that previously installed a GitHub-sourced package using
handleGitHubInstall (see packages/cli/src/commands/install.ts:399-586) so prpm.lock
contains an entry with githubSource (verify by inspecting prpm.lock produced by a GitHub
install).

2. Run the CLI to reinstall from the lockfile path: call handleInstall(packageSpec, ...)
with packageSpec equal to the packageId recorded in prpm.lock (entry lookup occurs at
packages/cli/src/commands/install.ts:723-731 and lockfile read at 726-727).

3. Execution reaches the GitHub-spec check at packages/cli/src/commands/install.ts:687-701
(the existing snippet). Because the packageSpec string is the packageId (not a github:...
spec or URL), looksLikeGitHubSpec(packageSpec) returns false and the code continues
without checking the lockfile githubSource.

4. Later, handleInstall treats the package as a registry package and calls
getRegistryClient(...).getPackage(packageId)
(packages/cli/src/commands/install.ts:834-856). The registry lookup will either fail or
return a different package, so the CLI will attempt registry-based install instead of
re-downloading the original GitHub repo at the recorded commit SHA.

5. Observe incorrect installation: the installed content differs from the original
GitHub-sourced package, or installation fails because registry package doesn't exist. This
reproduces the logic bug described.
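For reference, a lockfile entry of the kind step 1 describes could look like the following. This is a hypothetical sketch: the field names under `githubSource` are inferred from the PR description and the suggested change, not copied from the actual prpm.lock schema.

```json
{
  "packages": {
    "anthropics/skills": {
      "version": "0.0.0-github",
      "githubSource": {
        "owner": "anthropics",
        "repo": "skills",
        "ref": "main",
        "commitSha": "0a1b2c3d4e5f67890123456789abcdef01234567",
        "sourcePath": ".claude/skills/my-skill/SKILL.md"
      }
    }
  }
}
```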
Prompt for AI Agent 🤖
This is a comment left during a code review.

**Path:** packages/cli/src/commands/install.ts
**Line:** 701:701
**Comment:**
	*Logic Error: GitHub-installed packages recorded in the lockfile cannot be faithfully reinstalled via `installFromLockfile`, because `handleInstall` only treats explicit `github:`/`gh:`/URL specs as GitHub sources and ignores lockfile entries with `githubSource`, so lockfile-driven reinstalls fall back to registry resolution using the derived package name and will either fail or install the wrong package instead of re-downloading from GitHub with the stored commit SHA.

Validate the correctness of the flagged issue. If correct, How can I resolve this? If you propose a fix, implement it and please make it concise.


codeant-ai bot commented Jan 16, 2026

CodeAnt AI Incremental review completed.


@cubic-dev-ai cubic-dev-ai bot left a comment


14 issues found across 95 files (changes from recent commits).

Note: This PR contains a large number of files. cubic only reviews up to 75 files per PR, so some files may not have been reviewed.

Prompt for AI agents (all issues)

Check if these issues are valid — if so, understand the root cause of each and fix them.


<file name=".claude/rules/types.md">

<violation number="1">
P3: The `Format` union example omits `"windsurf"`, yet the subsequent `Record<Format, …>` example includes a `windsurf` key; as written, the sample code will not compile. Update the union so every key used in later examples is part of `Format`.</violation>
</file>
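The compile failure this violation describes is easy to reproduce in isolation. The sketch below uses made-up union members rather than the project's real `Format` type; the point is only that a `Record<Format, …>` literal must cover exactly the members of the union, so any key used in a later example has to appear in the union itself.

```typescript
// Hypothetical members -- not the project's real Format union.
type Format = "cursor" | "claude" | "windsurf";

// Compiles only because "windsurf" is a member of Format above; if the
// union omitted it, this literal would fail with an excess-property error.
const labels: Record<Format, string> = {
  cursor: "Cursor",
  claude: "Claude",
  windsurf: "Windsurf",
};

console.log(labels.windsurf);
```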

<file name=".github/workflows/deploy.yml">

<violation number="1">
P2: Writing newline-separated `invalidation_paths` with plain `echo` truncates the GitHub output to the first path, so CloudFront invalidation skips most changed files.</violation>
</file>
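A common fix for this class of bug (a sketch, not the PR's actual workflow step) is GitHub Actions' delimiter syntax for multiline outputs, which preserves every line instead of truncating at the first newline:

```shell
#!/bin/sh
# Demo outside CI: point GITHUB_OUTPUT at a temp file, as the runner does.
GITHUB_OUTPUT="${GITHUB_OUTPUT:-$(mktemp)}"

paths="/docs/a.html
/docs/b.html
/docs/c.html"

# Multiline value: name<<DELIMITER ... DELIMITER, instead of a bare echo.
{
  echo "invalidation_paths<<PATHS_EOF"
  printf '%s\n' "$paths"
  echo "PATHS_EOF"
} >> "$GITHUB_OUTPUT"

cat "$GITHUB_OUTPUT"
```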

<file name=".claude/skills/creating-claude-commands/SKILL.md">

<violation number="1">
P2: The Quick Commit example repeats the type in the commit message and cannot produce the documented `type(scope): message` format because it commits `$1: $ARGUMENTS` while only accepting `<type> <message>`. Use just the message argument instead of `$ARGUMENTS` (or update the arguments to include a scope) so the generated commit follows the conventional commit spec.</violation>
</file>

<file name="packages/converters/schemas/amp-command.schema.json">

<violation number="1">
P1: `frontmatter.additionalProperties` is set to false while only `description` is defined, so valid Amp command fields like `name` and `argument-hint` will be rejected.</violation>
</file>
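A minimal sketch of the loosened frontmatter schema this violation asks for; the extra field names (`name`, `argument-hint`) are taken from the violation text, not from the Amp format specification itself:

```json
{
  "frontmatter": {
    "type": "object",
    "properties": {
      "description": { "type": "string" },
      "name": { "type": "string" },
      "argument-hint": { "type": "string" }
    },
    "additionalProperties": false
  }
}
```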

<file name="packages/types/src/package.ts">

<violation number="1">
P2: `Subtype` now includes "snippet", but no format in `FORMAT_SUBTYPES` allows that subtype, so every snippet package will fail validation.</violation>
</file>

<file name=".claude/skills/creating-claude-rules/SKILL.md">

<violation number="1">
P3: Quote the single-pattern example to stay consistent with the documented requirement that glob patterns must be quoted.</violation>
</file>

<file name=".claude/rules/converters.md">

<violation number="1">
P3: The `toFormat` example returns `content: result` without ever defining `result`, so the documented pattern is invalid TypeScript and will fail if copied. Provide or reference an actual converted value before returning the `ConversionResult`.</violation>
</file>

<file name="packages/converters/schemas/amp.schema.json">

<violation number="1">
P2: The schema never sets `"additionalProperties": false` at the root, so files with unsupported top-level keys still validate, defeating the purpose of the format validator.</violation>
</file>

<file name="packages/converters/src/taxonomy-utils.ts">

<violation number="1">
P2: `normalizeFormat` now matches any string containing the letters “amp”, so inputs like “example.md” or “sample rules” will be misclassified as the AMP format. Use a token/word-boundary check (or explicit prefixes such as `amp:`/`amp/`) so only true AMP identifiers are matched.</violation>
</file>
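One way to tighten the match, sketched here with an assumed helper name rather than the real `normalizeFormat` internals, is a token-boundary check so "amp" must stand alone or be delimited:

```typescript
// Sketch only: matches "amp" as a whole token ("amp", "amp:rules",
// "amp/config") but not as a substring ("example.md", "sample rules").
function isAmpToken(input: string): boolean {
  return /(^|[^a-z0-9])amp([^a-z0-9]|$)/i.test(input);
}

console.log(isAmpToken("amp"));        // true
console.log(isAmpToken("example.md")); // false
```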

<file name="packages/cli/src/commands/show.ts">

<violation number="1">
P2: `show` cannot display packages that are delivered as a gzipped single file (no tar) because it always streams into `tar.extract` and aborts on `TAR_BAD_ARCHIVE`, unlike the install path which falls back to treating the payload as a single file. Add the same fallback so single-file packages can be previewed.</violation>

<violation number="2">
P2: Version ranges/dist-tags (e.g. `^1.0.0` or `beta`) are passed straight to `getPackageVersion`, so `prpm show` fails for specs that `prpm install` accepts. Resolve semver ranges/tags to an exact version before downloading, just like the install command does.</violation>

<violation number="3">
P2: `--json` mode still emits the human-friendly console logs before the JSON payload, so the output is never valid JSON. Suppress or redirect the progress logs whenever JSON output is requested.</violation>
</file>

<file name="packages/converters/src/cross-converters/mcp-transformer.ts">

<violation number="1">
P2: `%APPDATA%` resolves to AppData\Roaming, not to the home directory. Translating it to `${home}` during the Kiro→Gemini conversion rewrites paths to the wrong location and breaks any commands relying on the AppData subtree.</violation>
</file>
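For concreteness: on Windows, `%APPDATA%` expands to the `AppData\Roaming` directory under the user profile, so a translation that keeps commands working maps it there rather than to the bare home directory. A sketch with a hypothetical helper name:

```typescript
// Hypothetical helper -- not the transformer's real API. Maps the Windows
// %APPDATA% variable to its actual location under the user profile.
function translateAppData(p: string, home: string): string {
  return p.replace(/%APPDATA%/g, `${home}/AppData/Roaming`);
}

console.log(translateAppData("%APPDATA%/kiro/settings.json", "/home/u"));
// -> /home/u/AppData/Roaming/kiro/settings.json
```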

<file name="packages/cli/src/commands/install.ts">

<violation number="1">
P2: Reinstall detection for snippets ignores their actual target file, so rerunning a snippet that defaults to CLAUDE.md (or any non‑AGENTS.md target) re-appends it instead of reporting it as already installed.</violation>
</file>

Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.

```diff
@@ -78,6 +78,15 @@ import {
   type CanonicalPackage,
```


P2: Reinstall detection for snippets ignores their actual target file, so rerunning a snippet that defaults to CLAUDE.md (or any non‑AGENTS.md target) re-appends it instead of reporting it as already installed.

Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At packages/cli/src/commands/install.ts, line 761:

<comment>Reinstall detection for snippets ignores their actual target file, so rerunning a snippet that defaults to CLAUDE.md (or any non‑AGENTS.md target) re-appends it instead of reporting it as already installed.</comment>

<file context>
@@ -750,23 +751,65 @@ export async function handleInstall(
+      let matchedKey: string | undefined;
+
+      // First, check for snippet installations at the requested location (or default AGENTS.md)
+      const snippetLocation = requestedLocation || 'AGENTS.md';
+      const snippetKey = getLockfileKey(packageId, targetFormat, snippetLocation);
+      if (lockfile.packages[snippetKey]) {
</file context>
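A sketch of the direction a fix could take, using hypothetical names (the real `getLockfileKey` signature is not shown in the context above): resolve the snippet's default target per format before building the lockfile key, instead of hard-coding `AGENTS.md`.

```typescript
// All names below are illustrative, inferred from the quoted diff context.
const DEFAULT_SNIPPET_TARGET: Record<string, string> = {
  claude: "CLAUDE.md",
  agentsmd: "AGENTS.md",
};

function getSnippetLockfileKey(
  packageId: string,
  targetFormat: string,
  requestedLocation?: string,
): string {
  const location =
    requestedLocation ?? DEFAULT_SNIPPET_TARGET[targetFormat] ?? "AGENTS.md";
  return `${packageId}::${targetFormat}::${location}`; // hypothetical key shape
}

console.log(getSnippetLockfileKey("acme/snippet", "claude"));
// -> acme/snippet::claude::CLAUDE.md
```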


Labels

size:XXL This PR changes 1000+ lines, ignoring generated files


2 participants