
feat(llm): add Perplexity as a model provider #12259

Open

jliounis wants to merge 1 commit into continuedev:main from jliounis:feat/perplexity-provider

Conversation


@jliounis jliounis commented Apr 29, 2026

Motivation

Continue's provider list covers 25+ inference services but Perplexity is missing. Perplexity's Sonar family (sonar, sonar-pro, sonar-reasoning, sonar-reasoning-pro) ships with built-in web search, which is particularly relevant for coding agents that need to research up-to-date documentation or APIs while answering — a common workflow in Continue.

The Perplexity API is OpenAI-compatible, so this PR mirrors the pattern used by other OpenAI-compatible providers like Groq, Together, OpenRouter, and Inception: the provider class subclasses OpenAI and only overrides providerName and apiBase. No custom request/response transformation is needed.
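The subclass-and-override pattern is small enough to sketch. The snippet below is illustrative only: `OpenAI` here is a stand-in for Continue's actual base class, and the `defaultOptions` field shape is an assumption for demonstration, not the real `LLMOptions` type.

```typescript
// Illustrative sketch of the provider pattern described above.
// "OpenAI" is a stand-in for Continue's real base class; the
// defaultOptions shape is an assumption, not the actual LLMOptions.

class OpenAI {
  static providerName = "openai";
  static defaultOptions: { apiBase?: string; model?: string } = {
    apiBase: "https://api.openai.com/v1/",
  };
}

// The Perplexity provider only overrides the provider name, the
// API base, and the default model; everything else is inherited.
class Perplexity extends OpenAI {
  static providerName = "perplexity";
  static defaultOptions = {
    apiBase: "https://api.perplexity.ai/",
    model: "sonar",
  };
}

console.log(Perplexity.providerName); // "perplexity"
console.log(Perplexity.defaultOptions.apiBase); // "https://api.perplexity.ai/"
```

Because no request or response shape differs from OpenAI's, there is nothing else to override.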

Changes

  • core/llm/llms/Perplexity.ts — provider class extending OpenAI with apiBase = "https://api.perplexity.ai/" and default model: "sonar"
  • core/llm/llms/index.ts — registered in LLMClasses (import + array entry)
  • core/llm/autodetect.ts — added "perplexity" to the list of providers that use OpenAI templating
  • packages/openai-adapters/src/types.ts — added "perplexity" to the OpenAIConfigSchema union
  • packages/openai-adapters/src/index.ts — added the case "perplexity" factory branch
  • extensions/vscode/config_schema.json — added "perplexity" to the provider enum, the model if/then block (Sonar models + AUTODETECT), and a markdown description
  • docs/customize/model-providers/more/perplexity.mdx — new provider page modeled on groq.mdx
  • docs/customize/model-providers/overview.mdx — added to the hosted-services table
  • docs/docs.json — added to the More providers sidebar group
  • core/llm/llms/Perplexity.vitest.ts — tests via the existing createOpenAISubclassTests helper
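With these changes in place, an end user would select the provider from Continue's config. A minimal sketch of such an entry follows; the field values are placeholders and the authoritative shape is the updated config_schema.json:

```json
{
  "models": [
    {
      "title": "Perplexity Sonar",
      "provider": "perplexity",
      "model": "sonar",
      "apiKey": "<PERPLEXITY_API_KEY>"
    }
  ]
}
```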

Tests

core/llm/llms/Perplexity.vitest.ts calls createOpenAISubclassTests(Perplexity, ...), the same helper used by Groq, Together, and other OpenAI-compatible providers. The helper generates 7 cases covering: providerName, default API base, streamChat, chat, streamComplete, complete, and embed against a mocked fetch. Run with:

```
cd core && npm run vitest -- llms/Perplexity
```
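To make the `apiBase` override concrete, the core property those mocked-fetch cases exercise is that every request route resolves against the provider's base URL. The sketch below illustrates that with a hypothetical `resolveEndpoint` helper; it is not part of the actual test utilities.

```typescript
// Sketch of the kind of check the mocked-fetch tests make: routes
// must resolve against the overridden apiBase. resolveEndpoint is
// a hypothetical helper, not the helper's real API.

function resolveEndpoint(apiBase: string, path: string): string {
  // WHATWG URL resolution: the relative route joins onto the base.
  return new URL(path, apiBase).toString();
}

const PERPLEXITY_API_BASE = "https://api.perplexity.ai/";

const chatUrl = resolveEndpoint(PERPLEXITY_API_BASE, "chat/completions");
console.log(chatUrl); // https://api.perplexity.ai/chat/completions
```

Note the trailing slash on the base URL: WHATWG resolution replaces the last path segment of the base, so OpenAI-compatible clients conventionally keep `apiBase` slash-terminated.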

Summary by cubic

Adds Perplexity as a first-class LLM provider using its OpenAI-compatible API, with default sonar model. This enables Sonar models (sonar, sonar-pro, sonar-reasoning, sonar-reasoning-pro) with built-in web search for fresher answers.

  • New Features
    • Added Perplexity provider extending OpenAI with apiBase: "https://api.perplexity.ai/" and default model: "sonar".
    • Registered in LLMClasses and added to OpenAI-templated autodetect providers.
    • Updated packages/openai-adapters types and factory to support perplexity.
    • Updated VS Code config_schema.json with provider enum, Sonar model list (AUTODETECT supported), and help text.
    • Added docs: provider page, overview entry, and sidebar link.
    • Added unit tests via createOpenAISubclassTests for core chat/complete/embed paths.

Written for commit f4bdacd.

Adds Perplexity as a first-class provider, exposing the Sonar family of
chat-completion models (sonar, sonar-pro, sonar-reasoning,
sonar-reasoning-pro). The Perplexity API is OpenAI-compatible, so the
provider extends the existing OpenAI class with a custom apiBase.

Sonar models include built-in web search, which is particularly useful
for coding agents that need to research up-to-date documentation or
APIs while answering.

- core/llm/llms/Perplexity.ts: provider class extending OpenAI
- core/llm/llms/index.ts: registered in LLMClasses
- core/llm/autodetect.ts: added "perplexity" to providers using OpenAI templating
- packages/openai-adapters: provider registered in schema and factory
- extensions/vscode/config_schema.json: enum, model list, and description
- docs: new provider page and overview/sidebar entries
- core/llm/llms/Perplexity.vitest.ts: unit tests via createOpenAISubclassTests

See https://docs.perplexity.ai/docs/getting-started for API details.
@jliounis jliounis requested a review from a team as a code owner April 29, 2026 21:32
@jliounis jliounis requested review from sestinj and removed request for a team April 29, 2026 21:32

github-actions Bot commented Apr 29, 2026

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.

@dosubot dosubot Bot added the size:M This PR changes 30-99 lines, ignoring generated files. label Apr 29, 2026

@cubic-dev-ai cubic-dev-ai Bot left a comment


No issues found across 10 files


jliounis commented May 4, 2026

I have read the CLA Document and I hereby sign the CLA


jliounis commented May 5, 2026

I have read the CLA Document and I hereby sign the CLA



Projects

Status: Todo
