Add Perplexity AI as a new LLM provider with full LLM Proxy and Chat support. Perplexity uses an OpenAI-compatible API with built-in web search (no external tool calling). Streaming chat works correctly with proper response handling.
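For reference, a minimal sketch of what streaming chat against Perplexity's OpenAI-compatible endpoint looks like, using the `openai` SDK pointed at `https://api.perplexity.ai` (the env var name and prompt are illustrative, not the exact client wiring in this PR):

```typescript
import OpenAI from "openai";

// Perplexity exposes an OpenAI-compatible API, so the standard SDK can be
// pointed at its base URL. The env var name here is illustrative.
const client = new OpenAI({
  apiKey: process.env.PERPLEXITY_API_KEY,
  baseURL: "https://api.perplexity.ai",
});

async function streamChat(prompt: string): Promise<void> {
  const stream = await client.chat.completions.create({
    model: "sonar-pro", // one of the hardcoded Perplexity models
    messages: [{ role: "user", content: prompt }],
    stream: true,
  });
  // Deltas arrive as streamed chunks; print content as it comes in.
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}

streamChat("Summarize today's top AI news.").catch(console.error);
```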
- Backend: Perplexity adapter, proxy routes, dual LLM client, error handling
- Frontend: API key form, model selector, provider icon
- Docs: updated supported providers documentation
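As a rough illustration of the proxy-route side (a sketch only; the route shape and error mapping are assumptions, not the actual Archestra handlers), requests are forwarded to Perplexity and upstream failures are surfaced as structured errors:

```typescript
// Plain fetch-based forwarder; Node 18+ provides a global fetch.
const PERPLEXITY_BASE_URL = "https://api.perplexity.ai";

export async function proxyChatCompletion(
  body: unknown,
  apiKey: string,
): Promise<Response> {
  const upstream = await fetch(`${PERPLEXITY_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  });

  if (!upstream.ok) {
    // Surface upstream failures (bad key, rate limit, unknown model) as a
    // structured error rather than passing through an opaque 500.
    const detail = await upstream.text();
    throw new Error(`Perplexity request failed (${upstream.status}): ${detail}`);
  }

  return upstream;
}
```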
- Perplexity has no /models endpoint, so the model list is hardcoded (sonar, sonar-pro, sonar-reasoning-pro, sonar-deep-research)
- Tool calling is disabled: Perplexity has built-in web search instead
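A minimal sketch of how those two constraints can be expressed in a provider definition (all names here, such as `ProviderDefinition` and `registerProvider`, are hypothetical stand-ins, not the actual Archestra interfaces):

```typescript
// Hypothetical provider descriptor illustrating the hardcoded model list and
// disabled tool calling; not the real Archestra adapter interface.
interface ProviderDefinition {
  id: string;
  label: string;
  baseUrl: string;
  apiKeyUrl: string;
  supportsToolCalling: boolean;
  listModels: () => Promise<string[]>;
}

declare function registerProvider(def: ProviderDefinition): void;

const perplexity: ProviderDefinition = {
  id: "perplexity",
  label: "Perplexity",
  baseUrl: "https://api.perplexity.ai",
  apiKeyUrl: "https://www.perplexity.ai/settings/api",
  // Tool calling stays off: Perplexity performs web search internally.
  supportsToolCalling: false,
  // No /models endpoint upstream, so the list is hardcoded.
  listModels: async () => [
    "sonar",
    "sonar-pro",
    "sonar-reasoning-pro",
    "sonar-deep-research",
  ],
};

registerProvider(perplexity);
```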
API Key: https://www.perplexity.ai/settings/api
https://github.com/user-attachments/assets/1d1c8787-f2f8-4fa0-a775-9ceab0a92f9f
/claim #1854

Closes #1854