Implements a SmartRouter class for the JS SDK that provides LiteLLM-style intelligent routing across multiple LLM providers (OpenAI, Gemini, Cohere).
/claim #286
**Features**

- Routing strategies: `least-tokens` (default), `round-robin`, `fallback`
- `streamChat()` returns an `AsyncGenerator<StreamChunk>` for all 3 providers
- `getUsageStats()` and `getTotalUsage()` for monitoring
- `createSentryCallback({ dsn })` — captures errors + breadcrumbs
- `createPostHogCallback({ apiKey, host })` — captures analytics events
- `addCallback()` accepts custom `onStart`/`onSuccess`/`onError` hooks

Router config lives in Jsonnet files per project convention:
```jsonnet
{
  strategy: "least-tokens",
  deployments: [
    { provider: "openai", api_key: openai_key, model: "gpt-3.5-turbo", rpm_limit: 60 },
    { provider: "gemini", api_key: gemini_key, model: "gemini-pro" },
    { provider: "cohere", api_key: cohere_key, model: "command" },
  ],
}
```
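The `least-tokens` strategy can be sketched as follows — a minimal illustration of picking the deployment with the lowest tracked token count. The `DeploymentUsage` shape and `pickLeastTokens` helper are hypothetical names for this sketch, not the SDK's actual internals:

```typescript
// Hypothetical sketch: select the deployment whose tracked token
// usage is lowest (illustrative only, not the SDK implementation).
interface DeploymentUsage {
  provider: string;
  model: string;
  totalTokens: number;
}

function pickLeastTokens(deployments: DeploymentUsage[]): DeploymentUsage {
  // reduce keeps whichever deployment has fewer total tokens so far
  return deployments.reduce((best, d) =>
    d.totalTokens < best.totalTokens ? d : best
  );
}

const usage: DeploymentUsage[] = [
  { provider: "openai", model: "gpt-3.5-turbo", totalTokens: 1200 },
  { provider: "gemini", model: "gemini-pro", totalTokens: 300 },
  { provider: "cohere", model: "command", totalTokens: 900 },
];

console.log(pickLeastTokens(usage).provider); // → "gemini"
```

A real router would also need to respect per-deployment `rpm_limit` before selecting, falling through to the next-least-loaded deployment when a limit is hit.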
**New files** (`src/ai/src/lib/smart-router/`):

- `types.ts` — All interfaces (`Provider`, `ModelDeployment`, `RouterConfig`, `ChatRequest`/`Response`, etc.)
- `providers.ts` — OpenAI, Gemini, Cohere provider implementations (chat + streaming)
- `smart-router.ts` — Main `SmartRouter` class with routing logic
- `usage-tracker.ts` — Token/request tracking per deployment
- `logging.ts` — `LoggingManager` + Sentry/PostHog callback factories
- `index.ts` — Barrel exports

**Tests** (`src/ai/src/tests/smartRouter.test.ts`)

**Examples** (`examples/smart-router/`)

`src/ai/src/index.ts` — exports `SmartRouter` and all related types

**Usage:**

```ts
import { SmartRouter, createSentryCallback } from "@arakoodev/edgechains.js/ai";

const router = SmartRouter.fromConfig(jsonnetConfig);
router.addCallback(createSentryCallback({ dsn: "..." }));

// Auto-routes to best available deployment
const response = await router.chat({ prompt: "Hello!" });
console.log(response.content, response.usage);

// Streaming
for await (const chunk of router.streamChat({ prompt: "Count to 5" })) {
  process.stdout.write(chunk.content);
}
```
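The custom-hook contract that `addCallback()` accepts can be sketched like this. The hook names (`onStart`/`onSuccess`/`onError`) come from the feature list above; the `LoggingManager` internals and the payload shapes here are assumptions for illustration, not the actual SDK code:

```typescript
// Sketch of the callback-hook contract (payload shapes are assumed).
interface RouterCallback {
  onStart?: (req: { prompt: string }) => void;
  onSuccess?: (res: { content: string; usage: { totalTokens: number } }) => void;
  onError?: (err: Error) => void;
}

// Minimal manager that fans each lifecycle event out to all callbacks.
class LoggingManager {
  private callbacks: RouterCallback[] = [];
  addCallback(cb: RouterCallback) { this.callbacks.push(cb); }
  emitStart(req: { prompt: string }) { for (const cb of this.callbacks) cb.onStart?.(req); }
  emitSuccess(res: { content: string; usage: { totalTokens: number } }) {
    for (const cb of this.callbacks) cb.onSuccess?.(res);
  }
  emitError(err: Error) { for (const cb of this.callbacks) cb.onError?.(err); }
}

const events: string[] = [];
const manager = new LoggingManager();
manager.addCallback({
  onStart: (req) => events.push(`start:${req.prompt}`),
  onSuccess: (res) => events.push(`ok:${res.usage.totalTokens}`),
});

manager.emitStart({ prompt: "Hello!" });
manager.emitSuccess({ content: "Hi", usage: { totalTokens: 12 } });
console.log(events.join(",")); // → "start:Hello!,ok:12"
```

This fan-out pattern is what lets the Sentry and PostHog factories coexist with user-defined hooks on the same router.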