Summary

Implements a SmartRouter class for the JS SDK that provides LiteLLM-style intelligent routing across multiple LLM providers (OpenAI, Gemini, Cohere).

/claim #286

Features

1. Load Balancing with Rate-Limit Awareness

  • 3 routing strategies: least-tokens (default), round-robin, fallback
  • Per-deployment RPM/TPM rate limits — router skips rate-limited deployments
  • Automatic failover: if one provider fails, retries on the next
  • Configurable retries and timeouts via axios interceptors (429 backoff built-in)
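The least-tokens selection with rate-limit skipping can be sketched as follows. This is a hypothetical illustration: the `Deployment` shape and the `pickDeployment`/`isRateLimited` helpers are simplified stand-ins, not the SDK's actual internals.

```typescript
// Hypothetical sketch of the least-tokens strategy with rate-limit awareness.
interface Deployment {
  id: string;
  rpmLimit?: number;          // requests per minute (optional)
  tpmLimit?: number;          // tokens per minute (optional)
  requestsThisMinute: number; // rolling-window counters
  tokensThisMinute: number;
}

function isRateLimited(d: Deployment): boolean {
  if (d.rpmLimit !== undefined && d.requestsThisMinute >= d.rpmLimit) return true;
  if (d.tpmLimit !== undefined && d.tokensThisMinute >= d.tpmLimit) return true;
  return false;
}

// Skip rate-limited deployments, then pick the one with the fewest
// tokens consumed in the current window.
function pickDeployment(deployments: Deployment[]): Deployment | null {
  const available = deployments.filter((d) => !isRateLimited(d));
  if (available.length === 0) return null;
  return available.reduce((best, d) =>
    d.tokensThisMinute < best.tokensThisMinute ? d : best
  );
}

const pool: Deployment[] = [
  { id: "openai", rpmLimit: 60, requestsThisMinute: 60, tokensThisMinute: 900 },
  { id: "gemini", requestsThisMinute: 10, tokensThisMinute: 400 },
  { id: "cohere", requestsThisMinute: 5, tokensThisMinute: 100 },
];
console.log(pickDeployment(pool)?.id); // "cohere" (openai is rate-limited)
```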

2. Streaming Support

  • streamChat() returns an AsyncGenerator<StreamChunk> for all 3 providers
  • OpenAI SSE, Gemini SSE, Cohere NDJSON streaming formats handled
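The OpenAI-style SSE path can be illustrated with a minimal parser that turns `data:` lines into an `AsyncGenerator<StreamChunk>`. The `StreamChunk` shape and `parseSSE` helper here are simplified assumptions, not the SDK's exact types.

```typescript
// Minimal sketch of SSE parsing into stream chunks.
interface StreamChunk {
  content: string;
  done: boolean;
}

// Turn raw OpenAI-style SSE lines ("data: {...}" / "data: [DONE]")
// into an async generator of chunks.
async function* parseSSE(lines: AsyncIterable<string>): AsyncGenerator<StreamChunk> {
  for await (const line of lines) {
    if (!line.startsWith("data: ")) continue;
    const payload = line.slice(6).trim();
    if (payload === "[DONE]") {
      yield { content: "", done: true };
      return;
    }
    const json = JSON.parse(payload);
    const content = json.choices?.[0]?.delta?.content ?? "";
    if (content) yield { content, done: false };
  }
}

// Demo with a canned stream.
async function* fakeStream() {
  yield 'data: {"choices":[{"delta":{"content":"Hel"}}]}';
  yield 'data: {"choices":[{"delta":{"content":"lo"}}]}';
  yield "data: [DONE]";
}

(async () => {
  let text = "";
  for await (const chunk of parseSSE(fakeStream())) text += chunk.content;
  console.log(text); // "Hello"
})();
```

Gemini's SSE and Cohere's NDJSON differ only in the per-line payload format; the generator contract stays the same.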

3. Token Usage Tracking

  • Per-deployment stats: total requests, tokens used, failures, rate window tracking
  • getUsageStats() and getTotalUsage() for monitoring
  • Automatic token counting from API responses (estimated for streaming)
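A per-deployment tracker along these lines backs the stats API. The field names mirror the description above, but the exact return shapes of `getUsageStats()`/`getTotalUsage()` are assumptions for illustration.

```typescript
// Illustrative per-deployment usage tracker.
interface DeploymentStats {
  requests: number;
  tokens: number;
  failures: number;
}

class UsageTracker {
  private stats = new Map<string, DeploymentStats>();

  // Record one request's outcome against a deployment.
  record(id: string, tokens: number, failed = false): void {
    const s = this.stats.get(id) ?? { requests: 0, tokens: 0, failures: 0 };
    s.requests += 1;
    s.tokens += tokens;
    if (failed) s.failures += 1;
    this.stats.set(id, s);
  }

  getUsageStats(id: string): DeploymentStats | undefined {
    return this.stats.get(id);
  }

  // Aggregate across all deployments.
  getTotalUsage(): DeploymentStats {
    const total = { requests: 0, tokens: 0, failures: 0 };
    for (const s of this.stats.values()) {
      total.requests += s.requests;
      total.tokens += s.tokens;
      total.failures += s.failures;
    }
    return total;
  }
}

const tracker = new UsageTracker();
tracker.record("openai", 120);
tracker.record("gemini", 80, true);
console.log(tracker.getTotalUsage()); // { requests: 2, tokens: 200, failures: 1 }
```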

4. Logging Callbacks (Sentry + PostHog)

  • createSentryCallback({ dsn }) — captures errors + breadcrumbs
  • createPostHogCallback({ apiKey, host }) — captures analytics events
  • Extensible: addCallback() accepts custom onStart/onSuccess/onError hooks
  • Callbacks never throw — logging failures are silently caught
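The never-throw guarantee can be sketched as a manager that swallows logger errors. Hook names follow the description (`onStart`/`onSuccess`/`onError`), but the `LoggingManager` internals shown here are assumptions, not the SDK's implementation.

```typescript
// Sketch of fire-and-forget logging callbacks.
interface RouterCallback {
  onStart?: (req: unknown) => void;
  onSuccess?: (res: unknown) => void;
  onError?: (err: unknown) => void;
}

class LoggingManager {
  private callbacks: RouterCallback[] = [];

  addCallback(cb: RouterCallback): void {
    this.callbacks.push(cb);
  }

  // Invoke a hook on every callback; a throwing logger must never
  // break the request path, so errors are swallowed.
  emit(hook: keyof RouterCallback, arg: unknown): void {
    for (const cb of this.callbacks) {
      try {
        cb[hook]?.(arg);
      } catch {
        /* logging failures are intentionally ignored */
      }
    }
  }
}

const manager = new LoggingManager();
const seen: unknown[] = [];
manager.addCallback({ onSuccess: (r) => seen.push(r) });
manager.addCallback({
  onSuccess: () => {
    throw new Error("flaky logger");
  },
});
manager.emit("onSuccess", { ok: true });
console.log(seen.length); // 1 (the flaky logger did not break emit)
```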

Jsonnet Configuration

Router config lives in jsonnet files, per project convention:

{
  strategy: "least-tokens",
  deployments: [
    { provider: "openai", api_key: openai_key, model: "gpt-3.5-turbo", rpm_limit: 60 },
    { provider: "gemini", api_key: gemini_key, model: "gemini-pro" },
    { provider: "cohere", api_key: cohere_key, model: "command" },
  ],
}
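Once the jsonnet compiles to JSON, the router consumes a plain object. A hedged sketch of narrowing that object into a typed config follows; the field names match the config above, but `parseRouterConfig` and the interface names are illustrative, not the SDK's API.

```typescript
// Illustrative validation of a compiled-jsonnet router config.
interface DeploymentConfig {
  provider: "openai" | "gemini" | "cohere";
  api_key: string;
  model: string;
  rpm_limit?: number;
}

interface RouterConfig {
  strategy: "least-tokens" | "round-robin" | "fallback";
  deployments: DeploymentConfig[];
}

// Narrow an untyped compiled-jsonnet object into RouterConfig,
// throwing on missing or malformed fields.
function parseRouterConfig(raw: unknown): RouterConfig {
  const obj = raw as Partial<RouterConfig>;
  const strategies = ["least-tokens", "round-robin", "fallback"];
  if (!obj || !strategies.includes(obj.strategy as string)) {
    throw new Error("invalid strategy");
  }
  if (!Array.isArray(obj.deployments) || obj.deployments.length === 0) {
    throw new Error("at least one deployment is required");
  }
  for (const d of obj.deployments) {
    if (!d.provider || !d.api_key || !d.model) {
      throw new Error("deployment needs provider, api_key, model");
    }
  }
  return obj as RouterConfig;
}

const cfg = parseRouterConfig({
  strategy: "least-tokens",
  deployments: [
    { provider: "openai", api_key: "sk-...", model: "gpt-3.5-turbo", rpm_limit: 60 },
  ],
});
console.log(cfg.deployments.length); // 1
```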

Files Changed (13 files, 1476 lines)

Core (src/ai/src/lib/smart-router/)

  • types.ts — All interfaces (Provider, ModelDeployment, RouterConfig, ChatRequest/Response, etc.)
  • providers.ts — OpenAI, Gemini, Cohere provider implementations (chat + streaming)
  • smart-router.ts — Main SmartRouter class with routing logic
  • usage-tracker.ts — Token/request tracking per deployment
  • logging.ts — LoggingManager + Sentry/PostHog callback factories
  • index.ts — Barrel exports

Tests (src/ai/src/tests/smartRouter.test.ts)

  • 20+ vitest tests covering: routing strategies, fallback on failure, streaming, multi-provider, usage tracking, logging callbacks, callback factories

Example (examples/smart-router/)

  • Full working example with jsonnet config
  • Demonstrates: basic chat, multi-turn, streaming, usage stats, logging

Updated

  • src/ai/src/index.ts — Exports SmartRouter and all related types

Usage

import { SmartRouter, createSentryCallback } from "@arakoodev/edgechains.js/ai";

const router = SmartRouter.fromConfig(jsonnetConfig);
router.addCallback(createSentryCallback({ dsn: "..." }));

// Auto-routes to the best available deployment
const response = await router.chat({ prompt: "Hello!" });
console.log(response.content, response.usage);

// Streaming
for await (const chunk of router.streamChat({ prompt: "Count to 5" })) {
  process.stdout.write(chunk.content);
}

Claim

Total prize pool: $200
Total paid: $0
Status: Pending
Submitted: March 07, 2026
Last updated: March 07, 2026

Contributors

@maoshuorz (100%)

Sponsors

Arakoo.ai (@arakoodev): $200