/claim #286

Summary

Fixes #286

Implements a multi-provider LLM routing system for the EdgeChains JS SDK, inspired by LiteLLM.

Features

  1. Load Balancing — round-robin, least-tokens, latency-based, and cost-based strategies, with RPM/TPM rate limiting
  2. Streaming — AsyncGenerator-based with automatic fallback on stream errors
  3. Token Usage Tracking — per-model counters with getUsage() and resetUsageCounters()
  4. Logging — Sentry breadcrumbs + error capture, PostHog event tracking, custom callbacks
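As a rough illustration of the first item, a round-robin strategy gated by a requests-per-minute (RPM) cap could look like the sketch below. The `Deployment` shape and `RoundRobinRouter` name are illustrative, not the SDK's actual API:

```typescript
// Hypothetical shape for a configured model deployment.
interface Deployment {
  model: string;
  rpmLimit: number;            // max requests per minute for this deployment
  requestTimestamps: number[]; // timestamps of recent requests (ms)
}

class RoundRobinRouter {
  private index = 0;
  constructor(private deployments: Deployment[]) {}

  // Pick the next deployment in rotation that is still under its RPM limit;
  // return null when every deployment is rate-limited.
  pick(now: number = Date.now()): Deployment | null {
    for (let i = 0; i < this.deployments.length; i++) {
      const d = this.deployments[(this.index + i) % this.deployments.length];
      // Drop request timestamps older than one minute.
      d.requestTimestamps = d.requestTimestamps.filter(t => now - t < 60_000);
      if (d.requestTimestamps.length < d.rpmLimit) {
        this.index = (this.index + i + 1) % this.deployments.length;
        d.requestTimestamps.push(now);
        return d;
      }
    }
    return null;
  }
}
```

The least-tokens, latency-based, and cost-based strategies would replace only the selection loop; the sliding-window rate check stays the same.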
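The streaming fallback in item 2 can be sketched as an AsyncGenerator that restarts on the next provider when the current stream throws. `StreamFn` and `streamWithFallback` are assumed names for illustration:

```typescript
// A provider is modeled as a factory returning a fresh chunk stream.
type StreamFn = () => AsyncGenerator<string>;

async function* streamWithFallback(providers: StreamFn[]): AsyncGenerator<string> {
  let lastError: unknown;
  for (const provider of providers) {
    try {
      for await (const chunk of provider()) {
        yield chunk;
      }
      return; // stream completed successfully
    } catch (err) {
      lastError = err; // stream failed mid-way; try the next provider
    }
  }
  throw lastError ?? new Error("no providers configured");
}
```

Note that chunks already yielded before a failure are not deduplicated here; the fallback simply restarts the stream on the next provider, so the consumer may see a partial prefix twice.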
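For item 3, a minimal per-model counter behind the `getUsage()` / `resetUsageCounters()` surface named above might look like this; the `UsageTracker` class and `record()` method are assumptions about the internals, not the PR's code:

```typescript
interface Usage {
  promptTokens: number;
  completionTokens: number;
}

class UsageTracker {
  private counters = new Map<string, Usage>();

  // Accumulate token counts for one completed call against its model.
  record(model: string, promptTokens: number, completionTokens: number): void {
    const u = this.counters.get(model) ?? { promptTokens: 0, completionTokens: 0 };
    u.promptTokens += promptTokens;
    u.completionTokens += completionTokens;
    this.counters.set(model, u);
  }

  // Snapshot of all per-model counters.
  getUsage(): Record<string, Usage> {
    return Object.fromEntries(this.counters);
  }

  resetUsageCounters(): void {
    this.counters.clear();
  }
}
```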

Testing

24 tests (unit and e2e against mock servers), all passing; TypeScript compiles cleanly.

Claim

Total prize pool: $200
Total paid: $0
Status: Pending
Submitted: March 19, 2026
Last updated: March 19, 2026

Contributors

blessuselessk (@n78dc4ytfv-ux): 100%

Sponsors

Arakoo.ai (@arakoodev): $200