AWS Bedrock Provider Integration

Added support for AWS Bedrock as a new LLM provider:

  • Anthropic Claude models
  • Amazon Titan models (text & embeddings)
  • AI21 Jurassic models

Key Changes

Added a Bedrock provider implementation with model-specific handlers:

  • Anthropic: Reused the existing Anthropic provider code for Bedrock (no actual changes to that code were needed)
  • Titan: Added chat completion and embedding support
  • AI21: Added completion and chat completion support
  • Request/response mapping between Hub formats and Bedrock formats (a sketch follows below)
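
As a rough illustration of that mapping, the sketch below converts an OpenAI-style Hub chat request into an Amazon Titan InvokeModel body. The field handling is a simplified assumption for illustration, not the exact code from this PR.

```rust
use serde_json::{json, Value};

/// Rough sketch: map an OpenAI-style Hub chat request to an Amazon Titan
/// text-generation body. Simplified and illustrative only.
fn chat_request_to_titan_body(chat_request: &Value) -> Value {
    // Titan text models take a single `inputText` string rather than a
    // message list, so flatten the chat messages into one prompt.
    let messages = chat_request["messages"].as_array().cloned().unwrap_or_default();
    let prompt = messages
        .iter()
        .map(|m| {
            format!(
                "{}: {}",
                m["role"].as_str().unwrap_or("user"),
                m["content"].as_str().unwrap_or("")
            )
        })
        .collect::<Vec<_>>()
        .join("\n");

    json!({
        "inputText": prompt,
        "textGenerationConfig": {
            "maxTokenCount": chat_request["max_tokens"].as_u64().unwrap_or(512),
            "temperature": chat_request["temperature"].as_f64().unwrap_or(1.0)
        }
    })
}
```

The response mapping goes the other way: Titan returns the generated text under results[0].outputText, which would be placed back into the Hub's chat-completion response shape.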

Testing Notes

  • All tests pass using AWS credentials in the us-east-1 and us-east-2 regions
  • Verified error handling for invalid credentials and invalid model IDs
  • Tested non-streaming responses (models in Bedrock don’t seem to expose streaming response types)
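
For context, here is a minimal Rust sketch of a non-streaming Bedrock call with basic error handling. It is not code from this PR: it assumes the aws-sdk-bedrockruntime crate and uses an illustrative Titan model ID and payload.

```rust
use aws_sdk_bedrockruntime::{primitives::Blob, Client};
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Credentials and region come from the standard AWS environment/config chain.
    let config = aws_config::load_from_env().await;
    let client = Client::new(&config);

    // Illustrative Titan request body; not the exact payload built by the Hub.
    let body = json!({
        "inputText": "Say hello in one sentence.",
        "textGenerationConfig": { "maxTokenCount": 128 }
    });

    // Non-streaming invocation; an invalid model ID or bad credentials
    // surfaces as an error from `send()`.
    match client
        .invoke_model()
        .model_id("amazon.titan-text-express-v1")
        .content_type("application/json")
        .accept("application/json")
        .body(Blob::new(serde_json::to_vec(&body)?))
        .send()
        .await
    {
        Ok(resp) => {
            let parsed: serde_json::Value = serde_json::from_slice(resp.body().as_ref())?;
            println!("{parsed}");
        }
        Err(err) => eprintln!("Bedrock call failed: {err}"),
    }
    Ok(())
}
```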

Review Notes

The model ID taken from the AWS console link does not work consistently. Instead, use the inference profile ARN or inference profile ID from the cross-region inference tab as your model_id.
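
To make this concrete, the identifiers below are hypothetical placeholders (the model name, region, and account ID are not from this PR); they show the difference between a bare foundation-model ID and the inference profile forms:

```rust
// A bare foundation-model ID copied from the console may be rejected for
// models that require cross-region inference:
const FOUNDATION_MODEL_ID: &str = "anthropic.claude-3-5-sonnet-20241022-v2:0";

// Prefer the cross-region inference profile ID ...
const INFERENCE_PROFILE_ID: &str = "us.anthropic.claude-3-5-sonnet-20241022-v2:0";

// ... or the full inference profile ARN (account ID is a placeholder):
const INFERENCE_PROFILE_ARN: &str =
    "arn:aws:bedrock:us-east-1:123456789012:inference-profile/us.anthropic.claude-3-5-sonnet-20241022-v2:0";
```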

Issue: https://github.com/traceloop/hub/issues/20

/claim #20

Claim

  • Total prize pool: $300
  • Total paid: $0
  • Status: Pending
  • Submitted: January 10, 2025
  • Last updated: January 10, 2025

Contributors

  • detunjiSamuel (@detunjiSamuel): 100%

Sponsors

  • Traceloop (YC W23) (@traceloop): $300