Added support for AWS Bedrock as a new LLM provider, including a provider implementation with model-specific handlers.
Testing:
- All tests pass using AWS credentials in the us-east-1 and us-east-2 regions.
- Verified error handling for invalid credentials and models.
- Tested non-streaming responses (Bedrock models do not appear to expose streaming response types).
Note: the model ID from the AWS console does not work consistently. Instead, use the inference profile ARN or inference profile ID from the cross-region inference tab as your `model_id`.
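For illustration only, here is a hedged Python/boto3 sketch of the ID distinction and a non-streaming `InvokeModel` call. This is not the hub's implementation (the hub is written in Rust); the helper name and the example model ID are assumptions chosen to show the shape of a cross-region inference profile ID, which is the foundation-model ID with a geography prefix such as `us.`.

```python
import json

# Hypothetical helper (not part of the hub codebase): a cross-region
# inference profile ID is the foundation-model ID with a geography
# prefix, e.g. "us." for the US regions.
def to_inference_profile_id(model_id: str, geo: str = "us") -> str:
    return f"{geo}.{model_id}"

def invoke_non_streaming(profile_id: str, prompt: str, region: str = "us-east-1"):
    """Sketch of a non-streaming InvokeModel call (requires AWS credentials)."""
    import boto3  # imported lazily so the helper above works without boto3 installed
    client = boto3.client("bedrock-runtime", region_name=region)
    resp = client.invoke_model(
        modelId=profile_id,  # inference profile ID or ARN, not the plain model ID
        body=json.dumps({
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 256,
            "messages": [{"role": "user", "content": prompt}],
        }),
    )
    return json.loads(resp["body"].read())

# The plain model ID from the console may be rejected; the prefixed
# inference profile ID is what should go into model_id:
profile = to_inference_profile_id("anthropic.claude-3-5-sonnet-20240620-v1:0")
# profile == "us.anthropic.claude-3-5-sonnet-20240620-v1:0"
```

The `invoke_model` call itself only runs with valid AWS credentials; the point of the sketch is the `modelId` value being passed.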
Issue: https://github.com/traceloop/hub/issues/20
/claim #20
detunjiSamuel