Overview
AWS Bedrock is Amazon's fully managed generative AI service that provides access to high-performing foundation models (FMs) from leading AI companies including Anthropic, Amazon, Meta, and Mistral — all through a single unified API.
RustCloud already defines the `LlmProvider` trait with full generative AI operations (`generate`, `stream`, `embed`, `generate_with_tools`), but there is currently no AWS implementation of this trait. This issue tracks adding `BedrockProvider` as the first AWS GenAI backend.
Background
The `LlmProvider` trait in `rustcloud/src/traits/llm_provider.rs` exposes:

| Method | Description |
| --- | --- |
| `generate` | Single-turn / multi-turn text generation |
| `stream` | Streaming token-by-token generation |
| `embed` | Text embeddings for semantic search / similarity |
| `generate_with_tools` | Function calling / tool use |
All four methods map directly to AWS Bedrock Runtime API operations, making this a natural first implementation of the trait for AWS.
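For orientation, here is a simplified, synchronous sketch of what such a trait surface might look like. This is an illustrative assumption only: the real trait in `rustcloud/src/traits/llm_provider.rs` is presumably async and has richer request/response types, and the `MockProvider` below is purely hypothetical.

```rust
// Illustrative, simplified sketch of an LlmProvider-style trait.
// All types and signatures here are assumptions for demonstration.
pub struct GenerateRequest {
    pub prompt: String,
    pub max_tokens: u32,
}

pub struct GenerateResponse {
    pub text: String,
}

pub struct EmbedResponse {
    pub vectors: Vec<Vec<f32>>,
}

pub trait LlmProvider {
    fn generate(&self, req: &GenerateRequest) -> GenerateResponse;
    fn embed(&self, texts: &[String]) -> EmbedResponse;
    // stream and generate_with_tools omitted for brevity
}

// A trivial mock shows where BedrockProvider would slot in.
pub struct MockProvider;

impl LlmProvider for MockProvider {
    fn generate(&self, req: &GenerateRequest) -> GenerateResponse {
        GenerateResponse { text: format!("echo: {}", req.prompt) }
    }
    fn embed(&self, texts: &[String]) -> EmbedResponse {
        EmbedResponse { vectors: texts.iter().map(|_| vec![0.0; 4]).collect() }
    }
}
```

Any type implementing the trait is interchangeable at call sites, which is what makes a Bedrock backend a drop-in addition.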
Proposed Implementation
BedrockProvider struct

- Holds an `aws_sdk_bedrockruntime::Client`
- Constructors: `BedrockProvider::new()` (loads from env) and `BedrockProvider::with_client(client)` for custom configs
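A minimal sketch of that shape, with a local placeholder `Client` standing in for `aws_sdk_bedrockruntime::Client` (the real `new()` would be async and load the default AWS config from the environment):

```rust
// Sketch only: `Client` is a local placeholder; the real provider would
// wrap aws_sdk_bedrockruntime::Client built from the default AWS config.
#[derive(Debug, Clone)]
pub struct Client {
    pub region: String, // placeholder field for illustration
}

pub struct BedrockProvider {
    client: Client,
}

impl BedrockProvider {
    /// Real version: async, builds a client from env/default AWS config.
    pub fn new() -> Self {
        Self::with_client(Client { region: "us-east-1".to_string() })
    }

    /// Escape hatch for callers that need a custom-configured client.
    pub fn with_client(client: Client) -> Self {
        BedrockProvider { client }
    }

    pub fn region(&self) -> &str {
        &self.client.region
    }
}
```

The two-constructor pattern keeps the common case (env-based config) one line while still allowing injected clients for tests and custom endpoints.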
Operations

`generate` and `generate_with_tools` → Bedrock Converse API

- Unified interface across all foundation models
- Native tool/function calling support
- Handles system prompts, temperature, max_tokens
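The option mapping could look roughly like this, using local stand-in types (the real code would use the SDK's fluent `converse()` builder rather than hand-rolled structs):

```rust
// Local stand-in types sketching how generate() options might map onto a
// Converse-style request. Everything here is illustrative; the real
// aws_sdk_bedrockruntime code uses builder methods instead.
#[derive(Debug, PartialEq)]
pub struct InferenceConfig {
    pub max_tokens: u32,
    pub temperature: f32,
}

#[derive(Debug)]
pub struct ConverseRequest {
    pub model_id: String,
    pub system: Option<String>,
    pub messages: Vec<(String, String)>, // (role, content) pairs
    pub inference: InferenceConfig,
}

pub fn build_converse_request(
    model_id: &str,
    system: Option<&str>,
    user_prompt: &str,
    max_tokens: u32,
    temperature: f32,
) -> ConverseRequest {
    ConverseRequest {
        model_id: model_id.to_string(),
        system: system.map(str::to_string),
        messages: vec![("user".to_string(), user_prompt.to_string())],
        inference: InferenceConfig { max_tokens, temperature },
    }
}
```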
`stream` → Bedrock ConverseStream API

- Real-time token streaming via `converse_stream`
- Maps SDK events to `LlmStreamEvent` (DeltaText, Usage, Done)
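The event mapping can be sketched like this; `SdkEvent` is a local stand-in for the SDK's stream output variants, and `LlmStreamEvent` mirrors the variant names mentioned above:

```rust
// Sketch of mapping ConverseStream SDK events onto the trait's stream
// events. SdkEvent is a hypothetical stand-in for the SDK's stream
// output enum; only the variants relevant here are modeled.
#[derive(Debug, PartialEq)]
pub enum LlmStreamEvent {
    DeltaText(String),
    Usage { input_tokens: u32, output_tokens: u32 },
    Done,
}

pub enum SdkEvent {
    ContentBlockDelta { text: String },
    Metadata { input_tokens: u32, output_tokens: u32 },
    MessageStop,
}

pub fn map_event(ev: SdkEvent) -> LlmStreamEvent {
    match ev {
        SdkEvent::ContentBlockDelta { text } => LlmStreamEvent::DeltaText(text),
        SdkEvent::Metadata { input_tokens, output_tokens } => {
            LlmStreamEvent::Usage { input_tokens, output_tokens }
        }
        SdkEvent::MessageStop => LlmStreamEvent::Done,
    }
}
```

Keeping the translation in one small function makes it easy to unit-test without any AWS credentials.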
`embed` → `invoke_model` with Amazon Titan Embed Text V2

- `amazon.titan-embed-text-v2:0` as the standard embedding model
- Processes texts individually and batches results into `EmbedResponse`
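The per-text loop can be sketched as follows. The `"inputText"` JSON field follows the Titan Embed Text request format; `fake_invoke_model` is a hypothetical stub standing in for the real `invoke_model` call, and the real code would build JSON with `serde_json` rather than string formatting:

```rust
// Sketch of the embed() flow: one invoke_model call per input text, with
// per-text vectors batched into a single EmbedResponse.
pub struct EmbedResponse {
    pub vectors: Vec<Vec<f32>>,
}

fn titan_request_body(text: &str) -> String {
    // Minimal JSON body; the real code would use serde_json and may also
    // set optional fields (e.g. output dimensions).
    format!("{{\"inputText\":{:?}}}", text)
}

// Hypothetical stub standing in for client.invoke_model(...) + parsing.
fn fake_invoke_model(_body: &str) -> Vec<f32> {
    vec![0.1, 0.2, 0.3]
}

pub fn embed(texts: &[String]) -> EmbedResponse {
    let vectors = texts
        .iter()
        .map(|t| fake_invoke_model(&titan_request_body(t)))
        .collect();
    EmbedResponse { vectors }
}
```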
Supported Models (via Converse API)
- Anthropic Claude — `claude-3-5-haiku`, `claude-3-5-sonnet`, `claude-3-opus`
- Amazon Titan — `titan-text-express`, `titan-text-premier`
- Meta Llama — `llama3-2-3b`, `llama3-2-90b`
- Mistral — `mistral-7b`, `mixtral-8x7b`
Files to be Added/Modified
| File | Change |
| --- | --- |
| `src/aws/aws_apis/artificial_intelligence/aws_bedrock.rs` | New — core `BedrockProvider` implementation |
| `src/aws/aws_apis/artificial_intelligence/mod.rs` | New — module declaration |
| `src/tests/aws_bedrock_operations.rs` | New — tests for all 4 operations |
| `examples/aws/artificial_intelligence/bedrock.md` | New — usage guide and examples |
| `src/main.rs` | Updated — add `artificial_intelligence` module |
| `Cargo.toml` | Updated — add `aws-sdk-bedrockruntime`, `aws-smithy-types` |
| `README.md` | Updated — add AI/ML row to AWS provider table |
Benefits
- Aligns with the project's focus on generative AI support
- AWS Bedrock is the AWS equivalent of GCP Vertex AI — completing the big-cloud GenAI provider set
- The `LlmProvider` trait already exists and is designed for exactly this
- Consistent with existing RustCloud AWS patterns (`aws-sdk-*` crates)