
feat: implement LlmProvider for AWS Bedrock #61

@atharva-nagane

Overview

AWS Bedrock is Amazon's fully managed generative AI service that provides access to high-performing foundation models (FMs) from leading AI companies including Anthropic, Amazon, Meta, and Mistral — all through a single unified API.

RustCloud already defines the LlmProvider trait with full generative AI operations (generate, stream, embed, generate_with_tools), but there
is currently no AWS implementation of this trait. This issue tracks adding BedrockProvider as the first AWS GenAI backend.

Background

The LlmProvider trait in rustcloud/src/traits/llm_provider.rs exposes:

| Method | Description |
| --- | --- |
| `generate` | Single-turn / multi-turn text generation |
| `stream` | Streaming token-by-token generation |
| `embed` | Text embeddings for semantic search / similarity |
| `generate_with_tools` | Function calling / tool use |

All four methods map directly to AWS Bedrock Runtime API operations, making this a natural first implementation of the trait for AWS.
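For orientation, here is a simplified, synchronous stand-in for the trait's shape, with a toy implementation showing how a backend plugs in. The real trait in rustcloud/src/traits/llm_provider.rs is async and uses structured request/response types; all names and signatures below are illustrative only.

```rust
// Simplified stand-in for the LlmProvider trait. The real trait is async
// and uses richer request/response types rather than plain strings.
pub trait LlmProvider {
    fn generate(&self, prompt: &str) -> Result<String, String>;
    fn stream(&self, prompt: &str) -> Result<Vec<String>, String>; // token chunks
    fn embed(&self, texts: &[&str]) -> Result<Vec<Vec<f32>>, String>;
    fn generate_with_tools(&self, prompt: &str, tools: &[&str]) -> Result<String, String>;
}

// Toy backend, just to show where BedrockProvider would slot in.
pub struct EchoProvider;

impl LlmProvider for EchoProvider {
    fn generate(&self, prompt: &str) -> Result<String, String> {
        Ok(format!("echo: {prompt}"))
    }
    fn stream(&self, prompt: &str) -> Result<Vec<String>, String> {
        // Pretend each whitespace-separated word is a streamed chunk.
        Ok(prompt.split_whitespace().map(String::from).collect())
    }
    fn embed(&self, texts: &[&str]) -> Result<Vec<Vec<f32>>, String> {
        // One fake one-dimensional "embedding" per input text.
        Ok(texts.iter().map(|t| vec![t.len() as f32]).collect())
    }
    fn generate_with_tools(&self, prompt: &str, tools: &[&str]) -> Result<String, String> {
        Ok(format!("{prompt} (tools available: {})", tools.len()))
    }
}

fn main() {
    let p = EchoProvider;
    println!("{}", p.generate("hi").unwrap());
}
```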

Proposed Implementation

BedrockProvider struct

  • Holds an aws_sdk_bedrockruntime::Client
  • Constructors: BedrockProvider::new() (loads from env) and
    BedrockProvider::with_client(client) for custom configs

Operations

generate and generate_with_tools → Bedrock Converse API

  • Unified interface across all foundation models
  • Native tool/function calling support
  • Handles system prompts, temperature, max_tokens
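A sketch of how `generate` might map onto Converse, including the env-loading construction path `BedrockProvider::new()` would use. Builder and method names follow the aws-sdk-bedrockruntime crate and should be verified against the pinned SDK version; the model ID is only an example.

```rust
// Sketch only: maps a generate() call onto the Bedrock Converse API.
// Requires aws-config and aws-sdk-bedrockruntime; verify builder names
// against the pinned SDK version before relying on this.
use aws_config::BehaviorVersion;
use aws_sdk_bedrockruntime::types::{
    ContentBlock, ConversationRole, InferenceConfiguration, Message,
};

async fn generate(prompt: &str) -> Result<String, Box<dyn std::error::Error>> {
    // BedrockProvider::new() would load region/credentials from the env like this;
    // with_client(client) would skip straight to a caller-supplied Client.
    let config = aws_config::load_defaults(BehaviorVersion::latest()).await;
    let client = aws_sdk_bedrockruntime::Client::new(&config);

    let output = client
        .converse()
        .model_id("anthropic.claude-3-5-haiku-20241022-v1:0") // example model ID
        .messages(
            Message::builder()
                .role(ConversationRole::User)
                .content(ContentBlock::Text(prompt.to_string()))
                .build()?,
        )
        .inference_config(
            InferenceConfiguration::builder()
                .max_tokens(512)
                .temperature(0.7)
                .build(),
        )
        .send()
        .await?;

    // Pull the first text block out of the assistant message.
    let text = output
        .output()
        .and_then(|o| o.as_message().ok())
        .and_then(|m| m.content().first())
        .and_then(|c| c.as_text().ok())
        .cloned()
        .unwrap_or_default();
    Ok(text)
}
```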

stream → Bedrock ConverseStream API

  • Real-time token streaming via converse_stream
  • Maps SDK events to LlmStreamEvent (DeltaText, Usage, Done)
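The event mapping can be illustrated with simplified local enums. The SDK's real streaming type is `ConverseStreamOutput`, which carries more variants and richer payloads; both enums below are trimmed stand-ins to show the shape of the translation.

```rust
// Trimmed stand-in for the SDK's ConverseStreamOutput variants.
enum SdkStreamEvent {
    ContentBlockDelta { text: String },
    Metadata { input_tokens: u32, output_tokens: u32 },
    MessageStop,
}

// Trimmed stand-in for the trait's LlmStreamEvent.
enum LlmStreamEvent {
    DeltaText(String),
    Usage { input_tokens: u32, output_tokens: u32 },
    Done,
}

// One-to-one translation from SDK stream events to trait events.
fn map_event(event: SdkStreamEvent) -> LlmStreamEvent {
    match event {
        SdkStreamEvent::ContentBlockDelta { text } => LlmStreamEvent::DeltaText(text),
        SdkStreamEvent::Metadata { input_tokens, output_tokens } => {
            LlmStreamEvent::Usage { input_tokens, output_tokens }
        }
        SdkStreamEvent::MessageStop => LlmStreamEvent::Done,
    }
}

fn main() {
    let e = map_event(SdkStreamEvent::ContentBlockDelta { text: "Hel".into() });
    assert!(matches!(e, LlmStreamEvent::DeltaText(ref t) if t == "Hel"));
}
```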

embed → invoke_model with Amazon Titan Embed Text V2

  • amazon.titan-embed-text-v2:0 as the standard embedding model
  • Processes texts individually and batches results into EmbedResponse
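Titan Embed Text V2 is invoked with a small JSON request body; the response carries an `embedding` array plus an `inputTextTokenCount`. A stdlib-only sketch of building that payload (field names follow the Titan V2 request schema; real code would serialize with serde):

```rust
// Builds the invoke_model request body for amazon.titan-embed-text-v2:0.
// Field names follow the Titan Embed Text V2 schema: inputText is required,
// dimensions (256/512/1024) and normalize are optional tuning knobs.
fn titan_embed_body(text: &str, dimensions: u32, normalize: bool) -> String {
    // {:?} on &str produces a double-quoted, escaped literal, which is
    // close enough to JSON string escaping for this sketch.
    format!(
        "{{\"inputText\":{:?},\"dimensions\":{},\"normalize\":{}}}",
        text, dimensions, normalize
    )
}

fn main() {
    println!("{}", titan_embed_body("hello world", 1024, true));
}
```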

Supported Models (via Converse API)

  • Anthropic Claude — claude-3-5-haiku, claude-3-5-sonnet, claude-3-opus
  • Amazon Titan — titan-text-express, titan-text-premier
  • Meta Llama — llama3-2-3b, llama3-2-90b
  • Mistral — mistral-7b, mixtral-8x7b
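On-the-wire model IDs differ from the short names above: each carries a vendor prefix and a revision suffix. A sketch of the kind of constants the provider might expose (exact IDs vary by region and model revision, so they should be checked against the Bedrock model catalog):

```rust
// Example Bedrock model IDs for the families listed above. Availability
// varies by region and IDs gain new revision suffixes over time; treat
// these as illustrative, not exhaustive.
pub const CLAUDE_3_5_HAIKU: &str = "anthropic.claude-3-5-haiku-20241022-v1:0";
pub const TITAN_TEXT_EXPRESS: &str = "amazon.titan-text-express-v1";
pub const LLAMA_3_2_3B: &str = "meta.llama3-2-3b-instruct-v1:0";
pub const MISTRAL_7B: &str = "mistral.mistral-7b-instruct-v0:2";
pub const TITAN_EMBED_V2: &str = "amazon.titan-embed-text-v2:0";

fn main() {
    // Every ID starts with a "<vendor>." prefix that routes the request.
    assert!(TITAN_EMBED_V2.starts_with("amazon."));
}
```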

Files to be Added/Modified

| File | Change |
| --- | --- |
| src/aws/aws_apis/artificial_intelligence/aws_bedrock.rs | New — core BedrockProvider implementation |
| src/aws/aws_apis/artificial_intelligence/mod.rs | New — module declaration |
| src/tests/aws_bedrock_operations.rs | New — tests for all 4 operations |
| examples/aws/artificial_intelligence/bedrock.md | New — usage guide and examples |
| src/main.rs | Updated — add artificial_intelligence module |
| Cargo.toml | Updated — add aws-sdk-bedrockruntime, aws-smithy-types |
| README.md | Updated — add AI/ML row to AWS provider table |

Benefits

  • Aligns with the project's focus on generative AI support
  • AWS Bedrock is the AWS equivalent of GCP Vertex AI — completing the big-cloud GenAI provider set
  • The LlmProvider trait already exists and is designed for exactly this
  • Consistent with existing RustCloud AWS patterns (aws-sdk-* crates)
