AWS Bedrock Integration

Memori integrates with AWS Bedrock through the LangChain ChatBedrock adapter, providing memory for Claude, Llama, Mistral, and other Bedrock-hosted models.

Installation

pip install memori langchain-aws boto3

Quick Start

from langchain_aws import ChatBedrock
from memori import Memori

client = ChatBedrock(
    model_id="anthropic.claude-sonnet-4-5-20250929",
    region_name="us-east-1"
)

# Register the Bedrock client with Memori
mem = Memori().llm.register(chatbedrock=client)
mem.attribution(entity_id="user_123", process_id="bedrock_agent")

response = client.invoke("Hello! My name is Alice.")
print(response.content)

AWS Configuration

Bedrock requires AWS credentials. Configure them using environment variables or AWS CLI:
export AWS_ACCESS_KEY_ID="your-access-key"
export AWS_SECRET_ACCESS_KEY="your-secret-key"
export AWS_DEFAULT_REGION="us-east-1"
Or pass credentials directly:
from langchain_aws import ChatBedrock
from memori import Memori
import boto3

session = boto3.Session(
    aws_access_key_id="your-access-key",
    aws_secret_access_key="your-secret-key",
    region_name="us-east-1"
)

client = ChatBedrock(
    model_id="anthropic.claude-sonnet-4-5-20250929",
    client=session.client("bedrock-runtime")
)

mem = Memori().llm.register(chatbedrock=client)
mem.attribution(entity_id="user_123", process_id="bedrock_secure")

Available Models

Memori works with any Bedrock chat model exposed through ChatBedrock:

Anthropic Claude

from langchain_aws import ChatBedrock
from memori import Memori

# Claude 3.5 Sonnet
client = ChatBedrock(
    model_id="anthropic.claude-3-5-sonnet-20241022-v2:0",
    region_name="us-east-1"
)

mem = Memori().llm.register(chatbedrock=client)
mem.attribution(entity_id="user_123", process_id="claude_sonnet")

Meta Llama

from langchain_aws import ChatBedrock
from memori import Memori

# Llama 3.1 70B
client = ChatBedrock(
    model_id="meta.llama3-1-70b-instruct-v1:0",
    region_name="us-east-1"
)

mem = Memori().llm.register(chatbedrock=client)
mem.attribution(entity_id="user_123", process_id="llama")

Mistral

from langchain_aws import ChatBedrock
from memori import Memori

# Mistral Large
client = ChatBedrock(
    model_id="mistral.mistral-large-2407-v1:0",
    region_name="us-east-1"
)

mem = Memori().llm.register(chatbedrock=client)
mem.attribution(entity_id="user_123", process_id="mistral")

Multi-Turn Conversations

Maintain context across turns by passing the accumulated message history with each call:
from langchain_aws import ChatBedrock
from langchain_core.messages import HumanMessage, AIMessage
from memori import Memori

client = ChatBedrock(
    model_id="anthropic.claude-sonnet-4-5-20250929",
    region_name="us-east-1"
)

mem = Memori().llm.register(chatbedrock=client)
mem.attribution(entity_id="user_456", process_id="chat")

messages = []

# First turn
messages.append(HumanMessage(content="My favorite programming language is Python."))
response = client.invoke(messages)
messages.append(AIMessage(content=response.content))

# Second turn - memory maintained
messages.append(HumanMessage(content="What's my favorite language?"))
response = client.invoke(messages)
print(response.content)

Model-Specific Parameters

Configure model-specific parameters through model_kwargs:
from langchain_aws import ChatBedrock
from memori import Memori

client = ChatBedrock(
    model_id="anthropic.claude-sonnet-4-5-20250929",
    region_name="us-east-1",
    model_kwargs={
        "temperature": 0.7,
        "top_p": 0.9,
        "max_tokens": 2048
    }
)

mem = Memori().llm.register(chatbedrock=client)
mem.attribution(entity_id="user_123", process_id="custom_params")

response = client.invoke("Explain quantum computing")
print(response.content)

Tool Use / Function Calling

Claude and other tool-capable Bedrock models support function calling via LangChain's bind_tools:
from langchain_aws import ChatBedrock
from langchain_core.tools import tool
from memori import Memori

@tool
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    return f"Weather in {location}: Sunny, 72°F"

client = ChatBedrock(
    model_id="anthropic.claude-sonnet-4-5-20250929",
    region_name="us-east-1"
)

mem = Memori().llm.register(chatbedrock=client)
mem.attribution(entity_id="user_123", process_id="tools")

client_with_tools = client.bind_tools([get_weather])
response = client_with_tools.invoke("What's the weather in Seattle?")

if response.tool_calls:
    for tool_call in response.tool_calls:
        print(f"Tool: {tool_call['name']}")
        print(f"Args: {tool_call['args']}")
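
The tool calls surfaced above still have to be executed by your own code. A minimal dispatch sketch (pure Python, with get_weather re-declared locally so the snippet stands alone; the tool_calls dict shape mirrors what LangChain returns):

```python
# Hedged sketch: route LangChain-style tool calls to local Python functions.
# Each entry in `tool_calls` mirrors LangChain's dict shape:
# {"name": ..., "args": ..., "id": ...}.
def get_weather(location: str) -> str:
    """Get the current weather for a location."""
    return f"Weather in {location}: Sunny, 72°F"

TOOLS = {"get_weather": get_weather}

def run_tool_calls(tool_calls):
    """Execute each requested tool and collect results keyed by call id."""
    results = []
    for call in tool_calls:
        fn = TOOLS[call["name"]]      # look up the requested tool
        output = fn(**call["args"])   # invoke with the model's arguments
        results.append({"tool_call_id": call.get("id"), "content": output})
    return results
```

In a full agent loop you would wrap each result in a ToolMessage and invoke the model again so it can compose a final answer from the tool output.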

Streaming Responses

Stream responses for real-time applications:
from langchain_aws import ChatBedrock
from memori import Memori

client = ChatBedrock(
    model_id="anthropic.claude-sonnet-4-5-20250929",
    region_name="us-east-1",
    streaming=True
)

mem = Memori().llm.register(chatbedrock=client)
mem.attribution(entity_id="user_123", process_id="streaming")

for chunk in client.stream("Write a poem about AI"):
    print(chunk.content, end="", flush=True)

Supported Features

Feature             Method
Sync Client         client.invoke()
Async Client        await client.ainvoke()
Streaming           client.stream()
Tool Use            bind_tools()
System Prompts      LangChain message types
Custom Parameters   model_kwargs
Cross-Region        region_name parameter
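
The async client uses ainvoke; the call shape looks like this (sketched with a stub so the example runs without AWS credentials — swap StubAsyncClient for your registered ChatBedrock instance, whose ainvoke returns an AIMessage rather than a string):

```python
import asyncio

# Stub standing in for a registered ChatBedrock client, so this sketch
# runs stand-alone. A real client's ainvoke returns an AIMessage.
class StubAsyncClient:
    async def ainvoke(self, prompt: str) -> str:
        return f"echo: {prompt}"

async def main() -> str:
    client = StubAsyncClient()
    # Same call shape as: await chat_bedrock_client.ainvoke("...")
    return await client.ainvoke("Hello from async")

result = asyncio.run(main())
```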

How It Works

When you register a ChatBedrock client with Memori:
  1. Memori wraps the Bedrock runtime client’s invoke_model methods
  2. All requests (model ID, messages, parameters) are captured
  3. All responses are captured, including streaming chunks
  4. Conversations are stored in your Memori memory store
  5. A knowledge graph is built from conversation patterns
  6. Original LangChain behavior is preserved
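
The wrapping step can be pictured with a toy proxy (illustrative only, not Memori's actual implementation; StubClient stands in for the Bedrock runtime client):

```python
# Illustrative proxy: intercept invoke, record the request and response,
# then delegate to the wrapped client so its behavior is unchanged.
class RecordingWrapper:
    def __init__(self, client):
        self._client = client
        self.captured = []

    def invoke(self, messages):
        response = self._client.invoke(messages)  # original behavior preserved
        self.captured.append({"request": messages, "response": response})
        return response

class StubClient:
    def invoke(self, messages):
        return "ok"
```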
Memori captures streaming responses by collecting chunks and reconstructing the full conversation after streaming completes.
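
That chunk-collection step amounts to concatenating the streamed pieces, sketched here with a stand-in for the AIMessageChunk objects client.stream() yields:

```python
class FakeChunk:
    """Stand-in for an AIMessageChunk yielded by client.stream()."""
    def __init__(self, content: str):
        self.content = content

def reassemble(chunks) -> str:
    # Concatenate streamed pieces back into the full response text.
    return "".join(chunk.content for chunk in chunks)
```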

Supported Model Families

Provider    Model Family   Example Model ID
Anthropic   Claude         anthropic.claude-3-5-sonnet-20241022-v2:0
Meta        Llama          meta.llama3-1-70b-instruct-v1:0
Mistral     Mistral        mistral.mistral-large-2407-v1:0
Amazon      Titan          amazon.titan-text-premier-v1:0
AI21 Labs   Jurassic       ai21.j2-ultra-v1
Cohere      Command        cohere.command-text-v14

Regional Availability

Bedrock models are available in specific AWS regions. Common regions:
  • us-east-1 (N. Virginia)
  • us-west-2 (Oregon)
  • eu-west-1 (Ireland)
  • ap-southeast-1 (Singapore)
Check AWS Bedrock documentation for current availability.
