Stores#
RedisStore and AsyncRedisStore provide cross-thread key-value persistence
with optional vector similarity search. While checkpointers store per-thread
graph state, stores hold shared data that any thread can read and write –
user profiles, knowledge bases, configuration, and more.
Basic Usage#
Creating a Store#
Use from_conn_string for a managed connection lifecycle:
from langgraph.store.redis import RedisStore
with RedisStore.from_conn_string("redis://localhost:6379") as store:
    store.setup()
    # Use the store...
Or pass a Redis client directly:
from redis import Redis
from langgraph.store.redis import RedisStore
client = Redis.from_url("redis://localhost:6379")
store = RedisStore(client)
store.setup()
AsyncRedisStore#
The async variant:
from langgraph.store.redis import AsyncRedisStore
async with AsyncRedisStore.from_conn_string("redis://localhost:6379") as store:
    await store.asetup()
    # Use the store...
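Every store operation has an a-prefixed coroutine counterpart (aput, aget, asearch, adelete). The call pattern looks like this, sketched here against a minimal in-memory stand-in so the snippet runs without a Redis server — substitute the real AsyncRedisStore in practice:

```python
import asyncio

class FakeAsyncStore:
    # Minimal stand-in mirroring the async method names; not the real API surface.
    def __init__(self):
        self._data = {}

    async def aput(self, namespace, key, value):
        self._data[(namespace, key)] = value

    async def aget(self, namespace, key):
        return self._data.get((namespace, key))

async def main():
    store = FakeAsyncStore()
    await store.aput(("users",), "user-1", {"name": "Alice"})
    return await store.aget(("users",), "user-1")

print(asyncio.run(main()))  # {'name': 'Alice'}
```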
Store Operations#
put – Store an Item#
Items are organized by namespace (a tuple of strings) and key (a string):
store.put(
    namespace=("users", "profiles"),
    key="user-123",
    value={"name": "Alice", "role": "admin"},
)
get – Retrieve an Item#
item = store.get(namespace=("users", "profiles"), key="user-123")
if item:
    print(item.value)  # {"name": "Alice", "role": "admin"}
    print(item.key)  # "user-123"
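Besides value and key, returned items carry their namespace and timestamps. A simplified sketch of the item's shape (field names follow langgraph's Item; the class below is for illustration only, not the library's definition):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass
class Item:
    # Simplified illustration of the fields a retrieved store item exposes.
    namespace: tuple[str, ...]  # hierarchical location, e.g. ("users", "profiles")
    key: str                    # unique key within the namespace
    value: dict[str, Any]       # the stored payload
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    updated_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

item = Item(("users", "profiles"), "user-123", {"name": "Alice", "role": "admin"})
print(item.value["name"])  # Alice
```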
delete – Remove an Item#
store.delete(namespace=("users", "profiles"), key="user-123")
search – Find Items by Namespace Prefix#
Search returns all items under a namespace prefix:
results = store.search(namespace_prefix=("users",))
for item in results:
    print(f"{item.namespace}/{item.key}: {item.value}")
Filter results by value fields:
results = store.search(
    namespace_prefix=("users", "profiles"),
    filter={"role": "admin"},
)
Control pagination with limit and offset:
results = store.search(
    namespace_prefix=("users",),
    limit=10,
    offset=20,
)
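limit and offset page through the matched items the way list slicing does — a small sketch of the paging rule (plain Python, not the store's implementation):

```python
# Pretend these are the items matched under the namespace prefix.
matched = [f"item-{i}" for i in range(50)]

limit, offset = 10, 20
page = matched[offset:offset + limit]  # skip 20 items, return the next 10

print(page[0], page[-1])  # item-20 item-29
```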
Namespaces#
Namespaces are tuples of strings that form a hierarchical key space. They work like directory paths:
# Store items in nested namespaces
store.put(("app", "settings"), "theme", {"color": "dark"})
store.put(("app", "settings"), "language", {"locale": "en"})
store.put(("app", "users", "alice"), "preferences", {"notifications": True})
# Search across a namespace prefix
results = store.search(namespace_prefix=("app",)) # Returns all items under "app"
# List unique namespaces
namespaces = store.list_namespaces(prefix=("app",))
# Returns: [("app", "settings"), ("app", "users", "alice")]
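Prefix matching follows the directory-path analogy: a stored namespace matches when its leading components equal the prefix. A pure-Python sketch of that rule (illustrative, not the library's implementation):

```python
def matches_prefix(namespace: tuple, prefix: tuple) -> bool:
    # A namespace matches when it starts with every component of the prefix.
    return namespace[:len(prefix)] == prefix

stored = [
    ("app", "settings"),
    ("app", "users", "alice"),
    ("billing", "invoices"),
]
print([ns for ns in stored if matches_prefix(ns, ("app",))])
# [('app', 'settings'), ('app', 'users', 'alice')]
```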
Vector Search#
RedisStore supports semantic similarity search using vector embeddings. This
requires configuring an IndexConfig with embedding dimensions and a
compatible embedding model.
Configuring Vector Search#
from langgraph.store.redis import RedisStore
from langchain_openai import OpenAIEmbeddings
embeddings = OpenAIEmbeddings(model="text-embedding-3-small")
index_config = {
    "dims": 1536,
    "distance_type": "cosine",
    "fields": ["text"],
    "embed": embeddings,
}
with RedisStore.from_conn_string(
    "redis://localhost:6379",
    index=index_config,
) as store:
    store.setup()

    # Store items with text fields for embedding
    store.put(
        ("docs",), "intro",
        {"text": "LangGraph enables stateful AI workflows."},
    )
    store.put(
        ("docs",), "redis",
        {"text": "Redis provides fast in-memory data storage."},
    )

    # Semantic search
    results = store.search(
        namespace_prefix=("docs",),
        query="How do I build stateful agents?",
        limit=5,
    )
    for item in results:
        print(f"{item.key}: score={item.score:.3f}")
Index Configuration Options#
| Field | Type | Description |
|---|---|---|
| dims | int | Embedding vector dimensions (e.g., 1536 for OpenAI) |
| distance_type | str | Distance metric, e.g. "cosine" |
| fields | list[str] | Value fields to embed for search |
| embed | Embeddings | LangChain-compatible embeddings instance |
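To make the distance metric concrete, here is a pure-Python sketch of two common vector metrics over toy vectors — Redis computes these natively on the indexed embeddings, so this is illustration only:

```python
import math

def cosine_distance(a, b):
    # 0.0 means identical direction; larger means less similar.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / norm

def l2_distance(a, b):
    # Euclidean (straight-line) distance between the two vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

a, b = [1.0, 0.0], [0.0, 1.0]  # orthogonal toy vectors
print(cosine_distance(a, b))   # 1.0
print(l2_distance(a, b))       # 1.4142...
```

Cosine distance is typically the right default for text embeddings, since it ignores vector magnitude and compares direction only.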
Using with LangGraph compile()#
Pass a store alongside a checkpointer when compiling a graph:
from langgraph.checkpoint.redis import RedisSaver
from langgraph.store.redis import RedisStore
with RedisSaver.from_conn_string("redis://localhost:6379") as saver:
    saver.setup()
    with RedisStore.from_conn_string("redis://localhost:6379") as store:
        store.setup()
        graph = builder.compile(checkpointer=saver, store=store)
        result = graph.invoke(inputs, config)
Inside graph nodes, access the store via the store parameter:
from langgraph.store.base import BaseStore
def my_node(state: MyState, store: BaseStore) -> dict:
    # Read from the store
    user = store.get(("users",), state["user_id"])
    # Write to the store
    store.put(("interactions",), state["interaction_id"], {"result": "success"})
    return {"output": user.value["name"] if user else "unknown"}
TTL for Store Items#
Store items support per-item TTL as well as global TTL configuration:
# Global TTL configuration
with RedisStore.from_conn_string(
    "redis://localhost:6379",
    ttl={"default_ttl": 60},  # 60 minutes
) as store:
    store.setup()

    # Per-item TTL (overrides global)
    store.put(
        ("cache",), "temp-result",
        {"data": "expires soon"},
        ttl=5,  # 5 minutes
    )
See TTL Configuration for more details.
Next Steps#
TTL Configuration – configure expiration policies
Checkpointers – per-thread state with RedisSaver
Middleware – add caching and routing middleware