How to add cross-thread persistence (functional API)#

!!! info "Prerequisites"

This guide assumes familiarity with the following:

- [Functional API](https://langchain-ai.github.io/langgraph/concepts/functional_api/)
- [Persistence](https://langchain-ai.github.io/langgraph/concepts/persistence/)
- [Memory](https://langchain-ai.github.io/langgraph/concepts/memory/)
- [Chat Models](https://python.langchain.com/docs/concepts/chat_models/)

LangGraph allows you to persist data across different threads. For instance, you can store information about users (their names or preferences) in a shared (cross-thread) memory and reuse it in new threads (e.g., new conversations).

When using the functional API, you can set up your workflow to store and retrieve memories through the Store interface:

  1. Create an instance of a Store

    from langgraph.store.redis import RedisStore
    from langgraph.store.base import BaseStore

    store = RedisStore.from_conn_string("redis://redis:6379")
    
  2. Pass the store instance to the `entrypoint()` decorator and expose the `store` parameter in the function signature:

    from langgraph.func import entrypoint
    
    @entrypoint(store=store)
    def workflow(inputs: dict, store: BaseStore):
        my_task(inputs).result()
        ...
    

In this guide, we will show how to construct and use a workflow that has a shared memory implemented using the Store interface.

!!! note "Note"

Support for the [`Store`](https://langchain-ai.github.io/langgraph/reference/store/#langgraph.store.base.BaseStore) API that is used in this guide was added in LangGraph `v0.2.32`.

Support for the `index` and `query` arguments of the [`Store`](https://langchain-ai.github.io/langgraph/reference/store/#langgraph.store.base.BaseStore) API that is used in this guide was added in LangGraph `v0.2.54`.

!!! tip "Note"

If you need to add cross-thread persistence to a `StateGraph`, check out this [how-to guide](../cross-thread-persistence).

Setup#

First, let’s install the required packages and set our API keys:

%%capture --no-stderr
%pip install -U langchain_anthropic langchain_openai langgraph
import getpass
import os


def _set_env(var: str):
    if not os.environ.get(var):
        os.environ[var] = getpass.getpass(f"{var}: ")


_set_env("ANTHROPIC_API_KEY")
_set_env("OPENAI_API_KEY")

!!! tip "Set up LangSmith for LangGraph development"

Sign up for LangSmith to quickly spot issues and improve the performance of your LangGraph projects. LangSmith lets you use trace data to debug, test, and monitor your LLM apps built with LangGraph — read more about how to get started [here](https://docs.smith.langchain.com).

Example: simple chatbot with long-term memory#

Define store#

In this example we will create a workflow that can retrieve information about a user’s preferences. We will do so by defining a `RedisStore`: an object that can store data in Redis and query it.

When storing objects using the Store interface you define two things:

  • the namespace for the object, a tuple (similar to directories)

  • the object key (similar to filenames)

In our example, we’ll be using ("memories", <user_id>) as the namespace and a random UUID as the key for each new memory.

Importantly, to determine the user, we will be passing user_id via the config keyword argument of the entrypoint function.

Let’s first define our store!

from langgraph.store.redis import RedisStore
from langgraph.store.base import IndexConfig
from langchain_openai import OpenAIEmbeddings

# Set up Redis connection
REDIS_URI = "redis://redis:6379"

# Create index configuration for vector search
index_config: IndexConfig = {
    "dims": 1536,
    "embed": OpenAIEmbeddings(model="text-embedding-3-small"),
    "ann_index_config": {
        "vector_type": "vector",
    },
    "distance_type": "cosine",
}

# Initialize the Redis store
redis_store = None
with RedisStore.from_conn_string(REDIS_URI, index=index_config) as s:
    s.setup()
    redis_store = s

Create workflow#

import uuid

from langchain_anthropic import ChatAnthropic
from langchain_core.runnables import RunnableConfig
from langchain_core.messages import BaseMessage
from langgraph.func import entrypoint, task
from langgraph.graph import add_messages
from langgraph.checkpoint.redis import RedisSaver
from langgraph.store.base import BaseStore

model = ChatAnthropic(model="claude-sonnet-4-20250514")


@task
def call_model(messages: list[BaseMessage], memory_store: BaseStore, user_id: str):
    namespace = ("memories", user_id)
    last_message = messages[-1]
    memories = memory_store.search(namespace, query=str(last_message.content))
    info = "\n".join([d.value["data"] for d in memories])
    system_msg = f"You are a helpful assistant talking to the user. User info: {info}"

    # Store new memories if the user asks the model to remember
    if "remember" in last_message.content.lower():
        memory = "User name is Bob"
        memory_store.put(namespace, str(uuid.uuid4()), {"data": memory})

    response = model.invoke([{"role": "system", "content": system_msg}] + messages)
    return response


# Set up Redis connection for checkpointer
REDIS_URI = "redis://redis:6379"
checkpointer = None
with RedisSaver.from_conn_string(REDIS_URI) as cp:
    cp.setup()
    checkpointer = cp


# NOTE: we're passing the store object here when creating a workflow via entrypoint()
@entrypoint(checkpointer=checkpointer, store=redis_store)
def workflow(
        inputs: list[BaseMessage],
        *,
        previous: list[BaseMessage],
        config: RunnableConfig,
        store: BaseStore,
):
    user_id = config["configurable"]["user_id"]
    previous = previous or []
    inputs = add_messages(previous, inputs)
    response = call_model(inputs, store, user_id).result()
    return entrypoint.final(value=response, save=add_messages(inputs, response))

!!! note "Note"

If you're using LangGraph Platform or LangGraph Studio, you __don't need__ to pass the store to the `entrypoint()` decorator, since it's done automatically.

Run the workflow!#

Now let’s specify a user ID in the config and tell the model our name:

config = {"configurable": {"thread_id": "1", "user_id": "1"}}
input_message = {"role": "user", "content": "Hi! Remember: my name is Bob"}
for chunk in workflow.stream([input_message], config, stream_mode="values"):
    chunk.pretty_print()
================================== Ai Message ==================================

Hello Bob! Nice to meet you. I'll remember that your name is Bob. How can I help you today?
config = {"configurable": {"thread_id": "2", "user_id": "1"}}
input_message = {"role": "user", "content": "what is my name?"}
for chunk in workflow.stream([input_message], config, stream_mode="values"):
    chunk.pretty_print()
================================== Ai Message ==================================

Your name is Bob! How can I help you today, Bob?

We can now inspect our Redis store and verify that we have in fact saved the memories for the user (duplicate entries simply mean the example has been run more than once — each run stores a new memory under a fresh UUID):

for memory in redis_store.search(("memories", "1")):
    print(memory.value)
{'data': 'User name is Bob'}
{'data': 'User name is Bob'}
{'data': 'User name is Bob'}
{'data': 'User name is Bob'}

Let’s now run the workflow for another user to verify that the memories about the first user are self-contained:

config = {"configurable": {"thread_id": "3", "user_id": "2"}}
input_message = {"role": "user", "content": "what is my name?"}
for chunk in workflow.stream([input_message], config, stream_mode="values"):
    chunk.pretty_print()
================================== Ai Message ==================================

I don't have any information about your name. You haven't introduced yourself to me yet! What would you like me to call you?