redis_openai_agents.middleware.ConversationMemoryMiddleware#
- class ConversationMemoryMiddleware(history, *, session_tag=None, top_k=5, distance_threshold=None, persist_reply=True, response_text_extractor=None)[source]#
Prepends semantically relevant past messages to the model request.
- Parameters:
history (SemanticMessageHistory) – A SemanticMessageHistory instance backing retrieval and storage. Callers are responsible for its lifecycle.
session_tag (str | None) – Tag passed to history queries and inserts. Allows tenant / user / conversation isolation.
top_k (int) – Maximum number of past messages to prepend.
distance_threshold (float | None) – Optional distance cutoff for relevance matching; overrides the history's default when set.
persist_reply (bool) – When True (default), the assistant reply is also stored back into history so follow-up turns can retrieve it.
response_text_extractor (Any) – Optional callable that turns a model response into the text to store. Defaults to best-effort extraction from OpenAI Responses-shaped responses.
- __init__(history, *, session_tag=None, top_k=5, distance_threshold=None, persist_reply=True, response_text_extractor=None)[source]#
- Parameters:
history (SemanticMessageHistory)
session_tag (str | None)
top_k (int)
distance_threshold (float | None)
persist_reply (bool)
response_text_extractor (Any)
- Return type:
None
Methods
__init__(history, *[, session_tag, top_k, ...])
awrap_model_call(request, handler)
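The recall-and-persist flow described above can be sketched with a toy, dependency-free stand-in. This is an illustrative mock, not the real implementation: `ToyHistory` replaces `SemanticMessageHistory` with keyword-overlap scoring instead of vector similarity, requests are plain dicts rather than the Agents SDK request objects, and `wrap_model_call` is shown synchronously. Only the parameter names (`session_tag`, `top_k`, `persist_reply`) come from the documented signature.

```python
from dataclasses import dataclass, field


@dataclass
class ToyHistory:
    """In-memory stand-in for SemanticMessageHistory (keyword overlap, not vectors)."""
    messages: list = field(default_factory=list)

    def add(self, text, session_tag=None):
        self.messages.append((session_tag, text))

    def relevant(self, query, top_k, session_tag=None):
        # Score stored messages for this session_tag by shared-word count.
        q = set(query.lower().split())
        scored = [
            (len(q & set(text.lower().split())), text)
            for tag, text in self.messages
            if tag == session_tag
        ]
        scored = sorted((s, t) for s, t in scored if s > 0)
        return [t for _, t in reversed(scored)][:top_k]


class ToyConversationMemoryMiddleware:
    """Mimics the documented behavior: recall before the call, persist after."""

    def __init__(self, history, *, session_tag=None, top_k=5, persist_reply=True):
        self.history = history
        self.session_tag = session_tag
        self.top_k = top_k
        self.persist_reply = persist_reply

    def wrap_model_call(self, request, handler):
        # 1. Prepend relevant past messages to the outgoing request.
        query = request["messages"][-1]
        recalled = self.history.relevant(query, self.top_k, self.session_tag)
        request = {**request, "messages": recalled + request["messages"]}
        # 2. Delegate to the wrapped model call.
        reply = handler(request)
        # 3. Store this turn (and, optionally, the reply) for future recall.
        self.history.add(query, self.session_tag)
        if self.persist_reply:
            self.history.add(reply, self.session_tag)
        return reply


hist = ToyHistory()
hist.add("the deploy password is stored in vault", session_tag="u1")
mw = ToyConversationMemoryMiddleware(hist, session_tag="u1", top_k=2)

seen = {}
def handler(req):
    seen["messages"] = req["messages"]
    return "check the vault"

reply = mw.wrap_model_call({"messages": ["where is the deploy password?"]}, handler)
```

After the call, `handler` sees the recalled message prepended ahead of the current turn, and because `persist_reply` defaults to `True`, both the user turn and the reply are written back to the history for follow-up turns.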