redis_openai_agents.MiddlewareStack#
- class MiddlewareStack(model, middlewares)[source]#
Compose middleware around an OpenAI Agents SDK Model.

Example:

base = OpenAIResponsesModel(model="gpt-4o")
stack = MiddlewareStack(
    model=base,
    middlewares=[SemanticCacheMiddleware(cache), MetricsMiddleware(metrics)],
)
result = await Runner.run(agent, "Hello", model=stack)
- Parameters:
model (Any) – The inner Model instance that performs the actual LLM call.
middlewares (Sequence[AgentMiddleware]) – Sequence of middleware applied outermost-first.
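The "applied outermost-first" contract means the first entry in middlewares sees each request before any later entry, and sees the response last. A minimal self-contained sketch of that ordering, assuming a middleware callable of the form `(request, next_handler)` (the class name, signatures, and compose helper here are illustrative, not the real redis_openai_agents API):

```python
import asyncio


class LoggingMiddleware:
    """Records enter/exit events so the wrapping order is visible."""

    def __init__(self, name, log):
        self.name = name
        self.log = log

    async def __call__(self, request, next_handler):
        self.log.append(f"enter {self.name}")
        response = await next_handler(request)
        self.log.append(f"exit {self.name}")
        return response


def compose(middlewares, model_call):
    # Fold right-to-left so middlewares[0] ends up outermost,
    # matching the "applied outermost-first" contract.
    handler = model_call
    for mw in reversed(middlewares):
        handler = (lambda m, nxt: (lambda req: m(req, nxt)))(mw, handler)
    return handler


async def demo():
    log = []

    async def model_call(request):
        log.append("model")
        return request.upper()

    handler = compose(
        [LoggingMiddleware("cache", log), LoggingMiddleware("metrics", log)],
        model_call,
    )
    result = await handler("hello")
    return result, log


if __name__ == "__main__":
    result, log = asyncio.run(demo())
    print(result)  # HELLO
    print(log)     # ['enter cache', 'enter metrics', 'model', 'exit metrics', 'exit cache']
```

With this ordering, placing SemanticCacheMiddleware before MetricsMiddleware means the cache wraps metrics, not the other way around.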
- __init__(model, middlewares)[source]#
- Parameters:
model (Any)
middlewares (Sequence[AgentMiddleware])
- Return type:
None
Methods
__init__(model, middlewares)
close() – Release any resources held by the model.
get_response(system_instructions, input, ...) – Run the middleware chain then the underlying model.
get_retry_advice(request) – Return provider-specific retry guidance for a failed model request.
stream_response(system_instructions, input, ...) – Delegate streaming directly to the inner model.
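Because get_response runs the middleware chain before the underlying model, an outer middleware can short-circuit the call entirely, which is why a cache belongs in the outermost position. A toy sketch of that behavior, assuming the same illustrative `(request, next_handler)` middleware shape as above (DictCacheMiddleware is a hypothetical stand-in, not the real SemanticCacheMiddleware):

```python
import asyncio


class DictCacheMiddleware:
    """Toy cache: on a hit, inner middlewares and the model are skipped entirely."""

    def __init__(self):
        self.store = {}

    async def __call__(self, request, next_handler):
        if request in self.store:
            return self.store[request]  # short-circuit the rest of the chain
        response = await next_handler(request)
        self.store[request] = response
        return response


async def demo():
    calls = []

    async def model_call(request):
        calls.append(request)
        return request[::-1]

    cache = DictCacheMiddleware()
    first = await cache("abc", model_call)
    second = await cache("abc", model_call)  # cache hit: model_call is not re-run
    return first, second, len(calls)


if __name__ == "__main__":
    print(asyncio.run(demo()))  # ('cba', 'cba', 1)
```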
Attributes
inner – The wrapped Model instance.
middlewares – Applied middlewares, in outer-to-inner order.