redis_openai_agents.MiddlewareStack#

class MiddlewareStack(model, middlewares)[source]#

Compose middleware around an OpenAI Agents SDK Model.

Example:

base = OpenAIResponsesModel(model="gpt-4o")
stack = MiddlewareStack(
    model=base,
    middlewares=[SemanticCacheMiddleware(cache), MetricsMiddleware(metrics)],
)
result = await Runner.run(agent, "Hello", model=stack)
Parameters:
  • model (Any) – The inner Model instance that performs the actual LLM call.

  • middlewares (Sequence[AgentMiddleware]) – Sequence of middleware applied outermost-first.
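The outermost-first ordering can be sketched with plain functions. The names below (`compose`, `logging_mw`, `cache_mw`) are illustrative only and not part of this package:

```python
# Minimal sketch of outermost-first middleware composition.
# These names are illustrative, not part of redis_openai_agents.

def compose(handler, middlewares):
    """Wrap `handler` so that middlewares[0] runs first (outermost)."""
    for mw in reversed(middlewares):
        handler = mw(handler)
    return handler

def logging_mw(next_handler):
    def wrapped(request):
        return ["log-in"] + next_handler(request) + ["log-out"]
    return wrapped

def cache_mw(next_handler):
    def wrapped(request):
        return ["cache-in"] + next_handler(request) + ["cache-out"]
    return wrapped

def model_call(request):
    # Stands in for the inner Model's LLM call.
    return [f"model({request})"]

chain = compose(model_call, [logging_mw, cache_mw])
print(chain("hi"))
# → ['log-in', 'cache-in', 'model(hi)', 'cache-out', 'log-out']
```

The first middleware in the sequence sees the request first and the response last, exactly as `middlewares[0]` does in a `MiddlewareStack`.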

__init__(model, middlewares)[source]#

Parameters:
  • model (Any) – The inner Model instance that performs the actual LLM call.

  • middlewares (Sequence[AgentMiddleware]) – Sequence of middleware applied outermost-first.

Return type:

None

Methods

__init__(model, middlewares)

close()

Release any resources held by the model.

get_response(system_instructions, input, ...)

Run the middleware chain then the underlying model.
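Because each middleware decides whether to call the next handler, a middleware can short-circuit the chain and skip the underlying model entirely, as a semantic cache would on a hit. A hedged sketch with illustrative names (not this package's actual implementation):

```python
# Hypothetical sketch of a short-circuiting middleware: on a cache hit
# the chain returns early and the model is never invoked.
# Names are illustrative, not part of redis_openai_agents.

def make_cache_mw(cache: dict):
    def middleware(next_handler):
        def wrapped(prompt):
            if prompt in cache:
                return cache[prompt]        # hit: skip the model entirely
            result = next_handler(prompt)   # miss: fall through to the model
            cache[prompt] = result
            return result
        return wrapped
    return middleware

calls = []

def model_call(prompt):
    # Stands in for the inner Model's LLM call; records each invocation.
    calls.append(prompt)
    return f"response:{prompt}"

cache = {}
handler = make_cache_mw(cache)(model_call)
handler("hello")   # miss: model is called
handler("hello")   # hit: served from the cache
print(calls)       # the model ran only once
```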

get_retry_advice(request)

Return provider-specific retry guidance for a failed model request.

stream_response(system_instructions, input, ...)

Delegate streaming directly to the inner model.
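The split between the two paths — non-streaming calls go through the middleware chain while streaming bypasses it — can be sketched as follows. This is an illustrative pattern, not the package's actual code:

```python
# Illustrative delegation sketch: get_response runs the middleware chain,
# while stream_response goes straight to the wrapped model.
# Names are illustrative, not part of redis_openai_agents.

class InnerModel:
    def get_response(self, prompt):
        return f"response:{prompt}"

    def stream_response(self, prompt):
        # Yield the response piece by piece.
        yield from f"response:{prompt}"

class Stack:
    def __init__(self, model, middlewares):
        self.inner = model
        self.middlewares = list(middlewares)

    def get_response(self, prompt):
        # Wrap the inner call so middlewares[0] is outermost.
        handler = self.inner.get_response
        for mw in reversed(self.middlewares):
            handler = mw(handler)
        return handler(prompt)

    def stream_response(self, prompt):
        # Streaming delegates directly to the inner model.
        return self.inner.stream_response(prompt)

stack = Stack(InnerModel(), [])
print("".join(stack.stream_response("hi")))  # → response:hi
```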

Attributes

inner

The wrapped Model instance.

middlewares

Applied middlewares, in outer-to-inner order.