Usage Guide¶
This guide covers all endpoints, request/response formats, and error handling.
Endpoints overview¶
When you register a graph named my_agent, the following endpoints are created:
| Method | Route | Description |
|---|---|---|
| POST | /api/graphs/my_agent/invoke | Synchronous graph invocation |
| POST | /api/graphs/my_agent/stream | Buffered SSE streaming |
| GET | /api/graphs/my_agent/threads/{thread_id}/state | Thread state inspection |
| GET | /api/health | Health check with registered graph list |
Invoke endpoint¶
Request¶
```json
{
  "input": {
    "messages": [{"role": "human", "content": "Hello!"}]
  },
  "config": {
    "configurable": {"thread_id": "conversation-1"}
  },
  "metadata": {
    "user_id": "abc-123"
  }
}
```
| Field | Type | Required | Description |
|---|---|---|---|
| input | dict | Yes | Input state for the graph |
| config | dict or null | No | LangGraph config (e.g., thread_id for checkpointer) |
| metadata | dict or null | No | Additional metadata passed to the run |
Response¶
```json
{
  "output": {
    "messages": [
      {"role": "human", "content": "Hello!"},
      {"role": "assistant", "content": "Hi there!"}
    ]
  }
}
```
The output field contains the full graph output state.
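On the client side, building the request body and reading the reply needs nothing beyond the standard library. A minimal sketch (the HTTP call itself is elided; the response literal below mirrors the documented example):

```python
import json

# Build an invoke request body (shape taken from the example above).
payload = {
    "input": {"messages": [{"role": "human", "content": "Hello!"}]},
    "config": {"configurable": {"thread_id": "conversation-1"}},
}
body = json.dumps(payload)

# Parse a response and pull out the assistant's reply.
response = json.loads(
    '{"output": {"messages": ['
    '{"role": "human", "content": "Hello!"}, '
    '{"role": "assistant", "content": "Hi there!"}]}}'
)
reply = response["output"]["messages"][-1]["content"]
```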
Stream endpoint¶
Request¶
```json
{
  "input": {
    "messages": [{"role": "human", "content": "Hello!"}]
  },
  "config": {
    "configurable": {"thread_id": "conversation-1"}
  },
  "stream_mode": "values"
}
```
| Field | Type | Required | Default | Description |
|---|---|---|---|---|
| input | dict | Yes | — | Input state for the graph |
| config | dict or null | No | null | LangGraph config |
| stream_mode | str | No | "values" | Stream mode: "values", "updates", "messages", or "custom" |
| metadata | dict or null | No | null | Additional metadata |
Response¶
The response is formatted as Server-Sent Events (SSE):
```
event: data
data: {"messages": [{"role": "human", "content": "Hello!"}]}

event: data
data: {"messages": [{"role": "human", "content": "Hello!"}, {"role": "assistant", "content": "Hi!"}]}

event: end
data: {}
```
Buffered streaming (v0.2)
In v0.2, streaming is buffered — all chunks are collected first, then returned as a single SSE-formatted HTTP response. This is not true chunked streaming. True streaming support is planned for a future release.
Stream error handling¶
If the graph raises an exception during streaming, an error event is included:
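The exact wire shape of the error event is not reproduced here. As an illustration only — the `event: error` name and the `{"error": ...}` payload below are assumptions, not the library's confirmed format — a client might split the buffered SSE body into events and check for a failure like this:

```python
def parse_sse(body: str):
    """Split a buffered SSE payload into (event, data) pairs."""
    events = []
    for block in body.strip().split("\n\n"):
        event, data = None, ""
        for line in block.splitlines():
            if line.startswith("event:"):
                event = line[len("event:"):].strip()
            elif line.startswith("data:"):
                data = line[len("data:"):].strip()
        events.append((event, data))
    return events

# Hypothetical stream that fails mid-run.
body = (
    "event: data\n"
    'data: {"messages": []}\n'
    "\n"
    "event: error\n"
    'data: {"error": "graph raised ValueError"}\n'
    "\n"
)
for event, data in parse_sse(body):
    if event == "error":
        print("stream failed:", data)
```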
Health endpoint¶
Request¶
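The health endpoint takes no request body; a plain GET to the route from the endpoints table above is all that is required:

```
GET /api/health
```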
Response¶
```json
{
  "status": "ok",
  "graphs": [
    {
      "name": "my_agent",
      "description": "Customer support agent",
      "has_checkpointer": true
    }
  ]
}
```
State endpoint¶
Request¶
Returns the current state of a thread for graphs compiled with a checkpointer (graphs satisfying the StatefulGraph protocol).
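The request is a plain GET to the route from the endpoints table above, with the thread ID as a path parameter:

```
GET /api/graphs/my_agent/threads/{thread_id}/state
```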
Response¶
```json
{
  "values": {"messages": [...]},
  "next": [],
  "metadata": {"source": "loop", "step": 2},
  "config": {"configurable": {"thread_id": "session-123"}},
  "created_at": "2025-01-01T00:00:00Z",
  "parent_config": null
}
```
| Status Code | Condition |
|---|---|
| 200 | Thread state found |
| 404 | Thread not found or graph is not stateful |
| 500 | Internal error |
OpenAPI generation¶
Use azure-functions-openapi with
azure_functions_langgraph.openapi.register_with_openapi. See
Migrating to azure-functions-openapi below.
Migrating to azure-functions-openapi¶
Starting with v0.5.0, OpenAPI spec generation is moving to the dedicated
azure-functions-openapi package.
Using the bridge¶
```python
from azure_functions_langgraph import LangGraphApp
from azure_functions_langgraph.openapi import register_with_openapi

app = LangGraphApp()
app.register(graph=graph, name="agent", request_model=MyRequest)

# Forward route metadata to azure-functions-openapi
count = register_with_openapi(app)
print(f"Registered {count} routes with azure-functions-openapi")
```
The bridge reads route metadata from LangGraphApp.get_app_metadata() and forwards it
to azure-functions-openapi's register_openapi_metadata() programmatic API.
Requires azure-functions-openapi >= 0.16.0.
Error responses¶
All error responses follow a consistent format:
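The error body itself is not reproduced here; a typical shape for this kind of format (an assumption for illustration, not the library's confirmed schema) would be:

```json
{
  "error": "Graph execution failed",
  "detail": "..."
}
```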
| Status Code | Condition |
|---|---|
| 400 | Invalid JSON body |
| 404 | Thread not found (state endpoint) |
| 422 | Request validation error (missing/invalid fields) |
| 500 | Graph execution failed |
| 501 | Stream requested on a graph that does not support stream() |
Using with checkpointers¶
Thread-based conversation state is managed through LangGraph's checkpointer mechanism. The library passes the config from the request body directly to graph.invoke() or graph.stream().
```python
from langgraph.checkpoint.memory import InMemorySaver

checkpointer = InMemorySaver()
graph = builder.compile(checkpointer=checkpointer)

app = LangGraphApp()
app.register(graph=graph, name="stateful_agent")
```
Then in requests, include thread_id:
```json
{
  "input": {"messages": [{"role": "human", "content": "Remember my name is Alice"}]},
  "config": {"configurable": {"thread_id": "session-abc"}}
}
```
Subsequent requests with the same thread_id will resume the conversation:
```json
{
  "input": {"messages": [{"role": "human", "content": "What is my name?"}]},
  "config": {"configurable": {"thread_id": "session-abc"}}
}
```
Invoke-only graphs¶
If your graph only supports invoke() (no stream() method), you can still register it. The invoke endpoint works, but the stream endpoint returns a 501 error:
Per-graph authentication¶
Override the app-level auth for individual graphs:
```python
import azure.functions as func

app = LangGraphApp(auth_level=func.AuthLevel.FUNCTION)

# Public graph — no auth required
app.register(graph=public_graph, name="public", auth_level=func.AuthLevel.ANONYMOUS)

# Admin graph — requires master key
app.register(graph=admin_graph, name="admin", auth_level=func.AuthLevel.ADMIN)
```
When auth_level is passed to register(), it overrides the app-level setting for that graph's endpoints only.