| Field | Type | Required | Description | Example |
|---|---|---|---|---|
| save_chat | Optional[bool] | ➖ | Save the current interaction as a Chat for the user to access and potentially continue later. | |
| chat_id | Optional[str] | ➖ | The id of the Chat that context should be retrieved from and messages added to. An empty id starts a new Chat, and the Chat is saved if saveChat is true. | |
| messages | List[models.ChatMessage] | ✔️ | A list of chat messages, from most recent to least recent. It can be assumed that the first chat message in the list is the user's most recent query. | |
| agent_config | Optional[models.AgentConfig] | ➖ | Describes the agent that executes the request. | |
| inclusions | Optional[models.ChatRestrictionFilters] | ➖ | N/A | |
| exclusions | Optional[models.ChatRestrictionFilters] | ➖ | N/A | |
| timeout_millis | Optional[int] | ➖ | Timeout in milliseconds for the request. A 408 error will be returned if handling the request takes longer. | 30000 |
| session_info | Optional[models.SessionInfo] | ➖ | N/A | |
| application_id | Optional[str] | ➖ | The ID of the application this request originates from, used to determine the configuration of underlying chat processes. This should correspond to the ID set during admin setup. If not specified, the default chat experience will be used. | |
| agent_id | Optional[str] | ➖ | The ID of the Agent that should process this chat request. Only Agents with trigger set to 'User chat message' are invokable through this API. If not specified, the default chat experience will be used. | |
| stream | Optional[bool] | ➖ | If set, response lines will be streamed one-by-one as they become available. Each will be a ChatResponse, formatted as JSON, and separated by a new line. If false, the entire response will be returned at once. Note that if this is set and the model being used does not support streaming, the model's response will not be streamed, but other messages from the endpoint still will be. | |
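As a rough illustration of how these fields fit together, the sketch below assembles a request body using the field names from the table and parses a streamed response (one JSON-encoded ChatResponse per line, as described for `stream`). This is a hand-rolled sketch, not the SDK's own client: the helper names `build_chat_request` and `parse_stream` are hypothetical, the message shape is a placeholder, and the real wire format may use different casing for keys.

```python
import json

def build_chat_request(messages, stream=False, timeout_millis=30000, agent_id=None):
    """Assemble a chat request body from the fields in the table above.

    `messages` is required and ordered most recent first; the first entry
    is assumed to be the user's most recent query. Optional fields are
    omitted when not provided. (Hypothetical helper, not part of the SDK.)
    """
    body = {
        "messages": messages,
        "stream": stream,
        "timeout_millis": timeout_millis,  # a 408 is returned if handling exceeds this
    }
    if agent_id is not None:
        # Only Agents triggered by 'User chat message' are invokable this way.
        body["agent_id"] = agent_id
    return body

def parse_stream(raw_response):
    """Parse a streamed response: one ChatResponse JSON object per line."""
    return [json.loads(line) for line in raw_response.splitlines() if line.strip()]
```

With `stream=True` the caller would iterate over response lines as they arrive; with `stream=False` the entire response body is a single object and `parse_stream` is unnecessary.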