}
```

### Cohere `/v2/chat` Inference

#### Example Cohere `/v2/chat` inference

```json
{
    "api_type": "cohere",
    "headers": {
        "Authorization": "bearer CO_API_KEY",
        "accept": "application/json",
        "content-type": "application/json"
    },
    "host": "https://api.cohere.com/v2/chat",
    "model": "command-r-plus-08-2024",
    "stream": false
}
```
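To illustrate how a config like the one above maps onto an actual `/v2/chat` request, here is a minimal sketch in Python. The `build_chat_request` helper and the example prompts are hypothetical, not part of burpference; in practice the `CO_API_KEY` placeholder would be substituted with a real key, and the assembled request would be sent with an HTTP client such as `requests`.

```python
# A burpference-style Cohere config, mirroring the JSON example above.
config = {
    "api_type": "cohere",
    "headers": {
        "Authorization": "bearer CO_API_KEY",  # placeholder, replaced at runtime
        "accept": "application/json",
        "content-type": "application/json",
    },
    "host": "https://api.cohere.com/v2/chat",
    "model": "command-r-plus-08-2024",
    "stream": False,
}


def build_chat_request(config, system_prompt, user_content):
    """Assemble the URL, headers, and JSON body for a Cohere /v2/chat call."""
    body = {
        "model": config["model"],
        "stream": config["stream"],
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_content},
        ],
    }
    return config["host"], config["headers"], body


url, headers, body = build_chat_request(
    config,
    "You are a security analyst reviewing proxied HTTP traffic.",  # example system prompt
    "GET / HTTP/1.1 ...",  # example request/response pair to analyse
)
# The actual call would then be:
#   requests.post(url, headers=headers, json=body)
```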
## Model System Prompts

By default, the system prompt sent as pretext to the model is defined [here](../prompts/proxy_prompt.txt); feel free to edit, tune, and tweak it as you see fit. The same applies to the scanner extension tab.