Input interface for ChatCohere

- `apiKey` (Optional): The API key to use.
- `model` (Optional): The name of the model to use.
- `streamUsage` (Optional): Whether or not to include token usage when streaming. This will include an extra chunk at the end of the stream with `eventType: "stream-end"` and the token usage in `usage_metadata`.
- `streaming` (Optional): Whether or not to stream the response.
- `temperature` (Optional): The sampling temperature to use, between 0.0 and 2.0. Higher values like 0.8 make the output more random, while lower values like 0.2 make it more focused and deterministic.
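The fields above can be sketched as a TypeScript interface. This is an illustrative summary only, not the actual `ChatCohereInput` definition from the library, which may declare additional fields:

```typescript
// Sketch of the input fields described above (hypothetical shape,
// not the library's own declaration).
interface ChatCohereInput {
  apiKey?: string;       // The API key to use.
  model?: string;        // The name of the model to use.
  streamUsage?: boolean; // Include token usage when streaming.
  streaming?: boolean;   // Stream the response.
  temperature?: number;  // Sampling temperature, 0.0 to 2.0.
}

// Example configuration: a low temperature for more deterministic
// output, with streaming and per-stream token usage enabled.
// The model name here is an assumption for illustration.
const config: ChatCohereInput = {
  model: "command-r",
  temperature: 0.2,
  streaming: true,
  streamUsage: true,
};

console.log(config);
```

Because every field is optional, any subset of these properties is a valid configuration object.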