Snap Pack History: OpenAI LLM
- Added the following Snaps to the OpenAI LLM Snap Pack:
- Enhanced the following Snaps with Advanced response configurations to customize response behavior and enable precise tuning of outputs:
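The specific configuration fields exposed by the Snaps are not enumerated in this history. As a point of reference, response tuning in the underlying OpenAI Chat Completions API is typically controlled by sampling parameters such as `temperature`, `top_p`, and `max_tokens`; the sketch below shows an illustrative request body with assumed values, not Snap defaults:

```python
import json

# Illustrative response-tuning parameters as they would appear in an
# OpenAI Chat Completions request body. The values here are assumptions
# chosen for the example, not defaults of the Snap or the API.
request_body = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Summarize SnapLogic in one line."}],
    "temperature": 0.2,   # lower values make output more deterministic
    "top_p": 0.9,         # nucleus-sampling cutoff
    "max_tokens": 64,     # cap on the number of generated tokens
}

print(json.dumps(request_body, indent=2))
```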
- Added the following Snaps to the OpenAI LLM Snap Pack:
- Enhanced the OpenAI Chat Completions Snap with the Stop sequences field to support specifying stop sequences as a model parameter. You can specify sequences of tokens or text that indicate to the model when it should stop generating text.
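Conceptually, a stop sequence truncates generation at the point where the sequence would be emitted, and the sequence itself is not included in the output (the OpenAI Chat Completions API accepts up to four stop sequences per request). A minimal sketch of that truncation behavior — the helper name is hypothetical, not part of the Snap or the API:

```python
def apply_stop_sequences(text: str, stop: list[str]) -> str:
    """Truncate text at the earliest occurrence of any stop sequence,
    mimicking how generation halts once a stop sequence is produced.
    The matched stop sequence is excluded from the returned text."""
    cut = len(text)
    for seq in stop:
        idx = text.find(seq)
        if idx != -1 and idx < cut:
            cut = idx
    return text[:cut]

print(apply_stop_sequences("Step 1: mix.\nStep 2: bake.\nEND\nextra", ["END"]))
```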
- Added the following Snaps to the OpenAI LLM Snap Pack:
- Enhanced the OpenAI Chat Completions Snap with the Advanced prompt configurations field set, which enables the Snap to parse JSON objects, include them in the output, and configure prompts to guide the model's responses or actions.
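As an illustration of parsing a JSON object so it can be included in an output document — the function and field names below are assumptions for the sketch, not the Snap's actual schema:

```python
import json

def merge_json_into_output(raw: str, output: dict) -> dict:
    """Parse a JSON string and attach the resulting object to an output
    document. Function and key names are illustrative only; they do not
    reflect the Snap's actual field names."""
    parsed = json.loads(raw)
    merged = dict(output)          # leave the original document untouched
    merged["parsed"] = parsed      # include the parsed object in the output
    return merged

doc = merge_json_into_output('{"sentiment": "positive"}', {"id": 1})
print(doc)
```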
- Fixed an issue with the OpenAI Prompt Generator Snap that displayed an error when handling the mustache operator {{.}} from the input schema.
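In mustache templating, `{{.}}` is the implicit iterator: it refers to the current context value itself rather than a named key. A minimal sketch of that substitution, purely to illustrate the operator's semantics (this is not the Snap's implementation, and real mustache engines also handle sections, escaping, and named keys):

```python
def render_dot(template: str, context) -> str:
    """Replace the mustache implicit iterator {{.}} with the current
    context value. Illustrative only: covers just the {{.}} operator."""
    return template.replace("{{.}}", str(context))

print(render_dot("Input was: {{.}}", "hello"))  # prints "Input was: hello"
```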
- Updated and certified against the current SnapLogic Platform release.
- Added support for Model configuration in the OpenAI Embedder Snap.
- Updated and certified against the current SnapLogic Platform release.
- Introduced the OpenAI LLM Snap Pack, which contains the following Snaps and Account: