Amazon Bedrock Converse API
Overview
You can use this Snap to generate messages using the specified Amazon Bedrock Converse API model and model parameters.
- Transform-type Snap
- Works in Ultra Pipelines
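Conceptually, the Snap builds and submits a Converse request to Amazon Bedrock and returns the generated message. As a rough sketch only (not the Snap's internal implementation), a direct call to the same API with boto3 could look like the following; the region, model ID, and conversation text are placeholder assumptions:

```python
import boto3

# Placeholder region and model ID; the Snap's account configuration normally supplies these.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

# The Converse API accepts the conversation history as alternating user/assistant messages.
messages = [
    {"role": "user", "content": [{"text": "What is SnapLogic?"}]},
    {"role": "assistant", "content": [{"text": "SnapLogic is an integration platform as a service."}]},
    {"role": "user", "content": [{"text": "Name two common use cases."}]},
]

response = client.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=messages,
)

# The generated assistant message is returned under output.message.content.
print(response["output"]["message"]["content"][0]["text"])
```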
Prerequisites
None.
Limitations and known issues
None.
Snap views
View | Description | Examples of upstream and downstream Snaps |
---|---|---|
Input | This Snap supports a maximum of one binary or document input view. | |
Output | This Snap has at most one document output view. The Snap provides the result generated by the Amazon Bedrock Converse API. | Mapper |
Error | Error handling is a generic way to handle errors without losing data or failing the Snap execution. You can handle the errors that the Snap might encounter when running the pipeline by choosing one of the following options from the When errors occur list under the Views tab: Stop Pipeline Execution, Discard Error Data and Continue, or Route Error Data to Error View. Learn more about Error handling in Pipelines. |
Snap settings
- Suggestion icon: Indicates a list that is dynamically populated based on the configuration.
- Expression icon: Indicates whether the value is an expression (if enabled) or a static value (if disabled). Learn more about Using Expressions in SnapLogic.
- Add icon: Indicates that you can add fields in the field set.
- Remove icon: Indicates that you can remove fields from the field set.
Field / Field set | Type | Description |
---|---|---|
Label | String |
Required. Specify a unique name for the Snap. Modify this to be more specific, especially if there is more than one of the same Snap in the pipeline. Default value: Amazon Bedrock Converse API Example: Create customer support chatbots |
Model name | String/Expression/Suggestion |
Required. Specify the name of the model to use with the Converse API. Learn more about the list of supported Amazon Bedrock Converse API models. Default value: N/A Example: anthropic.claude-3-sonnet |
Use message payload | Checkbox |
Select this checkbox to generate responses using the messages specified in the Message payload field. Note:
Default status: Deselected |
Message payload | String/Expression |
Appears when you select the Use message payload checkbox. Required. Specify the messages to send to the Converse API. The expected data type for this field is a list of objects (a list of messages). You can generate this list with the Amazon Bedrock Prompt Generator Snap; the expected message shape also appears in the sketch after this table.
Note:
Default value: N/A Example: $messages |
Prompt | String/Expression |
Appears when you select Document as the Input type. Required. Specify the prompt to send to the Converse API as the user message. Default value: N/A Example: $msg |
Model parameters | Configure the parameters to tune the model runtime. An illustrative mapping of these parameters to a Converse API request appears after this table. | |
Maximum tokens | Integer/Expression |
Specify the maximum number of tokens to generate in the chat completion. If left blank, the default is the specified model's maximum allowed value. Learn more. Note: The response may be incomplete if the sum of the prompt tokens and Maximum tokens exceeds the allowed token limit for the model. Minimum value: 1 Default value: N/A Example: 100 |
Temperature | Decimal/Expression |
Specify the sampling temperature to use, a decimal value between 0 and 1. If left blank, the model uses its default value. Learn more. Default value: N/A Example: 0.2 |
Top P | Decimal/Expression |
Specify the nucleus sampling value, a decimal value between 0 and 1. If left blank, the model will use its default value. Learn more. Default value: N/A Example: 0.2 |
Stop sequences | String/Expression |
Specify a list of stop sequences. A stop sequence is a sequence of characters that causes the model to stop generating the response. To use a list of strings, the expression must be enabled for this field. Default value: N/A Example:
|
Advanced prompt configuration | Appears when a compatible model is selected. Learn more about the supported models. Configure the advanced prompt settings. |
|
JSON mode | Checkbox/Expression |
Select this checkbox to enable the model to generate strings that can be parsed into valid JSON objects. The output includes a field named json_output that contains the parsed JSON object, encapsulating the data. Learn more about the supported models. Note:
Default status: Deselected |
System Prompt | String/Expression |
Specify the prompt (initial instruction). This prompt prepares the model for the conversation by defining its role, personality, tone, and other relevant details so that it can understand and respond to the user's input. Learn more about the supported models. Note:
Default value: N/A Example:
|
Snap execution | Dropdown list |
Select one of the three modes in which the Snap executes. Available options are: Validate & Execute, Execute only, and Disabled.
Default value: Validate & Execute Example: Execute only |
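For reference, the settings above map onto parameters of a Converse API request. The sketch below is an illustrative mapping under assumed values (the model ID, prompt text, and parameter values are placeholders), not the Snap's internal implementation:

```python
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")  # placeholder region

# Message payload: a list of message objects, for example produced upstream by the
# Amazon Bedrock Prompt Generator Snap (shown here in the Converse API message shape).
messages = [{"role": "user", "content": [{"text": "List three uses for SnapLogic pipelines."}]}]

response = client.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",            # Model name
    messages=messages,                                            # Message payload / Prompt
    system=[{"text": "You are a concise technical assistant."}],  # System Prompt
    inferenceConfig={
        "maxTokens": 100,           # Maximum tokens
        "temperature": 0.2,         # Temperature
        "topP": 0.2,                # Top P
        "stopSequences": ["###"],   # Stop sequences
    },
)

print(response["output"]["message"]["content"][0]["text"])
```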
Additional information
The following table lists the supported models for the JSON mode and System prompt fields:
Field name | Supported models |
---|---|
JSON mode | This field supports only models whose IDs start with anthropic.claude (see the illustration below). |
System prompt |
|
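As a conceptual illustration of what JSON mode surfaces in the json_output field, the hedged sketch below calls the Converse API directly with a Claude model, asks for a JSON-only answer, and parses the reply. The model ID and prompt are placeholder assumptions, and this is not the Snap's internal implementation:

```python
import json

import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")  # placeholder region

response = client.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # a model whose ID starts with anthropic.claude
    system=[{"text": "Respond only with a valid JSON object and no extra text."}],
    messages=[{"role": "user", "content": [{"text": 'Return {"sentiment": "..."} for the review: "Great product!"'}]}],
)

raw_text = response["output"]["message"]["content"][0]["text"]

# Assumes the model honored the JSON-only instruction; the parsed object is
# comparable to what the Snap exposes as json_output.
parsed = json.loads(raw_text)
print(parsed)
```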