AgentCreator FAQ

Frequently Asked Questions about SnapLogic AgentCreator

Is AgentCreator the same feature as GenAI App Builder?

Yes. GenAI App Builder was the original name of SnapLogic's GenAI feature, and AgentCreator is the evolution of that feature.

Is AgentCreator an LLM?

No, but it can connect to LLMs. Supported LLM providers include Claude on Amazon Bedrock, OpenAI, Azure OpenAI, and Google Gemini.

Can I build LLM apps with AgentCreator?

Yes, you can build pipelines that connect to LLMs in your system.

Does AgentCreator support tool calling?

Yes. AgentCreator supports the tool calling capabilities of Amazon, OpenAI, and Azure OpenAI models.
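
To illustrate what tool calling means in general terms (outside of a SnapLogic pipeline), here is a minimal sketch using the OpenAI Python SDK. The model name and the get_order_status function are illustrative assumptions, not part of AgentCreator.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Describe a tool the model is allowed to call; the name and schema
# below are illustrative assumptions for this sketch.
tools = [{
    "type": "function",
    "function": {
        "name": "get_order_status",
        "description": "Look up the status of an order by its ID.",
        "parameters": {
            "type": "object",
            "properties": {"order_id": {"type": "string"}},
            "required": ["order_id"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[{"role": "user", "content": "Where is order 42?"}],
    tools=tools,
)

# If the model decided to call the tool, its request appears here.
print(response.choices[0].message.tool_calls)
```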

Can I use AgentCreator to build Agents?

Yes. With support for Amazon Bedrock agents, tool calling capabilities from multiple LLM vendors, and the introduction of the Pipe Loop Snap, you can now create LLM-based agents.
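
Conceptually, an LLM-based agent is a loop: call the model, execute any tool it requests, feed the result back, and repeat until it produces a final answer. The sketch below shows that loop in plain Python; call_llm and run_tool are hypothetical stand-ins for the model and tool integrations that a pipeline would provide.

```python
# Conceptual agent loop: the model either answers or asks for a tool;
# tool results are fed back until it produces a final answer.

def call_llm(messages):
    # Hypothetical stand-in: send messages to an LLM and return its reply,
    # e.g. {"content": "...", "tool_call": {"name": ..., "args": {...}}}
    # or {"content": "final answer", "tool_call": None}.
    return {"content": "done", "tool_call": None}

def run_tool(name, args):
    # Hypothetical stand-in: dispatch to the real tool implementation.
    return f"result of {name}({args})"

def run_agent(user_message, max_turns=5):
    messages = [{"role": "user", "content": user_message}]
    for _ in range(max_turns):           # bounded, like a loop iteration limit
        reply = call_llm(messages)
        if reply["tool_call"] is None:    # model produced a final answer
            return reply["content"]
        tool_call = reply["tool_call"]
        result = run_tool(tool_call["name"], tool_call["args"])
        messages.append({"role": "tool", "content": result})
    return "stopped after max_turns"

print(run_agent("Where is order 42?"))
```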

Are various levels of chunking (character, phrase, or document) supported?

Yes. The current implementation provides a paragraph-based chunker that can enforce a character or token limit, and any other chunking strategy is possible on the platform through pipeline design.
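
As a plain-Python illustration of the idea (not the Snap's implementation), a paragraph-based chunker with a character limit packs whole paragraphs into chunks until the limit would be exceeded:

```python
def chunk_paragraphs(text, max_chars=1000):
    """Pack whole paragraphs into chunks no longer than max_chars."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    chunks, current = [], ""
    for para in paragraphs:
        # Start a new chunk if adding this paragraph would exceed the limit.
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current)
            current = para
        else:
            current = f"{current}\n\n{para}" if current else para
    if current:
        chunks.append(current)
    return chunks

doc = "First paragraph.\n\nSecond paragraph.\n\nThird paragraph."
print(chunk_paragraphs(doc, max_chars=40))
```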

Is there a choice among the embeddings provided by the LLM?

Yes. We support all OpenAI embedding models, and you can also use the Titan embedding models through the Amazon Bedrock Snap Pack.
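
For reference, generating embeddings directly with the OpenAI Python SDK looks like the sketch below; the model name is an assumption, so substitute whichever embedding model you use.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Model name is an assumption; choose the embedding model your account offers.
response = client.embeddings.create(
    model="text-embedding-3-small",
    input=["What is AgentCreator?", "SnapLogic GenAI FAQ"],
)

vectors = [item.embedding for item in response.data]
print(len(vectors), len(vectors[0]))  # number of vectors and their dimensionality
```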

Since the search in the vector database can use different algorithms (such as cosine similarity or kNN), can these be chosen or are they implemented by default?

Yes, although this depends on your vector database. Some databases let you change the search type at the database level, while others do not.
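
To make the terms concrete, here is a small NumPy sketch of cosine similarity and a brute-force k-nearest-neighbour lookup; real vector databases use optimized indexes, so this is only an illustration of the math.

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def knn(query, vectors, k=3):
    """Return the indices of the k vectors most similar to the query."""
    scores = [cosine_similarity(query, v) for v in vectors]
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]

rng = np.random.default_rng(0)
corpus = rng.normal(size=(10, 4))   # 10 toy embeddings of dimension 4
query = rng.normal(size=4)
print(knn(query, corpus, k=3))
```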

When uploading source files (such as a PDF), is it possible to retrieve only the modified files and update the database?

Yes, but this depends on how you implement your RAG approach. You can build a RAG-style pipeline to support this functionality, provided you store the data in a way that allows it to be updated or upserted in your vector database.
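
One common pattern is to track a content hash per file and re-embed only the files whose hash has changed, upserting by a stable key such as the file path. The sketch below illustrates the idea; embed and upsert are hypothetical stand-ins for your embedding step and your vector database's upsert operation.

```python
import hashlib
from pathlib import Path

def file_hash(path):
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def sync_changed_files(paths, index, embed, upsert):
    """Re-embed and upsert only files whose content hash has changed.

    `index` maps path -> last-seen hash; `embed` and `upsert` are
    hypothetical stand-ins for your embedding step and your vector
    database's upsert (keyed by file path so existing entries are replaced).
    """
    for path in paths:
        digest = file_hash(path)
        if index.get(path) == digest:
            continue                       # unchanged, skip
        vector = embed(Path(path).read_text())
        upsert(key=path, vector=vector)    # replaces any prior entry for this file
        index[path] = digest
    return index
```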