AI agents are rapidly transforming enterprise operations. Although a single agent can perform specific tasks effectively, complex business processes often span multiple systems, requiring data retrieval, analysis, decision-making, and action execution across each of them. With multi-agent collaboration, specialized AI agents can work together to automate intricate workflows.
This post explores a practical collaboration, integrating Salesforce Agentforce with Amazon Bedrock Agents and Amazon Redshift, to automate enterprise workflows.
Multi-agent collaboration in enterprise AI
Enterprise environments today are complex, featuring diverse technologies across multiple systems. Salesforce and AWS each provide distinct advantages to customers. Many organizations already maintain significant infrastructure on AWS, including data, AI, and various enterprise applications such as ERP, finance, supply chain, HRMS, and workforce management systems. Agentforce delivers powerful AI-driven agent capabilities that are grounded in business context and data. While Salesforce provides a rich source of trusted business data, customers increasingly need agents that can access and act on information across multiple systems. By integrating AWS-powered AI services into Agentforce, organizations can orchestrate intelligent agents that operate across Salesforce and AWS, unlocking the strengths of both.
Agentforce and Amazon Bedrock Agents can work together in flexible ways, using the unique strengths of both platforms to deliver smarter, more comprehensive AI workflows. Example collaboration models include:
- Agentforce as the primary orchestrator:
- Manages end-to-end customer-oriented workflows
- Delegates specialized tasks to Amazon Bedrock Agents as needed through custom actions
- Coordinates access to external data and services across systems
This integration creates a more powerful solution that maximizes the benefits of both Salesforce and AWS, so you can achieve better business outcomes through enhanced AI capabilities and cross-system functionality.
Agentforce overview
Agentforce brings digital labor to every employee, department, and business process, augmenting teams and elevating customer experiences. It works seamlessly with your existing applications, data, and business logic to take meaningful action across the enterprise. And because it's built on the trusted Salesforce platform, your data stays secure, governed, and in your control. With Agentforce, you can:
- Deploy prebuilt agents designed for specific roles, industries, or use cases
- Enable agents to take action with existing workflows, code, and APIs
- Connect your agents to business data securely
- Deliver accurate and grounded results through the Atlas Reasoning Engine
Amazon Bedrock Agents and Amazon Bedrock Knowledge Bases overview
Amazon Bedrock is a fully managed AWS service offering access to high-performing foundation models (FMs) from various AI companies through a single API. In this post, we discuss the following features:
- Amazon Bedrock Agents – Managed AI agents use FMs to understand user requests, break down complex tasks into steps, maintain conversation context, and orchestrate actions. They can interact with company systems and data sources through APIs (configured through action groups) and access information through knowledge bases. You provide instructions in natural language, select an FM, and configure data sources and tools (APIs), and Amazon Bedrock handles the orchestration.
- Amazon Bedrock Knowledge Bases – This capability enables agents to perform Retrieval Augmented Generation (RAG) using your company's private data sources. You connect the knowledge base to your data hosted in AWS, such as in Amazon Simple Storage Service (Amazon S3) or Amazon Redshift, and it automatically handles the vectorization and retrieval process. When asked a question or given a task, the agent can query the knowledge base to find relevant information, providing more accurate, context-aware responses and decisions without needing to retrain the underlying FM.
Agentforce and Amazon Bedrock Agent integration patterns
Agentforce can call Amazon Bedrock agents in different ways, allowing flexibility to build different architectures. The following diagram illustrates synchronous and asynchronous patterns.
For a synchronous or request-reply interaction, Agentforce uses custom agent actions facilitated by External Services, Apex invocable methods, or Flow to call an Amazon Bedrock agent. Authentication to AWS is handled using named credentials. Named credentials are designed to securely manage authentication details for external services integrated with Salesforce. They alleviate the need to hardcode sensitive information like user names and passwords, minimizing the risk of exposure and potential data breaches. This separation of credentials from the application code can significantly improve your security posture. Named credentials also streamline integration by providing a centralized and consistent method for handling authentication, reducing complexity and potential errors. You can use Salesforce Private Connect to provide a secure private connection with AWS using AWS PrivateLink. Refer to Private Integration Between Salesforce and Amazon API Gateway for more details.
For asynchronous calls, Agentforce uses Salesforce Event Relay and Flow with Amazon EventBridge to call an Amazon Bedrock agent.
In this post, we discuss the synchronous call pattern. We encourage you to explore Salesforce Event Relay with EventBridge to build event-driven agentic AI workflows. Agentforce also offers the Agent API, which makes it straightforward to call an Agentforce agent from an Amazon Bedrock agent, using EventBridge API destinations, for bi-directional agentic AI workflows.
Solution overview
To illustrate the multi-agent collaboration between Agentforce and AWS, we use the following architecture, which gives the Agentforce agent access to Internet of Things (IoT) sensor data and handles potentially inaccurate sensor readings using a multi-agent approach.
The example workflow consists of the following steps:
- Coral Cloud has equipped their rooms with smart air conditioners and temperature sensors. These IoT devices capture critical information such as room temperature and error codes and store it in Coral Cloud's AWS database in Amazon Redshift.
- The Agentforce agent calls an Amazon Bedrock agent through the Agent Wrapper API with questions such as "What's the temperature in room 123?" to answer customer questions related to the comfort of the room. This API is implemented as an AWS Lambda function, acting as the entry point in the AWS Cloud.
- The Amazon Bedrock agent, upon receiving the request, needs context. It queries its configured knowledge base by generating the necessary SQL query.
- The knowledge base is connected to a Redshift database containing historical sensor data and contextual information (such as the sensor's thresholds and maintenance history). It retrieves relevant information based on the agent's query and responds with an answer.
- With the initial data and the context from the knowledge base, the Amazon Bedrock agent uses its underlying FM and natural language instructions to decide the appropriate action. In this scenario, receiving inaccurate readings from a sensor prompts it to create a case.
- The action group contains the Agentforce Agent Wrapper Lambda function. The Amazon Bedrock agent securely passes the necessary details (such as which sensor or room needs a case) to this function.
- The Agentforce Agent Wrapper Lambda function acts as an adapter. It translates the request from the Amazon Bedrock agent into the specific format required by the Agentforce service's API or interface.
- The Lambda function calls Agentforce, instructing it to create a case associated with the contact or account linked to the sensor that sent the inaccurate reading.
- Agentforce uses its internal logic (agent, topics, and actions) to create or escalate the case within Salesforce.
This workflow demonstrates how Amazon Bedrock Agents orchestrate tasks, using Amazon Bedrock Knowledge Bases for context and action groups (through Lambda) to interact with Agentforce to complete the end-to-end process.
Prerequisites
Before building this architecture, make sure you have the following:
- AWS account – An active AWS account with permissions to use Amazon Bedrock, Lambda, Amazon Redshift, AWS Identity and Access Management (IAM), and API Gateway.
- Amazon Bedrock access – Access to Amazon Bedrock Agents and to Anthropic's Claude 3.5 Haiku v1 enabled in your chosen AWS Region.
- Redshift resources – An operational Redshift cluster or Amazon Redshift Serverless endpoint. The relevant tables containing sensor data (historical readings, sensor thresholds, and maintenance history) must be created and populated.
- Agentforce setup – Access to and understanding of Agentforce, including how to configure it. You can sign up for a developer edition with Agentforce and Data Cloud.
- Lambda knowledge – Familiarity with creating, deploying, and managing Lambda functions (using Python).
- IAM roles and policies – Understanding of how to create IAM roles with the necessary permissions for Amazon Bedrock Agents, Lambda functions (to call Amazon Bedrock, Amazon Redshift, and the Agentforce API), and Amazon Bedrock Knowledge Bases.
Prepare Amazon Redshift data
Make sure your data is structured and available in your Redshift instance. Note the database name, credentials, and table and column names.
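The exact schema depends on your devices; the following is a minimal illustrative sketch (the table and column names are assumptions, not requirements) matching the sensor scenario in this post:

```sql
-- Illustrative tables for the IoT sensor scenario (names are assumptions)
CREATE TABLE sensor_readings (
    device_id     VARCHAR(32),
    room_number   VARCHAR(16),
    temperature_f DECIMAL(5,2),
    error_code    VARCHAR(16),   -- NULL for a healthy reading
    reading_ts    TIMESTAMP
);

CREATE TABLE sensor_thresholds (
    device_id  VARCHAR(32),
    min_temp_f DECIMAL(5,2),
    max_temp_f DECIMAL(5,2)
);

INSERT INTO sensor_readings VALUES
    ('CITDEV003', '123', 71.50, NULL, '2025-01-15 09:00:00');
```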
Create IAM roles
For this post, we create two IAM roles:
- custom_AmazonBedrockExecutionRoleForAgents:
- Attach the following AWS managed policies to the role: AmazonBedrockFullAccess and AmazonRedshiftDataFullAccess.
- In the trust relationship, provide the following trust policy (provide your AWS account ID):
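A trust policy for this role allows the Amazon Bedrock service to assume it. The following is an illustrative example (the aws:SourceAccount condition restricts the trust to your account; replace the placeholder with your AWS account ID):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "bedrock.amazonaws.com" },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "aws:SourceAccount": "<your-aws-account-id>" }
      }
    }
  ]
}
```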
- custom_AWSLambdaExecutionRole:
- Attach the following AWS managed policies to the role: AmazonBedrockFullAccess and AWSLambdaBasicExecutionRole.
- In the trust relationship, provide the following trust policy (provide your AWS account ID):
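For the Lambda execution role, the trust policy allows the Lambda service to assume the role. An illustrative example:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```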
Create an Amazon Bedrock knowledge base
Complete the following steps to create an Amazon Bedrock knowledge base:
- On the Amazon Bedrock console, choose Knowledge Bases in the navigation pane.
- Choose Create, then Knowledge Base with structured data store.
- On the Provide Knowledge Base details page, provide the following information:
- Enter a name and optional description.
- For Query engine, select Amazon Redshift.
- For IAM permissions, select Use an existing service role and choose custom_AmazonBedrockExecutionRoleForAgents.
- Choose Next.
- For Query engine connection details, select Redshift provisioned and choose your cluster.
- For Authentication, select IAM Role.
- For Storage configuration, select Amazon Redshift database, and choose your Redshift database from the list.
- On the Configure query engine page, provide the following information:
- Provide table and column descriptions. The following is an example.
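For instance, descriptions along these lines (assuming the illustrative sensor tables named earlier) help the query engine generate accurate SQL:

```
Table sensor_readings: Historical temperature readings reported by each room sensor.
- device_id: Unique identifier of the IoT sensor device (for example, CITDEV003).
- room_number: Room the sensor is installed in.
- temperature_f: Temperature reading in degrees Fahrenheit.
- error_code: Error code reported by the device; empty for healthy readings.
```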
- Choose Create Knowledge Base.
- After you create the knowledge base, open the Redshift query editor and grant permissions for the role to access the Redshift tables by running the following queries:
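Assuming IAM-based authentication and the illustrative table names used earlier, the queries take the following shape (Redshift maps an IAM role to a database user with the IAMR: prefix):

```sql
-- Create a database user mapped to the knowledge base service role
CREATE USER "IAMR:custom_AmazonBedrockExecutionRoleForAgents" WITH PASSWORD DISABLE;

-- Allow that user to read the sensor tables
GRANT SELECT ON TABLE sensor_readings TO "IAMR:custom_AmazonBedrockExecutionRoleForAgents";
GRANT SELECT ON TABLE sensor_thresholds TO "IAMR:custom_AmazonBedrockExecutionRoleForAgents";
```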
For more information, refer to Set up your query engine and permissions for creating a knowledge base with structured data store.
- Choose Sync to sync the query engine.
Make sure the status shows as Complete before moving to the next steps.
- When the sync is complete, choose Test Knowledge Base.
- Select Retrieval and response generation: data sources and model, and choose Claude 3.5 Haiku as the model.
- Enter a question about your data and make sure you get a valid answer.
Create an Amazon Bedrock agent
Complete the following steps to create an Amazon Bedrock agent:
- On the Amazon Bedrock console, choose Agents in the navigation pane.
- Choose Create agent.
- On the Agent details page, provide the following information:
- Enter a name and optional description.
- For Agent resource role, select Use an existing service role and choose custom_AmazonBedrockExecutionRoleForAgents.
- Provide detailed instructions for your agent. The following is an example:
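The instructions below are one illustrative possibility for this scenario (adapt them to your own data and actions):

```
You are an assistant for Coral Cloud Resorts. You answer questions about room
comfort using IoT sensor data, including temperature readings, error codes,
sensor thresholds, and maintenance history. Query the knowledge base to find
the relevant readings. When a sensor reading falls outside its configured
thresholds or reports an error code, call the action to create a case in
Agentforce for the affected room. Answer concisely and only about device data.
```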
- Choose Save to save the agent.
- Add the knowledge base you created earlier to this agent.
- Provide detailed instructions about the knowledge base for the agent.
- Choose Save, then choose Prepare the agent.
- Test the agent by asking a question (in the following example, we ask about sensor readings).
- Choose Create alias.
- On the Create alias page, provide the following information:
- Enter an alias name and optional description.
- For Associate version, select Create a new version and associate it to this alias.
- For Select throughput, select On-demand.
- Choose Create alias.
- Note down the agent ID and alias ID, which you'll use in subsequent steps.
Create a Lambda function
Complete the following steps to create a Lambda function to receive requests from Agentforce:
- On the Lambda console, choose Functions in the navigation pane.
- Choose Create function.
- Configure the function with the following logic to receive requests through API Gateway and call the Amazon Bedrock agent:
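A minimal handler might look like the following sketch. The agent ID, alias ID, and request body shape (an inputText field) are assumptions; substitute the values you noted earlier. It invokes the agent through the Bedrock agent runtime and concatenates the streamed completion chunks into the API response.

```python
import json

# Illustrative values - replace with the IDs you noted when creating the agent
AGENT_ID = "YOUR_AGENT_ID"
AGENT_ALIAS_ID = "YOUR_ALIAS_ID"


def extract_input_text(event):
    """Pull the user's question out of the API Gateway proxy event body."""
    body = event.get("body") or "{}"
    if isinstance(body, str):
        body = json.loads(body)
    return body.get("inputText", "")


def lambda_handler(event, context):
    import boto3  # imported lazily so the module loads without the AWS SDK

    client = boto3.client("bedrock-agent-runtime")
    response = client.invoke_agent(
        agentId=AGENT_ID,
        agentAliasId=AGENT_ALIAS_ID,
        sessionId=event.get("requestContext", {}).get("requestId", "default-session"),
        inputText=extract_input_text(event),
    )

    # invoke_agent returns an event stream; concatenate the completion chunks
    completion = ""
    for item in response["completion"]:
        chunk = item.get("chunk")
        if chunk:
            completion += chunk["bytes"].decode("utf-8")

    return {"statusCode": 200, "body": json.dumps({"answer": completion})}
```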
- Define the necessary IAM permissions by assigning custom_AWSLambdaExecutionRole to the function.
Create a REST API
Complete the following steps to create a REST API in API Gateway:
- On the API Gateway console, create a REST API with proxy integration.
- Enable API key required to protect the API from unauthenticated access.
- Configure the usage plan and API key. For more details, see Set up API keys for REST APIs in API Gateway.
- Deploy the API.
- Note down the invoke URL to use in subsequent steps.
Create named credentials in Salesforce
Now that you’ve created an Amazon Bedrock agent with an API Gateway endpoint and Lambda wrapper, let’s full the configuration on the Salesforce aspect. Full the next steps:
- Log in to Salesforce.
- Navigate to Setup, Safety, Named Credentials.
- On the Exterior Credentials tab, select New.
- Present the next info:
- Enter a label and identify.
- For Authentication Protocol, select Customized.
- Select Save.
- Open the Exterior Credentials entry to offer further particulars:
- Below Principals, create a brand new principal and supply the parameter identify and worth.
-
- Below Customized Headers, create a brand new entry and supply a reputation and worth.
- Select Save.
Now you may grant entry to the agent person to entry these credentials.
- Navigate to Setup, Customers, Consumer Profile, Enabled Exterior Credential Principal Entry and add the exterior credential principal you created to the enable listing.
- Select New to create a named credentials entry.
- Present particulars similar to label, identify, the URL of the API Gateway endpoint, and authentication protocol, then select Save.
You can optionally use Salesforce Private Connect with PrivateLink to provide a secure private connection to AWS. This allows critical data to flow from the Salesforce environment to AWS without using the public internet.
Add an external service in Salesforce
Complete the following steps to add an external service in Salesforce:
- In Salesforce, navigate to Setup, Integrations, External Services and choose Add an External Service.
- For Select an API source, choose From API Specification.
- On the Edit an External Service page, provide the following information:
- Enter a name and optional description.
- For Service Schema, choose Upload from local.
- For Select a Named Credential, choose the named credential you created.
- Upload an OpenAPI specification for the API Gateway endpoint. See the following example:
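The following minimal OpenAPI 3.0 specification is illustrative; the path, operationId, and request/response schemas are assumptions that should match your Lambda wrapper, and the server URL should be replaced with your API Gateway invoke URL:

```yaml
openapi: 3.0.0
info:
  title: Call Bedrock Agent
  version: "1.0"
servers:
  - url: https://your-api-id.execute-api.us-east-1.amazonaws.com/prod
paths:
  /invoke:
    post:
      operationId: callBedrockAgent
      summary: Send a question to the Amazon Bedrock agent
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                inputText:
                  type: string
                  description: The question to send to the agent
      responses:
        "200":
          description: Successful response
          content:
            application/json:
              schema:
                type: object
                properties:
                  answer:
                    type: string
```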
- Choose Save and Next.
- Enable the operation to make it available for Agentforce to invoke.
- Choose Finish.
Create an Agentforce agent action to use the external service
Complete the following steps to create an Agentforce agent action:
- In Salesforce, navigate to Setup, Agentforce, Einstein Generative AI, Agentforce Studio, Agentforce Assets.
- On the Actions tab, choose New Agent Action.
- Under Connect to an existing action, provide the following information:
- For Reference Action Type, choose API.
- For Reference Action Category, choose External Services.
- For Reference Action, choose the Call Bedrock Agent action that you configured.
- Enter an agent action label and API name.
- Choose Next.
- Provide the following information to complete the agent action configuration:
- For Agent Action Instructions, enter Call Bedrock Agent to get the information about device readings, sensor readings, maintenance, or threshold information.
- For Loading Text, enter Calling Bedrock Agent.
- Under Input, for Body, enter Provide the input in the input Text field.
- Under Outputs, for 200, enter Successful response.
- Save the agent action.
Configure the Agentforce agent to use the agent action
Complete the following steps to configure the Agentforce agent to use the agent action:
- In Salesforce, navigate to Setup, Agentforce, Einstein Generative AI, Agentforce Studio, Agentforce Agents and open the agent in Agent Builder.
- Create a new topic.
- On the Topic Configuration tab, provide the following information:
- For Name, enter Device Information.
- For Classification Description, enter This topic handles inquiries related to device and sensor information, including readings, maintenance, and thresholds.
- For Scope, enter Your job is only to provide information about device readings, sensor readings, device maintenance, sensor maintenance, and thresholds. Don't attempt to address issues outside of providing device information.
- For Instructions, enter the following:
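The following topic instructions are illustrative for this scenario; adjust them to match your action name and input field:

```
If a user asks about a device or sensor reading, maintenance record, or
threshold, use the Call Bedrock Agent action and pass the user's question in
the input Text field. Return the action's answer to the user. If the action
reports an inaccurate or out-of-range reading, tell the user that a case has
been created for follow-up.
```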
- On the This Topic's Actions tab, choose New and Add from Asset Library.
- Choose the Call Bedrock Agent action.
- Activate the agent and enter a question, such as "What's the latest reading for the sensor with device ID CITDEV003?"
The agent will indicate that it's calling the Amazon Bedrock agent, as shown in the following screenshot.
The agent will fetch the information from the associated knowledge base using the Amazon Bedrock agent.
Clean up
To avoid additional costs, delete the resources that you created when you no longer need them:
- Delete the Amazon Bedrock knowledge base:
- On the Amazon Bedrock console, choose Knowledge Bases in the navigation pane.
- Select the knowledge base you created and choose Delete.
- Delete the Amazon Bedrock agent:
- On the Amazon Bedrock console, choose Agents in the navigation pane.
- Select the agent you created and choose Delete.
- Delete the Lambda function:
- On the Lambda console, choose Functions in the navigation pane.
- Select the function you created and choose Delete.
- Delete the REST API:
- On the API Gateway console, choose APIs in the navigation pane.
- Select the REST API you created and choose Delete.
Conclusion
In this post, we described an architecture that demonstrates the power of combining AI services on AWS with Agentforce. By using Amazon Bedrock Agents and Amazon Bedrock Knowledge Bases for contextual understanding through RAG, and Lambda functions and API Gateway to bridge interactions with Agentforce, businesses can build sophisticated, automated workflows. As AI capabilities continue to grow, such collaborative multi-agent systems will become increasingly central to enterprise automation strategies. In an upcoming post, we'll show you how to build the asynchronous integration pattern from Agentforce to Amazon Bedrock using Salesforce Event Relay.
To get started, see Become an Agentblazer Innovator and refer to How Amazon Bedrock Agents works.
About the authors
Yogesh Dhimate is a Sr. Partner Solutions Architect at AWS, leading the technology partnership with Salesforce. Prior to joining AWS, Yogesh worked with leading companies, including Salesforce, driving their industry solution initiatives. With over 20 years of experience in product management and solutions architecture, Yogesh brings a unique perspective in cloud computing and artificial intelligence.
Kranthi Pullagurla has over 20 years of experience across application integration and cloud migrations on multiple cloud providers. He works with AWS Partners to build solutions on AWS that our joint customers can use. Prior to joining AWS, Kranthi was a strategic advisor at MuleSoft (now Salesforce). Kranthi has experience advising C-level customer executives on their digital transformation journey in the cloud.
Shitij Agarwal is a Partner Solutions Architect at AWS. He creates joint solutions with strategic ISV partners to deliver value to customers. When not at work, he is busy exploring New York City and the hiking trails that surround it, and going on bike rides.
Ross Belmont is a Senior Director of Product Management at Salesforce covering Platform Data Services. He has more than 15 years of experience with the Salesforce ecosystem.
Sharda Rao is a Senior Director of Product Management at Salesforce covering Agentforce go-to-market strategy.
Hunter Reh is an AI Architect at Salesforce and a passionate builder who has developed over 100 agents since the launch of Agentforce. Outside of work, he enjoys exploring new trails on his bike or getting lost in a good book.