    Connect Amazon Bedrock agents to cross-account knowledge bases

    By Oliver Chambers | November 9, 2025 | 12 Mins Read


    Organizations want seamless access to their structured data repositories to power intelligent AI agents. However, when these sources span multiple AWS accounts, integration challenges can arise. This post explores a practical solution for connecting Amazon Bedrock agents to knowledge bases backed by Amazon Redshift clusters residing in different AWS accounts.

    The challenge

    Organizations that build AI agents using Amazon Bedrock often maintain their structured data in Amazon Redshift clusters. When these data repositories exist in separate AWS accounts from the AI agents, teams face a significant limitation: Amazon Bedrock Knowledge Bases doesn't natively support cross-account Redshift integration.

    This creates a challenge for enterprises with multi-account architectures that want to:

    • Use existing structured data in Redshift for their AI agents.
    • Maintain separation of concerns across different AWS accounts.
    • Avoid duplicating data across accounts.
    • Ensure proper security and access controls.

    Solution overview

    Our solution enables cross-account knowledge base integration through a serverless architecture that maintains secure access controls while allowing AI agents to query structured data. The approach uses AWS Lambda as an intermediary to facilitate secure cross-account data access.

    The action flow, as shown in the architecture diagram above:

    1. Users enter their natural language question in the Amazon Bedrock agent, which is configured in the agent account.
    2. The Amazon Bedrock agent invokes a Lambda function through an action group, which provides access to the Amazon Bedrock knowledge base configured in the agent-kb account.
    3. The action group Lambda function running in the agent account assumes an IAM role created in the agent-kb account to connect to the knowledge base in the agent-kb account.
    4. The Amazon Bedrock knowledge base in the agent-kb account uses an IAM role created in the same account to access the Amazon Redshift data warehouse and query its data.

    The solution comprises these key components:

    1. An Amazon Bedrock agent in the agent account that handles user interactions.
    2. An Amazon Redshift Serverless workgroup in a VPC private subnet in the agent-kb account containing the structured data.
    3. An Amazon Bedrock knowledge base using the Amazon Redshift Serverless workgroup as its structured data source.
    4. A Lambda function in the agent account.
    5. An action group configuration that connects the agent in the agent account to the Lambda function.
    6. IAM roles and policies that enable secure cross-account access.
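    To make the cross-account hop concrete, the following is a minimal sketch (not the repository's actual code) of what the action group Lambda function in the agent account might do: assume the bedrock_kb_access_role in the agent-kb account via AWS STS, then query the knowledge base with the temporary credentials. The helper names `query_knowledge_base` and `build_action_group_response` are illustrative assumptions.

```python
import json

def build_action_group_response(event, body_text):
    """Wrap text in the response envelope that Amazon Bedrock
    expects back from an action group Lambda function."""
    return {
        "messageVersion": "1.0",
        "response": {
            "actionGroup": event["actionGroup"],
            "apiPath": event["apiPath"],
            "httpMethod": event["httpMethod"],
            "httpStatusCode": 200,
            "responseBody": {
                "application/json": {"body": json.dumps({"answer": body_text})}
            },
        },
    }

def query_knowledge_base(question, kb_role_arn, kb_id, model_arn, region="us-west-2"):
    """Assume the cross-account role, then run retrieve_and_generate
    against the knowledge base in the agent-kb account."""
    import boto3  # imported lazily so the module loads without the SDK

    creds = boto3.client("sts").assume_role(
        RoleArn=kb_role_arn, RoleSessionName="bedrock-kb-query"
    )["Credentials"]
    runtime = boto3.client(
        "bedrock-agent-runtime",
        region_name=region,
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
    resp = runtime.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": model_arn,
            },
        },
    )
    return resp["output"]["text"]
```

    The response envelope is the part the agent depends on; if it is malformed, the agent reports an action group failure regardless of whether the knowledge base query succeeded.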

    Prerequisites

    This solution requires the following:

    1. Two AWS accounts. Create an AWS account if you don't have one. The specific permissions required for each account will be set up in subsequent steps.
    2. Install the AWS CLI (2.24.22 at the time of writing).
    3. Set up authentication using IAM user credentials for the AWS CLI for each account.
    4. Make sure you have jq installed; jq is a lightweight command-line JSON processor. For example, on a Mac you can use the command brew install jq (jq-1.7.1-apple at the time of writing) to install it.
    5. Navigate to the Amazon Bedrock console and make sure you enable access to the meta.llama3-1-70b-instruct-v1:0 model in the agent-kb account and to the us.amazon.nova-pro-v1:0 model in the agent account, in the us-west-2 (US West (Oregon)) AWS Region.

    Assumptions

    Let's call the AWS account profile that has the Amazon Bedrock agent the agent profile. Similarly, the AWS account profile that has the Amazon Bedrock knowledge base, Amazon Redshift Serverless, and the structured data source will be called agent-kb. We will use the us-west-2 (US West (Oregon)) AWS Region, but feel free to choose another AWS Region as necessary (the prerequisites apply to whichever AWS Region you deploy this solution in). We will use the meta.llama3-1-70b-instruct-v1:0 model for the agent-kb account; it is available on demand in us-west-2. You are free to choose other models with cross-Region inference, but that will mean changing the roles and policies accordingly and enabling model access in all Regions where they are available. Based on our model choice for this solution, the AWS Region must be us-west-2. For the agent, we will use an Amazon Bedrock agent-optimized model such as us.amazon.nova-pro-v1:0.

    Implementation walkthrough

    The following is a step-by-step implementation guide. Make sure to perform all steps in the same AWS Region in both accounts.

    These steps deploy and test an end-to-end solution from scratch; if you are already running some of these components, you can skip the corresponding steps.

      1. Make a note of the AWS account numbers for the agent and agent-kb accounts. In the implementation steps we'll refer to them as follows:

        Profile     AWS account     Description
        agent       111122223333    Account for the Bedrock agent
        agent-kb    999999999999    Account for the Bedrock knowledge base

        Note: These steps use example profile names and account numbers; replace them with actual values before running.

      2. Create the Amazon Redshift Serverless workgroup in the agent-kb account:
        1. Sign in to the agent-kb account.
        2. Follow the workshop link to create the Amazon Redshift Serverless workgroup in a private subnet.
        3. Make a note of the namespace, workgroup, and other details, and follow the rest of the hands-on workshop instructions.
      3. Set up your data warehouse in the agent-kb account.
      4. Create your AI knowledge base in the agent-kb account. Make a note of the knowledge base ID.
      5. Train your AI assistant in the agent-kb account.
      6. Test natural language queries in the agent-kb account. You can find the code in the aws-samples Git repository: sample-for-amazon-bedrock-agent-connect-cross-account-kb.
      7. Create the necessary roles and policies in both accounts. Run the script create_bedrock_agent_kb_roles_policies.sh with the following input parameters:

        Input parameter      Value                           Description
        --agent-kb-profile   agent-kb                        The agent knowledge base profile that you set up with the AWS CLI with aws_access_key_id and aws_secret_access_key, as mentioned in the prerequisites.
        --lambda-role        lambda_bedrock_kb_query_role    The IAM role that the agent account's Bedrock agent action group Lambda assumes to connect to Redshift cross-account.
        --kb-access-role     bedrock_kb_access_role          The IAM role in the agent-kb account that the lambda_bedrock_kb_query_role in the agent account assumes to connect to Redshift cross-account.
        --kb-access-policy   bedrock_kb_access_policy        The IAM policy attached to the IAM role bedrock_kb_access_role.
        --lambda-policy      lambda_bedrock_kb_query_policy  The IAM policy attached to the IAM role lambda_bedrock_kb_query_role.
        --knowledge-base-id  XXXXXXXXXX                      Replace with the actual knowledge base ID created in Step 4.
        --agent-account      111122223333                    Replace with the 12-digit AWS account number where the Bedrock agent is running (agent account).
        --agent-kb-account   999999999999                    Replace with the 12-digit AWS account number where the Bedrock knowledge base is running (agent-kb account).
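        The critical piece this script creates is the trust relationship on bedrock_kb_access_role in the agent-kb account, which allows the Lambda execution role in the agent account to assume it. The following is a sketch of what that trust policy plausibly looks like (the script's actual output may differ); the account number and role name match the example values in the table above.

```python
import json

AGENT_ACCOUNT = "111122223333"          # agent account (example value)
LAMBDA_ROLE = "lambda_bedrock_kb_query_role"

# Trust policy for bedrock_kb_access_role in the agent-kb account:
# only the action group Lambda's role in the agent account may assume it.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": f"arn:aws:iam::{AGENT_ACCOUNT}:role/{LAMBDA_ROLE}"
            },
            "Action": "sts:AssumeRole",
        }
    ],
}

print(json.dumps(trust_policy, indent=2))
```

        Scoping the principal to the single Lambda role, rather than the whole agent account root, keeps the cross-account boundary as narrow as possible.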
      8. Download the script (create_bedrock_agent_kb_roles_policies.sh) from the aws-samples GitHub repository.
      9. Open Terminal on a Mac, or a comparable bash shell on other platforms.
      10. Change the directory to the download location and grant executable permissions:
        cd /my/location
        chmod +x create_bedrock_agent_kb_roles_policies.sh

      11. If you are still unclear on the script usage or inputs, you can run the script with the --help option and it will display the usage:
        ./create_bedrock_agent_kb_roles_policies.sh --help
      12. Run the script with the correct input parameters as described in the previous table:
        ./create_bedrock_agent_kb_roles_policies.sh --agent-profile agent \
          --agent-kb-profile agent-kb \
          --lambda-role lambda_bedrock_kb_query_role \
          --kb-access-role bedrock_kb_access_role \
          --kb-access-policy bedrock_kb_access_policy \
          --lambda-policy lambda_bedrock_kb_query_policy \
          --knowledge-base-id XXXXXXXXXX \
          --agent-account 111122223333 \
          --agent-kb-account 999999999999

      13. On successful execution, the script shows a summary of the IAM roles and policies created in both accounts.
      14. Sign in to both the agent and agent-kb accounts to verify that the IAM roles and policies were created.
            • For the agent account: make a note of the ARN of lambda_bedrock_kb_query_role, as it will be the value of the CloudFormation stack parameter AgentLambdaExecutionRoleArn in the next step.
            • For the agent-kb account: make a note of the ARN of bedrock_kb_access_role, as it will be the value of the CloudFormation stack parameter TargetRoleArn in the next step.
      15. Run the AWS CloudFormation script to create a Bedrock agent:
              1. Download the CloudFormation template cloudformation_bedrock_agent_kb_query_cross_account.yaml from the aws-samples GitHub repository.
              2. Sign in to the agent account, navigate to the CloudFormation console, and verify you are in the us-west-2 (Oregon) Region. Choose Create stack, then With new resources (standard).
              3. In the Specify template section, choose Upload a template file, then Choose file, and select the file from (1). Then choose Next.
              4. Enter the following stack details and choose Next:

                Parameter                    Value                                                        Description
                Stack name                   bedrock-agent-connect-kb-cross-account-agent                 You can choose any name
                AgentFoundationModelId       us.amazon.nova-pro-v1:0                                      Don't change
                AgentLambdaExecutionRoleArn  arn:aws:iam::111122223333:role/lambda_bedrock_kb_query_role  Replace with your agent account number
                BedrockAgentDescription      Agent to query inventory data from Redshift Serverless database  Keep this as default
                BedrockAgentInstructions     You are an assistant that helps users query inventory data from our Redshift Serverless database using the action group.  Don't change
                BedrockAgentName             bedrock_kb_query_cross_account                               Keep this as default
                KBFoundationModelId          meta.llama3-1-70b-instruct-v1:0                              Don't change
                KnowledgeBaseId              XXXXXXXXXX                                                   Knowledge base ID from Step 4
                TargetRoleArn                arn:aws:iam::999999999999:role/bedrock_kb_access_role        Replace with your agent-kb account number

              5. Complete the acknowledgement and choose Next.
              6. Scroll down through the page and choose Submit.
              7. You will see the CloudFormation stack being created, as shown by the status CREATE_IN_PROGRESS.
              8. It will take a few minutes, and you will see the status change to CREATE_COMPLETE, indicating creation of all resources. Choose the Outputs tab and make a note of the resources that were created.

                In summary, the CloudFormation script does the following in the agent account:
                    • Creates a Bedrock agent
                    • Creates an action group
                    • Creates a Lambda function, which is invoked by the Bedrock action group
                    • Defines the OpenAPI schema
                    • Creates the necessary roles and permissions for the Bedrock agent
                    • Finally, prepares the Bedrock agent so that it is ready to test
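                The OpenAPI schema mentioned above tells the agent which operation the action group exposes. The following is a hypothetical minimal schema for a single ask-style endpoint; the path, operationId, and field names are illustrative, not necessarily the template's actual schema.

```python
import json

# Hypothetical minimal OpenAPI schema for the action group; the actual
# schema in the CloudFormation template may differ.
openapi_schema = {
    "openapi": "3.0.0",
    "info": {"title": "Knowledge base query API", "version": "1.0.0"},
    "paths": {
        "/ask": {
            "post": {
                "operationId": "askKnowledgeBase",
                "description": "Answer a natural language question "
                               "using the cross-account knowledge base",
                "requestBody": {
                    "required": True,
                    "content": {
                        "application/json": {
                            "schema": {
                                "type": "object",
                                "properties": {"question": {"type": "string"}},
                                "required": ["question"],
                            }
                        }
                    },
                },
                "responses": {
                    "200": {
                        "description": "The generated answer",
                        "content": {
                            "application/json": {
                                "schema": {
                                    "type": "object",
                                    "properties": {"answer": {"type": "string"}},
                                }
                            }
                        },
                    }
                },
            }
        }
    },
}

print(json.dumps(openapi_schema, indent=2))
```

                The agent uses the operation's description and parameter schema to decide when to call the action group and how to populate the request, so clear descriptions here directly improve routing.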
      16. Check for model access in Oregon (us-west-2):
              1. Verify Nova Pro (us.amazon.nova-pro-v1:0) model access in the agent account. Navigate to the Amazon Bedrock console and choose Model access under Configure and learn. Search for Model name: Nova Pro to verify access. If access isn't enabled, enable it.
              2. Verify access to the meta.llama3-1-70b-instruct-v1:0 model in the agent-kb account. This should already be enabled because we set up the knowledge base earlier.
      17. Run the agent. Sign in to the agent account. Navigate to the Amazon Bedrock console and choose Agents under Build.
      18. Choose the name of the agent and choose Test. You can try the questions mentioned on the workshop's Stage 4: Test Natural Language Queries page. For example:
              1. Who are the top 5 customers in Saudi Arabia?
              2. Who are the top parts suppliers in the United States by volume?
              3. What is the total revenue by region for the year 1998?
              4. Which products have the highest profit margins?
              5. Show me orders with the highest priority from the last quarter of 1997.

      19. Choose Show trace to analyze the agent traces.
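    The console test in the steps above can also be scripted against the agent with the bedrock-agent-runtime API. This sketch assumes the agent ID and alias ID from the CloudFormation stack outputs; the `collect_completion` helper just stitches the streamed response chunks together.

```python
import uuid

def collect_completion(event_stream):
    """Concatenate the text chunks from an invoke_agent event stream,
    ignoring non-chunk events such as traces."""
    parts = []
    for event in event_stream:
        chunk = event.get("chunk")
        if chunk:
            parts.append(chunk["bytes"].decode("utf-8"))
    return "".join(parts)

def ask_agent(agent_id, agent_alias_id, question, region="us-west-2"):
    """Send one question to the Bedrock agent and return the answer text.
    agent_id / agent_alias_id come from the CloudFormation stack outputs."""
    import boto3  # lazy import so the helper above stays testable offline

    client = boto3.client("bedrock-agent-runtime", region_name=region)
    response = client.invoke_agent(
        agentId=agent_id,
        agentAliasId=agent_alias_id,
        sessionId=str(uuid.uuid4()),
        inputText=question,
    )
    return collect_completion(response["completion"])
```

    Reusing the same sessionId across calls would give the agent conversational memory; a fresh UUID per call, as here, makes each question independent.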

    Some recommended best practices:

        • Phrase your question to be more specific.
        • Use terminology that matches your table descriptions.
        • Try questions similar to your curated examples.
        • Verify that your question relates to data that exists in the TPC-H dataset.
        • Use Amazon Bedrock Guardrails to add configurable safeguards to questions and responses.

    Clean up resources

    We recommend that you clean up any resources you no longer need to avoid unnecessary charges:

        1. Navigate to the CloudFormation console for the agent and agent-kb accounts, search for the stack, and choose Delete.
        2. S3 buckets must be deleted separately.
        3. To delete the roles and policies created in both accounts, download the script delete-bedrock-agent-kb-roles-policies.sh from the aws-samples GitHub repository.
          1. Open Terminal on a Mac, or a comparable bash shell on other platforms.
          2. Change the directory to the download location and grant executable permissions:
          cd /my/location
          chmod +x delete-bedrock-agent-kb-roles-policies.sh

        4. If you are still unclear on the script usage or inputs, you can run the script with the --help option and it will display the usage:
          ./delete-bedrock-agent-kb-roles-policies.sh --help
        5. Run the script delete-bedrock-agent-kb-roles-policies.sh with the same values for the same input parameters as in Step 7, when you ran the create_bedrock_agent_kb_roles_policies.sh script. Note: enter the correct account numbers for agent-account and agent-kb-account before running.
          ./delete-bedrock-agent-kb-roles-policies.sh --agent-profile agent \
            --agent-kb-profile agent-kb \
            --lambda-role lambda_bedrock_kb_query_role \
            --kb-access-role bedrock_kb_access_role \
            --kb-access-policy bedrock_kb_access_policy \
            --lambda-policy lambda_bedrock_kb_query_policy \
            --agent-account 111122223333 \
            --agent-kb-account 999999999999

          The script will ask for confirmation; type yes and press Enter.

    Summary

    This solution demonstrates how the Amazon Bedrock agent in the agent account can query the Amazon Bedrock knowledge base in the agent-kb account.

    Conclusion

    This solution uses Amazon Bedrock Knowledge Bases for structured data to create a more integrated approach to cross-account data access. The knowledge base in the agent-kb account connects directly to Amazon Redshift Serverless in a private VPC. The Amazon Bedrock agent in the agent account invokes an AWS Lambda function as part of its action group to make a cross-account connection and retrieve responses from the structured knowledge base.

    This architecture offers several advantages:

        • Uses Amazon Bedrock Knowledge Bases capabilities for structured data
        • Provides a more seamless integration between the agent and the data source
        • Maintains proper security boundaries between accounts
        • Reduces the complexity of direct database access code

    As Amazon Bedrock continues to evolve, you can take advantage of future enhancements to knowledge base functionality while maintaining your multi-account architecture.


    About the Authors

    Kunal Ghosh is an expert in AWS technologies. He is passionate about building efficient and effective solutions on AWS, especially involving generative AI, analytics, data science, and machine learning. Besides family time, he likes reading, swimming, biking, and watching movies, and he is a foodie.

    Arghya Banerjee is a Sr. Solutions Architect at AWS in the San Francisco Bay Area, focused on helping customers adopt and use the AWS Cloud. He concentrates on big data, data lakes, streaming and batch analytics services, and generative AI technologies.

    Indranil Banerjee is a Sr. Solutions Architect at AWS in the San Francisco Bay Area, focused on helping customers in the hi-tech and semiconductor sectors solve complex business problems using the AWS Cloud. His special interests are legacy modernization and migration, building analytics platforms, and helping customers adopt cutting-edge technologies such as generative AI.

    Vinayak Datar is a Sr. Solutions Manager based in the Bay Area, helping enterprise customers accelerate their AWS Cloud journey. He specializes in helping customers take ideas from concepts to working prototypes to production using AWS generative AI services.
