Managing and optimizing AWS infrastructure costs is an essential challenge for organizations of all sizes. Traditional cost analysis approaches typically involve the following:
- Complex spreadsheets – Creating and maintaining detailed cost models, which requires significant effort
- Multiple tools – Switching between the AWS Pricing Calculator, AWS Cost Explorer, and third-party tools
- Specialized knowledge – Understanding the nuances of AWS pricing across services and AWS Regions
- Time-consuming analysis – Manually comparing different deployment options and scenarios
- Delayed optimization – Cost insights often arrive too late to inform architectural decisions
Amazon Q Developer CLI with the Model Context Protocol (MCP) offers a revolutionary approach to AWS cost analysis. By using generative AI through natural language prompts, teams can now generate detailed cost estimates, comparisons, and optimization recommendations in minutes rather than hours, while maintaining accuracy through integration with official AWS pricing data.
In this post, we explore how to use Amazon Q CLI with the AWS Cost Analysis MCP server to perform sophisticated cost analysis that follows AWS best practices. We discuss basic setup and advanced techniques, with detailed examples and step-by-step instructions.
Solution overview
Amazon Q Developer CLI is a command line interface that brings the generative AI capabilities of Amazon Q directly to your terminal. Developers can interact with Amazon Q through natural language prompts, making it a valuable tool for various development tasks.
Developed by Anthropic as an open protocol, the Model Context Protocol (MCP) provides a standardized way to connect AI models to different data sources or tools. Using a client-server architecture (as illustrated in the following diagram), MCP helps developers expose their data through lightweight MCP servers while building AI applications as MCP clients that connect to those servers.
The MCP uses a client-server architecture containing the following components:
- Host – A program or AI tool that requires access to data through the MCP protocol, such as Anthropic's Claude Desktop, an integrated development environment (IDE), or another AI application
- Client – Protocol clients that maintain one-to-one connections with servers
- Server – Lightweight programs that expose capabilities through standardized MCP or act as tools
- Data sources – Local data sources such as databases and file systems, or external systems available over the internet through APIs (web APIs) that MCP servers can connect to
As announced in April 2025, MCP enables Amazon Q Developer to connect with specialized servers that extend its capabilities beyond what's possible with the base model alone. MCP servers act as plugins for Amazon Q, providing domain-specific knowledge and functionality. The AWS Cost Analysis MCP server specifically enables Amazon Q to generate detailed cost estimates, reports, and optimization recommendations using real-time AWS pricing data.
Prerequisites
To implement this solution, you must have an AWS account with appropriate permissions and follow the steps below.
Set up your environment
Before you can start analyzing costs, you need to set up your environment with Amazon Q CLI and the AWS Cost Analysis MCP server. This section provides detailed instructions for installation and configuration.
Install Amazon Q Developer CLI
Amazon Q Developer CLI is available as a standalone installation. Complete the following steps to install it:
- Download and install Amazon Q Developer CLI. For instructions, see Using Amazon Q Developer on the command line.
- Verify the installation by running the following command:
q --version
You should see output similar to the following: Amazon Q Developer CLI version 1.x.x
- Configure Amazon Q CLI with your AWS credentials:
q login
- Choose the login method that suits you:
Set up MCP servers
Before using the AWS Cost Analysis MCP server with Amazon Q CLI, you must install several tools and configure your environment. The following steps guide you through installing the necessary tools and setting up the MCP server configuration:
- Install Pandoc using the following command (you can also install it with brew), which converts the output to PDF:
pip install pandoc
- Install uv with the following command:
pip install uv
- Install Python 3.10 or newer:
uv python install 3.10
- Add the server to your ~/.aws/amazonq/mcp.json file:
{
  "mcpServers": {
    "awslabs.cost-analysis-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.cost-analysis-mcp-server"],
      "env": { "FASTMCP_LOG_LEVEL": "ERROR" },
      "autoApprove": [],
      "disabled": false
    }
  }
}
Now, Amazon Q CLI automatically discovers MCP servers in the ~/.aws/amazonq/mcp.json file.
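Before starting a chat session, it can help to confirm that the configuration parses as valid JSON, because a malformed file prevents the server from loading. The following sketch parses the example configuration with Python's standard json module (any JSON validator works just as well):

```python
import json

# The expected shape of ~/.aws/amazonq/mcp.json. Parsing it catches
# typos such as a trailing comma or a missing brace before Amazon Q CLI
# tries to load the file.
config_text = """
{
  "mcpServers": {
    "awslabs.cost-analysis-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.cost-analysis-mcp-server"],
      "env": { "FASTMCP_LOG_LEVEL": "ERROR" },
      "autoApprove": [],
      "disabled": false
    }
  }
}
"""
config = json.loads(config_text)
server = config["mcpServers"]["awslabs.cost-analysis-mcp-server"]
print(server["command"])   # uvx
print(server["disabled"])  # False
```

To check your real file, replace config_text with the contents of ~/.aws/amazonq/mcp.json.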
Understanding MCP server tools
The AWS Cost Analysis MCP server provides several powerful tools:
- get_pricing_from_web – Retrieves pricing information from AWS pricing webpages
- get_pricing_from_api – Fetches pricing data from the AWS Price List API
- generate_cost_report – Creates detailed cost analysis reports with breakdowns and visualizations
- analyze_cdk_project – Analyzes AWS Cloud Development Kit (AWS CDK) projects to identify services used and estimate costs
- analyze_terraform_project – Analyzes Terraform projects to identify services used and estimate costs
- get_bedrock_patterns – Retrieves architecture patterns for Amazon Bedrock with cost considerations
These tools work together to help you create accurate cost estimates that follow AWS best practices.
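Under the hood, an MCP client invokes these tools with JSON-RPC 2.0 messages. The following sketch shows roughly what a tools/call request for generate_cost_report could look like; the argument names are illustrative, not the server's actual input schema:

```python
import json

# Illustrative MCP tools/call request. MCP is built on JSON-RPC 2.0;
# the "arguments" payload below is a made-up example, not the real
# schema accepted by the cost-analysis server.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "generate_cost_report",
        "arguments": {
            "services": ["AmazonEC2", "AmazonRDS"],  # hypothetical parameter
            "output_format": "pdf",                  # hypothetical parameter
        },
    },
}
print(json.dumps(request, indent=2))
```

Amazon Q CLI constructs and sends these messages for you; the sketch is only meant to demystify what "calling a tool" means at the protocol level.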
Test your setup
Let's verify that everything is working correctly by generating a simple cost analysis:
- Start the Amazon Q CLI chat interface and verify that the output shows the MCP server being loaded and initialized:
q chat
- In the chat interface, enter the following prompt:
Please create a cost analysis for a simple web application with an Application Load Balancer, two t3.medium EC2 instances, and an RDS db.t3.medium MySQL database. Assume 730 hours of usage per month and moderate traffic of about 100 GB data transfer. Convert the estimate to PDF format.
- Amazon Q CLI will ask for permission to trust the tool being used; enter t to trust it. Amazon Q should generate and display a detailed cost analysis. Your output should look like the following screenshot.
If you see the cost analysis report, your environment is set up correctly. If you encounter issues, verify that Amazon Q CLI can access the MCP servers by making sure you installed the necessary tools and that the servers are listed in the ~/.aws/amazonq/mcp.json file.
Configuration options
The AWS Cost Analysis MCP server supports several configuration options to customize your cost analysis experience:
- Output format – Choose between markdown, CSV, or PDF (which we installed the package for) formats for cost reports
- Pricing model – Specify On-Demand, Reserved Instances, or Savings Plans
- Assumptions and exclusions – Customize the assumptions and exclusions in your cost analysis
- Detailed cost data – Provide specific usage patterns for more accurate estimates
Now that the environment is set up, let's create more cost analyses.
Create AWS cost analysis reports
In this section, we walk through the process of creating AWS cost analysis reports using Amazon Q CLI with the AWS Cost Analysis MCP server.
When you provide a prompt to Amazon Q CLI, the AWS Cost Analysis MCP server completes the following steps:
- Interpret your requirements.
- Retrieve pricing data from AWS pricing sources.
- Generate a detailed cost analysis report.
- Provide optimization recommendations.
This process happens seamlessly, so you can focus on describing what you want rather than how to create it.
AWS cost analysis reports typically include the following information:
- Service costs – Breakdown of costs by AWS service
- Unit pricing – Detailed unit pricing information
- Usage quantities – Estimated usage quantities for each service
- Calculation details – Step-by-step calculations showing how costs were derived
- Assumptions – Clearly stated assumptions used in the analysis
- Exclusions – Costs that were not included in the analysis
- Recommendations – Cost optimization suggestions
Example 1: Analyze a serverless application
Let's create a cost analysis for a simple serverless application. Use the following prompt:
Create a cost analysis for a serverless application using API Gateway, Lambda, and DynamoDB. Assume 1 million API calls per month, average Lambda execution time of 200ms with 512MB memory, and 10GB of DynamoDB storage with 5 million read requests and 1 million write requests per month. Convert the estimate to PDF format.
Upon entering your prompt, Amazon Q CLI will retrieve pricing data using the get_pricing_from_web or get_pricing_from_api tools, and will use generate_cost_report from awslabs.cost-analysis-mcp-server.
You should receive an output giving a detailed cost breakdown based on the prompt, together with optimization recommendations.
The generated cost analysis shows the following information:
- Amazon API Gateway costs for 1 million requests
- AWS Lambda costs for compute time and requests
- Amazon DynamoDB costs for storage, read, and write capacity
- Total monthly cost estimate
- Cost optimization recommendations
Example 2: Analyze multi-tier architectures
Multi-tier architectures separate applications into functional layers (presentation, application, and data) to improve scalability and security. This example analyzes costs for implementing such an architecture on AWS with components for each tier:
Create a cost analysis for a three-tier web application with a presentation tier (ALB and CloudFront), application tier (ECS with Fargate), and data tier (Aurora PostgreSQL). Include costs for 2 Fargate tasks with 1 vCPU and 2GB memory each, an Aurora db.r5.large instance with 100GB storage, an Application Load Balancer with 10
This time, we're formatting it into both PDF and DOCX.
The cost analysis shows the following information:
Example 3: Compare deployment options
When deploying containers on AWS, choosing between Amazon ECS with Amazon Elastic Compute Cloud (Amazon EC2) or Fargate involves different cost structures and management overhead. This example compares these options to determine the most cost-effective solution for a specific workload:
Compare the costs between running a containerized application on ECS with the EC2 launch type versus the Fargate launch type. Assume 4 containers each needing 1 vCPU and 2GB memory, running 24/7 for a month. For EC2, use t3.medium instances. Provide a recommendation on which option is more cost-effective for this workload. Convert the estimate to an HTML webpage.
This time, we're formatting it into an HTML webpage.
The cost comparison includes the following information:
- Amazon ECS with Amazon EC2 launch type costs
- Amazon ECS with Fargate launch type costs
- Detailed breakdown of each option's pricing components
- Side-by-side comparison of total costs
- Recommendation for the most cost-effective option
- Considerations for when each option might be preferred
Real-world examples
Let's explore some real-world architecture patterns and how to analyze their costs using Amazon Q CLI with the AWS Cost Analysis MCP server.
Ecommerce platform
Ecommerce platforms require scalable, resilient architectures with careful cost management. These systems typically use microservices to handle various functions independently while maintaining high availability. This example analyzes costs for a complete ecommerce solution with multiple components serving moderate traffic levels:
Create a cost analysis for an e-commerce platform with a microservices architecture. Include components for product catalog, shopping cart, checkout, payment processing, order management, and user authentication. Assume moderate traffic of 500,000 monthly active users, 2 million page views per day, and 50,000 orders per month. Ensure the analysis follows AWS best practices for cost optimization. Convert the estimate to PDF format.
The cost analysis includes the following key components:
Data analytics platform
Modern data analytics platforms need to efficiently ingest, store, process, and visualize large volumes of data while managing costs effectively. This example examines the AWS services and costs involved in building a complete analytics pipeline handling significant daily data volumes with multiple user access requirements:
Create a cost analysis for a data analytics platform processing 500GB of new data daily. Include components for data ingestion (Kinesis), storage (S3), processing (EMR), and visualization (QuickSight). Assume 50 users accessing dashboards daily and data retention of 90 days. Ensure the analysis follows AWS best practices for cost optimization and includes recommendations for cost-effective scaling. Convert the estimate to an HTML webpage.
The cost analysis includes the following key components:
- Data ingestion costs (Amazon Kinesis Data Streams and Amazon Data Firehose)
- Storage costs (Amazon S3 with lifecycle policies)
- Processing costs (Amazon EMR cluster)
- Visualization costs (Amazon QuickSight)
- Data transfer costs between services
- Total monthly cost estimate
- Cost optimization recommendations for each component
- Scaling considerations and their cost implications
Clean up
If you no longer need to use the AWS Cost Analysis MCP server with Amazon Q CLI, you can remove it from your configuration:
- Open your ~/.aws/amazonq/mcp.json file.
- Remove or comment out the "awslabs.cost-analysis-mcp-server" entry.
- Save the file.
This will prevent the server from being loaded when you start Amazon Q CLI in the future.
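If you prefer to script the cleanup, the following sketch shows the change on an in-memory copy of the configuration (point it at your own file with care, and keep a backup):

```python
import json

# Start from a config that registers the cost-analysis server...
config = {
    "mcpServers": {
        "awslabs.cost-analysis-mcp-server": {
            "command": "uvx",
            "args": ["awslabs.cost-analysis-mcp-server"],
        }
    }
}

# ...and drop the entry. Writing the result back to
# ~/.aws/amazonq/mcp.json (not done here) completes the cleanup.
config["mcpServers"].pop("awslabs.cost-analysis-mcp-server", None)
print(json.dumps(config))  # {"mcpServers": {}}
```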
Conclusion
In this post, we explored how to use Amazon Q CLI with the AWS Cost Analysis MCP server to create detailed cost analyses that use accurate AWS pricing data. This approach offers significant advantages over traditional cost estimation methods:
- Time savings – Generate complex cost analyses in minutes instead of hours
- Accuracy – Ensure estimates use the latest AWS pricing information
- Comprehensive – Include relevant cost components and considerations
- Actionable – Receive specific optimization recommendations
- Iterative – Quickly compare different scenarios through simple prompts
- Validation – Check estimates against official AWS pricing
As you continue exploring AWS cost analysis, we encourage you to deepen your knowledge by learning more about the Model Context Protocol (MCP) to understand how it enhances the capabilities of Amazon Q. For hands-on cost estimation, the AWS Pricing Calculator offers an interactive experience to model and compare different deployment scenarios. To make sure your architectures follow financial best practices, the AWS Well-Architected Framework Cost Optimization Pillar provides comprehensive guidance on building cost-efficient systems. And to stay on the cutting edge of these tools, keep an eye on updates to the official AWS MCP servers; they are constantly evolving with new features to make your cost analysis experience even more powerful and accurate.
About the Authors
Joel Asante, an Austin-based Solutions Architect at Amazon Web Services (AWS), works with GovTech (Government Technology) customers. With a strong background in data science and application development, he brings deep technical expertise to creating secure and scalable cloud architectures for his customers. Joel is passionate about data analytics, machine learning, and robotics, leveraging his development experience to design innovative solutions that meet complex government requirements. He holds 13 AWS certifications and enjoys family time, fitness, and cheering for the Kansas City Chiefs and Los Angeles Lakers in his spare time.
Dunieski Otano is a Solutions Architect at Amazon Web Services based out of Miami, Florida. He works with World Wide Public Sector MNO (Multi-International Organizations) customers. His passion is Security, Machine Learning and Artificial Intelligence, and Serverless. He works with his customers to help them build and deploy highly available, scalable, and secure solutions. Dunieski holds 14 AWS certifications and is an AWS Golden Jacket recipient. In his free time, you can find him spending time with his family and dog, watching a great movie, coding, or flying his drone.
Varun Jasti is a Solutions Architect at Amazon Web Services, working with AWS Partners to design and scale artificial intelligence solutions for public sector use cases to meet compliance standards. With a background in Computer Science, his work covers a broad range of ML use cases, primarily focusing on LLM training/inferencing and computer vision. In his spare time, he loves playing tennis and swimming.