5 Ways to Use Kiro and Amazon Q to Optimize Your Infrastructure

FinOps Article

It’s Friday morning. You’re expecting an easy day when suddenly—ding—a budget alert hits your inbox. Not only have you been notified, but so have your manager and the FinOps team. Your relaxed Friday just disappeared. Sound familiar? This scenario happens more often than it should. With Kiro (CLI or IDE) or the Amazon Q Developer IDE plugin, AWS’s generative AI-powered coding assistants, you can prevent these panic-inducing moments while saving significant money.

Here are five powerful ways to use AI to optimize your AWS infrastructure, as shared in the re:Invent 2025 talk, “Optimize AWS Costs: Developer Tools and Techniques.”

1. Find Optimization Opportunities with MCP (Model Context Protocol) Servers

What It Does

MCP servers enable Kiro to interact directly with AWS cost management tools such as AWS Cost Optimization Hub, AWS Cost Explorer, and the billing console APIs.

How to Use It

Instead of manually navigating the AWS Console, ask Kiro directly, either from the Kiro IDE or from your terminal with Kiro CLI, using a simple prompt:

"Get me compute optimizer recommendations for my account"

Real-World Impact

What used to take over five minutes of navigating the console now takes seconds in your development environment. Here’s a sample mcp.json file tailored for cost optimization.

{
  "mcpServers": {
    "awslabs.cost-explorer-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.cost-explorer-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR",
        "AWS_PROFILE": "<PROFILE>"
      },
      "timeout": 120000,
      "disabled": false
    },
    "awslabs.aws-pricing-mcp-server": {
      "command": "uvx",
      "args": ["awslabs.aws-pricing-mcp-server@latest"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR",
        "AWS_PROFILE": "<PROFILE>",
        "AWS_REGION": "us-east-1"
      },
      "disabled": false,
      "autoApprove": []
    }
  }
}
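
With these servers enabled, cost questions stay inside your workflow: spend-history questions (for example, “what were my top five services by cost last month?”) are answered through the Cost Explorer server, while pricing lookups (for example, “what does an m7g.large cost on-demand in us-east-1?”) go through the pricing server, both using the AWS profile configured above.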

2. Automatically Apply Recommendations to Your Infrastructure as Code

What It Does

Kiro doesn’t just identify optimization opportunities—it can automatically update your infrastructure as code files.

Workflow

  1. Ask Kiro to find optimization opportunities (like switching to Graviton instances).
  2. Provide your infrastructure folder path.
  3. Specify what changes to make and where to deploy.

Time Saved

A task that once took 10-15 minutes (finding the file, making changes, and testing) now takes about two minutes.
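
For example, if Kiro finds a Graviton opportunity in a web tier defined in CloudFormation, the edit it writes back is typically just the instance type and the AMI. The excerpt below is a hypothetical sketch; the resource name and the placeholder values are illustrative only.

# Hypothetical template excerpt after Kiro applies the change
Resources:
  WebServer:
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: m6g.large      # was m5.large; m6g is the Graviton equivalent
      ImageId: <ARM64_AMI_ID>      # Graviton instances require an ARM64 AMI
      SubnetId: <SUBNET_ID>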

3. Optimize Infrastructure Files with One Command

What It Does

The /optimize command in Amazon Q Developer analyzes your entire infrastructure file and suggests comprehensive optimizations.

Examples of What It Finds

  • Storage tier optimizations (e.g., moving from GP2 to GP3).
  • Networking improvements (e.g., NAT Gateway consolidation).
  • Database optimizations (e.g., AWS Graviton migrations).

Example Output

Recommendations:

  • Move EBS volumes from GP2 to GP3: ~$450/month savings
  • Implement S3 Intelligent-Tiering: ~$780/month savings
  • Optimize VPC Flow Logs retention: ~$120/month savings
  • Convert EC2 instances to Graviton: ~$1,200/month savings

Total potential monthly savings: $2,550
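
As a concrete illustration, the S3 Intelligent-Tiering recommendation above usually lands as a lifecycle rule on the bucket. The CloudFormation excerpt below is a hypothetical sketch; the bucket and rule names are placeholders.

Resources:
  LogsBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: <BUCKET_NAME>
      LifecycleConfiguration:
        Rules:
          - Id: MoveToIntelligentTiering
            Status: Enabled
            Transitions:
              - StorageClass: INTELLIGENT_TIERING
                TransitionInDays: 0   # move objects right after creation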

4. Create Cost-Optimized Service Control Policies (SCPs) with Natural Language

What It Does

Kiro converts your cost governance rules written in plain English into proper AWS Service Control Policies.

Example Rule Description

“Create an SCP with these rules:

  • Use GP3 volumes instead of GP2
  • Deny NAT Gateway creation outside the main VPC
  • Enforce tagging on all resources”
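
Kiro might turn the first and third rules into an SCP like the sketch below. The statement IDs, the required tag key, and the exact action list are assumptions, and the NAT Gateway rule is omitted because it needs your account-specific VPC and subnet identifiers; review the policy before attaching it to an organizational unit.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyGp2VolumeCreation",
      "Effect": "Deny",
      "Action": "ec2:CreateVolume",
      "Resource": "*",
      "Condition": {
        "StringEquals": { "ec2:VolumeType": "gp2" }
      }
    },
    {
      "Sid": "DenyUntaggedResourceCreation",
      "Effect": "Deny",
      "Action": ["ec2:RunInstances", "ec2:CreateVolume"],
      "Resource": [
        "arn:aws:ec2:*:*:instance/*",
        "arn:aws:ec2:*:*:volume/*"
      ],
      "Condition": {
        "Null": { "aws:RequestTag/CostCenter": "true" }
      }
    }
  ]
}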

Bonus Features

  • Validation Before Deployment: Kiro scans your entire codebase to flag any resources that would be blocked by the new policy.
  • Auto-Fix Violations: Kiro can automatically update your files to comply with the new policy.

5. Embed Cost Optimization into Your Development Workflow with Context/Rules

What It Does

Amazon Q Developer IDE plugin “rules” and Kiro CLI “context” files teach the assistant your organization’s best practices.

Setup

Create a rule file named optimization-rules.md that outlines your standards in plain English:

## My Infrastructure Rules

### Lambda Functions
- Always use ARM64 architecture (Graviton)
- Attach a CloudWatch Log Group with 7-day retention
- Use the latest Python runtime (3.12+)

### EC2 Instances
- Prefer Graviton instances when available
- Use GP3 volumes by default

Why It Matters

This ensures every new resource you build is optimized from the start—no more going back to fix cost issues later.
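
With those rules loaded as context, a prompt as simple as “create a Lambda function for the nightly report” should come back already conforming. The CloudFormation sketch below shows the kind of output to expect; the function name, handler, code location, and role are placeholders.

Resources:
  ReportFunction:
    Type: AWS::Lambda::Function
    Properties:
      FunctionName: <FUNCTION_NAME>
      Runtime: python3.12            # latest Python runtime, per the rules
      Architectures:
        - arm64                      # Graviton architecture, per the rules
      Handler: app.handler
      Role: <EXECUTION_ROLE_ARN>
      Code:
        S3Bucket: <DEPLOY_BUCKET>
        S3Key: <DEPLOY_KEY>

  ReportFunctionLogGroup:
    Type: AWS::Logs::LogGroup
    Properties:
      LogGroupName: /aws/lambda/<FUNCTION_NAME>
      RetentionInDays: 7             # 7-day retention, per the rules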

Key Takeaways

  • Work Smarter: Let AI do the heavy lifting—finding resources, updating code, and generating policies.
  • Optimize Forever: Use rules and SCPs to bake optimization into your workflow.
  • Save Context: AI can remember your preferences and continue where you left off.

Stop getting those panic-inducing budget alerts. Start optimizing with AI today. Want to learn more? Check out “The Keys to AWS Optimization” on YouTube for regular cost optimization tips and techniques.