AWS Moves Towards Modular Offerings for AI Agent Development

The AWS Marketplace Has Introduced a New Category for AI Tools and Agents

AWS has expanded its marketplace to include a dedicated category called "AI Tools and Agents." This new section goes beyond just offering ready-to-use AI agents and tools for powering applications; it also features professional services, developer solutions, and software that integrates various agent-based components.

With this update, Amazon Web Services is providing two additional delivery methods for AI services. The first is via an Application Programming Interface (API), enabling integration into existing systems. The second is as Docker containers that bundle agent frameworks and models and run on Amazon Bedrock AgentCore. These containers ship with pre-configured environments suitable for deploying AI agents at scale, offering flexibility for developers and enterprises.
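
As a rough illustration of the API delivery path, the sketch below calls an agent hosted on AgentCore Runtime from Python with boto3. The client name, operation, and parameter names mirror AWS's public AgentCore samples at the time of writing and should be treated as assumptions rather than a definitive reference; the agent ARN, session ID, and payload shape are placeholders.

```python
import json
import boto3

# Assumption: the AgentCore data-plane client is named "bedrock-agentcore"
# and exposes an invoke_agent_runtime operation, as in AWS's public samples.
client = boto3.client("bedrock-agentcore", region_name="us-east-1")

response = client.invoke_agent_runtime(
    agentRuntimeArn="arn:aws:bedrock-agentcore:us-east-1:123456789012:runtime/my-agent",  # placeholder ARN
    runtimeSessionId="example-session-id-0000000000000000000001",  # placeholder session id
    payload=json.dumps({"prompt": "Summarize yesterday's support tickets."}).encode("utf-8"),
)

# Assumption: the response body is returned as a stream under the "response" key.
print(response["response"].read().decode("utf-8"))
```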

Currently in preview, this offering is part of a broader push by AWS to extend and diversify its AI product lineup. It builds on Bedrock Agents, a product launched in late 2023. While Bedrock Agents already provided a foundation for deploying AI models and agents, AWS is now decomposing many of its functionalities into independent modules. The goal is to let users run these modules independently of Bedrock so they can work with a wider range of technologies not natively available on the platform.

Modular Pricing Structure

AWS is adopting a pay-as-you-go model, with consumption-based pricing for each component. There are six core modules: runtime, tools, gateway, memory, identities, and observability.

Runtime Module:
This component handles serverless deployment within the AWS cloud. Each session runs in a dedicated micro-virtual machine (micro-VM). The cost for runtime usage is $0.0895 per vCPU-hour and $0.00945 per GB-hour of RAM. Billing is calculated per second, with a minimum memory allocation of 128MB.
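
To make the per-second billing concrete, here is a small Python calculation of what a single session might cost at the listed rates; the session duration and resource sizing are invented for illustration.

```python
# Published AgentCore Runtime rates (per the pricing above)
VCPU_HOUR = 0.0895   # USD per vCPU-hour
GB_HOUR = 0.00945    # USD per GB-hour of RAM

# Hypothetical session: 1 vCPU and 2 GB of RAM active for 90 seconds
seconds = 90
vcpus = 1
ram_gb = 2

hours = seconds / 3600
cost = vcpus * hours * VCPU_HOUR + ram_gb * hours * GB_HOUR
print(f"Estimated session cost: ${cost:.6f}")  # roughly $0.0027
```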

Memory Management:
Memory is managed on two levels: short-term (preserving history within a session) and long-term (persisting across sessions). Users can choose from three pre-configured strategies for long-term storage: a semantic strategy focused on facts and knowledge, session summarization, and user preferences. Pricing for memory operations is as follows (a worked cost example appears after the list):

  • Short-term memory creation: $0.25 per 1,000 events
  • Long-term memory creation: $0.25 per 1,000 events with custom strategies; $0.75 per 1,000 events with the built-in strategies
  • Memory retrieval: $0.50 per 1,000 events
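
As a worked example of these rates, the snippet below estimates a monthly memory bill for a hypothetical workload; the event volumes are invented for illustration, and only the built-in long-term strategy rate is used.

```python
# AgentCore Memory rates from the list above (USD per 1,000 events)
SHORT_TERM_WRITE = 0.25
LONG_TERM_BUILT_IN = 0.75
RETRIEVAL = 0.50

# Hypothetical monthly volumes (illustrative only)
short_term_events = 200_000
long_term_events = 50_000
retrievals = 120_000

cost = (
    short_term_events / 1_000 * SHORT_TERM_WRITE
    + long_term_events / 1_000 * LONG_TERM_BUILT_IN
    + retrievals / 1_000 * RETRIEVAL
)
print(f"Estimated monthly memory cost: ${cost:.2f}")  # $147.50
```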

Gateway Module:
This module allows setting up a Model Context Protocol (MCP) server that connects multiple tools and data sources, using targets such as OpenAPI specifications, AWS Lambda functions, and Smithy models. It supports integrations with platforms like Asana, Jira, Salesforce, Slack, and Zendesk. Pricing is structured around usage (a hedged client sketch appears after the list):

  • Tool invocation: $0.005 per 1,000 requests
  • Tool discovery/search: $0.025 per 1,000 searches
  • Tool indexing: $0.02 per 100 items per month
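
Because the Gateway exposes its targets over MCP, any standard MCP client can discover and invoke them. The sketch below uses the Model Context Protocol's official Python SDK (the mcp package) over streamable HTTP; the gateway URL, bearer token, and tool name are placeholders, and how each call maps onto the rates above should be checked against current documentation.

```python
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

GATEWAY_URL = "https://example-gateway.gateway.bedrock-agentcore.us-east-1.amazonaws.com/mcp"  # placeholder
ACCESS_TOKEN = "<oauth-access-token>"  # placeholder; obtained from your identity provider


async def main() -> None:
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
    # Open a streamable-HTTP MCP connection to the gateway endpoint.
    async with streamablehttp_client(GATEWAY_URL, headers=headers) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()  # enumerate tools exposed by the gateway
            print([tool.name for tool in tools.tools])
            result = await session.call_tool(   # a tool invocation, billed per the rates above
                "create_ticket",                 # placeholder tool name
                {"title": "Printer offline", "priority": "high"},
            )
            print(result)


asyncio.run(main())
```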

Identity and Access Management:
For managing security tokens or API keys, AWS charges $0.01 for every 1,000 token or API key requests per agent, with the service included at no extra charge when used through other Bedrock AgentCore components.

Tools Suite:
The tools include a code interpreter and a sandboxed web browser. Both are accessible through Bedrock AgentCore, although the web browsing feature is still in beta testing; it builds on Anthropic’s Computer Use capability with Claude 3.7 Sonnet and Claude 3.5 Sonnet v2.
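
To show how the code interpreter is typically driven, the sketch below starts a session, runs a snippet, and tears the session down with boto3. The operation names, parameter names, and the built-in interpreter identifier aws.codeinterpreter.v1 follow AWS's public samples at the time of writing and are assumptions to verify against the current SDK reference.

```python
import boto3

# Assumption: the "bedrock-agentcore" data-plane client exposes the
# code-interpreter session operations shown here, as in AWS's public samples.
client = boto3.client("bedrock-agentcore", region_name="us-east-1")

session = client.start_code_interpreter_session(
    codeInterpreterIdentifier="aws.codeinterpreter.v1",  # assumed built-in interpreter id
    sessionTimeoutSeconds=900,
)
session_id = session["sessionId"]

try:
    result = client.invoke_code_interpreter(
        codeInterpreterIdentifier="aws.codeinterpreter.v1",
        sessionId=session_id,
        name="executeCode",  # assumed tool name for code execution
        arguments={"language": "python", "code": "print(2 + 2)"},
    )
    # Assumption: results arrive as an event stream under the "stream" key.
    for event in result["stream"]:
        print(event)
finally:
    client.stop_code_interpreter_session(
        codeInterpreterIdentifier="aws.codeinterpreter.v1",
        sessionId=session_id,
    )
```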

The use of Bedrock AgentCore is offered free of charge until September 16, 2025, excluding supplementary services such as CloudWatch for observability.

Module Capacity Limits

Each module has specific quotas, with some examples including:

  • Runtime:
    • Up to 1,000 agents and 500 sessions per AWS account, with the possibility of requesting higher limits.
    • Synchronous requests have a 15-minute timeout; streaming connections can last up to 1 hour; asynchronous tasks are permitted up to 8 hours.
  • Memory:
    • 50 resources per AWS region per account.
    • A maximum of 6 strategies per instance, with prompts limited to 30 KB for custom strategies.
  • Tools:
    • Individual files up to 100 MB; up to 2 GB via Amazon S3.
  • Identities:
    • 50 API keys and 50 OAuth client credentials per AWS region per account.

It is worth noting that Bedrock AgentCore currently lacks a separate orchestration component for custom workflows. Bedrock Agents, however, gained this capability in late 2024: an AWS Lambda function can be invoked to implement more complex, sequenced orchestration beyond default approaches like ReAct (a schematic handler is sketched below).
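
For context, custom orchestration in Bedrock Agents takes the form of an AWS Lambda function that the service calls with the current orchestration state and that replies with the next action to take. The handler below is only a schematic illustration of that idea; the event fields and response shape are simplified placeholders, not the actual Bedrock payload schema.

```python
# Schematic custom-orchestration Lambda for Bedrock Agents.
# NOTE: all field names below are illustrative placeholders, not the real schema.
def lambda_handler(event, context):
    state = event.get("state", "START")  # placeholder: current orchestration state

    if state == "START":
        # Ask the service to invoke the foundation model with the user's input.
        return {"action": "INVOKE_MODEL", "input": event.get("input", "")}

    if state == "MODEL_INVOKED":
        # Inspect the model output and decide whether a tool call is needed.
        model_output = event.get("modelOutput", "")
        if "CALL_TOOL:" in model_output:
            tool_name = model_output.split("CALL_TOOL:", 1)[1].strip()
            return {"action": "INVOKE_TOOL", "tool": tool_name}
        return {"action": "FINISH", "response": model_output}

    # Default: end the conversation turn.
    return {"action": "FINISH", "response": "Done."}
```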

Conclusion

This new suite of modular AI tools and agents from AWS marks a significant step towards flexible, scalable AI deployments. By decomposing complex functionalities into standalone modules and offering multiple delivery formats, AWS aims to cater to a broad range of enterprise needs, from rapid prototyping to production-grade solutions. The free period for Bedrock AgentCore also gives developers time to experiment and adapt these tools to their specific workflows.

Dawn Liphardt

I'm Dawn Liphardt, the founder and lead writer of this publication. With a background in philosophy and a deep interest in the social impact of technology, I started this platform to explore how innovation shapes — and sometimes disrupts — the world we live in. My work focuses on critical, human-centered storytelling at the frontier of artificial intelligence and emerging tech.