Heroku AI provides access to top models and built-in tools for agents.
Managed Inference and Agents simplifies AI integration by providing access to powerful foundation models, including text, embedding, and diffusion models. Attach a model resource to your Heroku app and the add-on automatically configures the environment variables your app needs to make API calls. Invoke models with the CLI plugin or by calling the API endpoints directly.
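As a rough sketch, provisioning a model and making a first API call from the terminal might look like the example below. The plugin name, the `ai:models:create` command, the config var names (`INFERENCE_URL`, `INFERENCE_MODEL_ID`, `INFERENCE_KEY`), and the `/v1/chat/completions` path reflect common usage of the add-on but should be confirmed against the current CLI and API reference; `my-app` is a placeholder.

```bash
# Install the Heroku AI CLI plugin and attach a chat model to an app
# (plugin name and command are assumptions; verify against the add-on docs).
heroku plugins:install @heroku/plugin-ai
heroku ai:models:create claude-3-5-sonnet-latest --app my-app

# The add-on sets config vars on the app; the names below are assumptions.
heroku config --app my-app | grep INFERENCE_

# Call the chat completions endpoint directly (OpenAI-style path assumed).
curl -s "$INFERENCE_URL/v1/chat/completions" \
  -H "Authorization: Bearer $INFERENCE_KEY" \
  -H "Content-Type: application/json" \
  -d @- <<EOF
{
  "model": "$INFERENCE_MODEL_ID",
  "messages": [{"role": "user", "content": "Say hello from Heroku."}]
}
EOF
```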
Extend Agents with tools that allow Large Language Models (LLMs) to execute actions within Heroku’s trusted environment. Deploy autonomous agents that can call APIs, run code, or interact with your app through tools like code_exec, http, or custom ones. Move from prototyping to production with optimized inference latency and minimal infrastructure management.
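For illustration, an agent request that lets the model use the built-in code execution tool might look like the sketch below; the `/v1/agents/heroku` path, the `heroku_tool` type, and the exact tool name are assumptions and may differ from the published API reference.

```bash
# Sketch: ask an agent to answer a question using a built-in code execution
# tool. Endpoint path, tool type, and field names are assumptions.
curl -s "$INFERENCE_URL/v1/agents/heroku" \
  -H "Authorization: Bearer $INFERENCE_KEY" \
  -H "Content-Type: application/json" \
  -d @- <<EOF
{
  "model": "$INFERENCE_MODEL_ID",
  "messages": [
    {"role": "user", "content": "Compute the first 20 Fibonacci numbers."}
  ],
  "tools": [{"type": "heroku_tool", "name": "code_exec"}]
}
EOF
```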
The Model Context Protocol (MCP) is an open standard that helps you extend Agents by connecting large language models to tools, services, and data sources. You can bring your own custom tools by deploying them as a Heroku app and registering them by attaching the add-on. Access all your MCP servers through a single toolkit.
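A minimal sketch of registering a custom MCP server follows, assuming the server runs as a process in an ordinary Heroku app and is made visible to the toolkit by attaching your existing inference add-on; the `mcp` process-type convention and the add-on name shown are assumptions.

```bash
# Procfile of your custom tool app: expose the MCP server as a process
# (the "mcp:" process-type convention is an assumption for illustration).
# mcp: python my_mcp_server.py

# Deploy the tool app, then attach your existing inference add-on so the
# server's tools are registered in your toolkit (add-on name is a placeholder).
heroku create my-mcp-tools
git push heroku main
heroku addons:attach inference-production-1234 --app my-mcp-tools
```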
- Text Generation: Use models like Claude-Sonnet to generate text, write code, or chat intelligently.
- Retrieval-Augmented Generation (RAG): Bring your own data to power LLMs with up-to-date, domain-specific knowledge.
- Personalize User Experiences: Leverage agents to deliver tailored content, recommendations, or support.
- Data Analysis and Business Intelligence: Deploy agents that can analyze large datasets, identify trends, generate reports, and provide actionable insights.
For customers paying by credit card, Heroku Managed Inference and Agents uses metered billing, as set forth in the Plans & Pricing tables below. For Enterprise customers, your usage of Heroku Managed Inference and Agents consumes your General Add-on Credits and/or Data Add-on Credits as set forth in the Plans & Pricing tables below.
Heroku Managed Inference and Agents doesn’t store or log your prompts and completions. Heroku Managed Inference and Agents doesn’t use your prompts and completions to train any models and doesn’t distribute them to third parties for training.
The models can be provisioned in the regions shown below, but apps in all regions, including Private Spaces, can access them. By default, a model is provisioned in the region of your app. You can override the model's default region with the Heroku CLI when provisioning.
| Model | United States | European Union |
| --- | --- | --- |
| Claude-3-5-haiku | Available | Available |
| Claude-3-5-sonnet-latest | Available | |
| Claude-3-7-sonnet | Available | Available |
| Claude-3-haiku | Available | |
| Claude-4-sonnet | Available | Available |
| Cohere embed multilingual | Available | Available |
| GPT-OSS-120B | Available | |
| Nova Lite | Available | Available |
| Nova Pro | Available | Available |
| Stable-image-ultra | Available | Available |
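For example, a provisioning command that overrides the default region might look like the following sketch; the `eu` value and exact flag behavior are assumptions, so check the CLI help for the supported region identifiers.

```bash
# Provision a model for an app but place the model resource in the EU
# (the "eu" value is an assumption; run `heroku ai:models:create --help`
# to see the accepted region identifiers).
heroku ai:models:create claude-4-sonnet --app my-app --region eu
```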
Claude-3-5-haiku
A faster, more affordable large language model that supports chat and tool-calling.
Metered usage amounts
This model is available to apps in all regions. Override the region in which the model is provisioned by adding the `--region` flag. Refer to Region Availability for supported regions by model.
To provision, copy the snippet into your CLI or use the install button above.
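Because this model supports tool-calling, a chat completions request can also include tool definitions for the model to invoke. The sketch below assumes an OpenAI-style tools array; the schema and the get_weather function are hypothetical and only for illustration.

```bash
# Sketch: pass a function tool the model can call (schema assumed OpenAI-style;
# get_weather is a hypothetical function your app would implement).
curl -s "$INFERENCE_URL/v1/chat/completions" \
  -H "Authorization: Bearer $INFERENCE_KEY" \
  -H "Content-Type: application/json" \
  -d @- <<EOF
{
  "model": "$INFERENCE_MODEL_ID",
  "messages": [{"role": "user", "content": "What's the weather in Denver?"}],
  "tools": [{
    "type": "function",
    "function": {
      "name": "get_weather",
      "description": "Look up current weather for a city",
      "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"]
      }
    }
  }]
}
EOF
```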
Claude-3-5-sonnet-latest
A state-of-the-art large language model that supports chat and tool-calling.
Metered usage amounts
This model is available to apps in all regions. Override the region in which the model is provisioned by adding the `--region` flag. Refer to Region Availability for supported regions by model.
To provision, copy the snippet into your CLI or use the install button above.
Claude-3-7-sonnet
A state-of-the-art large language model that supports chat and tool-calling.
Metered usage amounts
This model is available to apps in all regions. Override the region in which the model is provisioned by adding the `--region` flag. Refer to Region Availability for supported regions by model.
To provision, copy the snippet into your CLI or use the install button above.
Claude-3-haiku
A faster, more affordable large language model that supports chat and tool-calling.
Metered usage amounts
This model is available to apps in all regions. Override the region in which the model is provisioned by adding the `--region` flag. Refer to Region Availability for supported regions by model.
To provision, copy the snippet into your CLI or use the install button above.
Claude-4-sonnet
A state-of-the-art large language model that supports chat and tool-calling.
Metered usage amounts
This model is available to apps in all regions. Override the region in which the model is provisioned by adding the `--region` flag. Refer to Region Availability for supported regions by model.
To provision, copy the snippet into your CLI or use the install button above.
Cohere embed multilingual
A state-of-the-art embedding model that supports multiple languages. This model is helpful for developing Retrieval Augmented Generation (RAG) search.
Metered usage amounts
This model is available to apps in all regions. Override the region in which the model is provisioned by adding the `--region` flag. Refer to Region Availability for supported regions by model.
To provision, copy the snippet into your CLI or use the install button above.
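A typical embeddings request for RAG indexing might look like the sketch below; the `EMBEDDING_URL`, `EMBEDDING_KEY`, and `EMBEDDING_MODEL_ID` config var names and the `/v1/embeddings` path are assumptions, so confirm them against the variables the add-on sets on your app.

```bash
# Request embeddings for a few documents (config var names and endpoint path
# are assumptions; check the values the add-on sets on your app).
curl -s "$EMBEDDING_URL/v1/embeddings" \
  -H "Authorization: Bearer $EMBEDDING_KEY" \
  -H "Content-Type: application/json" \
  -d @- <<EOF
{
  "model": "$EMBEDDING_MODEL_ID",
  "input": ["shipping policy", "return policy", "warranty terms"]
}
EOF
```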
GPT-OSS-120B
A powerful, open-source large language model developed by OpenAI, designed for a wide range of generative AI applications. It offers advanced capabilities in natural language understanding, generation, and complex problem-solving, making it a versatile tool for developers and enterprises.
Metered usage amounts
This model is available to apps in all regions. Override the region in which the model is provisioned by adding the `--region` flag. Refer to Region Availability for supported regions by model.
To provision, copy the snippet into your CLI or use the install button above.
Nova Lite
A fast and highly cost-effective model, perfect for applications requiring rapid text generation, summarization, and copywriting. It's optimized for high-throughput tasks and general-purpose use cases.
Metered usage amounts
This model is available to apps in all regions. Override the region in which the model is provisioned by adding the `--region` flag. Refer to Region Availability for supported regions by model.
To provision, copy the snippet into your CLI or use the install button above.
Nova Pro
A high-performance model designed for more complex tasks, including advanced question-answering, detailed content creation, and nuanced data extraction. It provides a significant step up in capability for applications that demand higher quality and deeper understanding.
Metered usage amounts
This model is available to apps in all regions. Override the region in which the model is provisioned by adding the `--region` flag. Refer to Region Availability for supported regions by model.
To provision, copy the snippet into your CLI or use the install button above.
Stable-image-ultra
A state-of-the-art diffusion (image generation) model.
Metered usage amounts
This model is available to apps in all regions. Override the region in which the model is provisioned by adding the `--region` flag. Refer to Region Availability for supported regions by model.
To provision, copy the snippet into your CLI or use the install button above.
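An image generation request might look like the sketch below; the `/v1/images/generations` path, the `DIFFUSION_*` config var names, and the response shape are assumptions for illustration.

```bash
# Generate an image and save the raw API response (endpoint path, DIFFUSION_*
# config var names, and response shape are assumptions; the response is
# expected to contain base64-encoded image data).
curl -s "$DIFFUSION_URL/v1/images/generations" \
  -H "Authorization: Bearer $DIFFUSION_KEY" \
  -H "Content-Type: application/json" \
  -o response.json \
  -d @- <<EOF
{
  "model": "$DIFFUSION_MODEL_ID",
  "prompt": "a watercolor painting of a lighthouse at dusk"
}
EOF
```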
The Heroku Managed Inference and Agents add-on may employ third-party generative AI models to provide the Service. Due to the nature of generative AI, the output it generates may be unpredictable and may include inaccurate or harmful responses. Customer assumes all responsibility for such output, including ensuring its accuracy, safety, and compliance with applicable laws and third-party acceptable use policies. For more information, see the Heroku Notices and License Information Documentation.