Apache Kafka on Heroku

Reliable and powerful Apache Kafka as a service. Starting at $1500/mo.

Elegant Developer Experience

An easy-to-use CLI and web tooling make Kafka simple to provision, configure, and operate. Add topics, create partitions, manage log compaction, and monitor key metrics from the comfort of the CLI or the Heroku Dashboard.

World Class Operations

Now you can consume Kafka as a service with Heroku’s world-class orchestration and thoughtfully tuned configurations that keep Kafka fast and robust. We distribute Kafka resources across network zones for fault-tolerance, and ensure your Kafka cluster is always available and addressable.

Seamless Integration with Apps

Run producers and consumers as Heroku apps for simple vertical and horizontal scalability. Config vars make it easy to securely connect to your Kafka cluster, so you can focus on your core logic.
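As a sketch of what connecting via config vars can look like, the snippet below parses a Kafka connection URL of the comma-separated `scheme://host:port` form that broker lists typically use into individual broker addresses. The `KAFKA_URL` variable name and the example hostnames are illustrative assumptions, not guaranteed values.

```python
from urllib.parse import urlparse

def brokers_from_config(kafka_url: str) -> list[str]:
    """Split a comma-separated broker URL config var into host:port addresses."""
    brokers = []
    for url in kafka_url.split(","):
        parsed = urlparse(url)
        brokers.append(f"{parsed.hostname}:{parsed.port}")
    return brokers

# In a deployed app this value would come from the environment, e.g.:
#   kafka_url = os.environ["KAFKA_URL"]   # hypothetical config var name
kafka_url = ("kafka+ssl://ec2-1-2-3-4.compute-1.amazonaws.com:9096,"
             "kafka+ssl://ec2-5-6-7-8.compute-1.amazonaws.com:9096")
brokers = brokers_from_config(kafka_url)
```

A client library would then take `brokers` as its bootstrap server list, so the application code never hard-codes cluster addresses.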

Take Control of Your Events

Events are everywhere — user activity streams, log events, telemetry from embedded devices and mobile phones, and more. Apache Kafka on Heroku flips the script from push to pull, letting you take control of high-volume event streams in your applications to transform the customer experience. With Apache Kafka on Heroku, you can accept inbound events at any scale with ease and route them to key-based partitions, providing a clear path to real-time stream processing for user activity tracking, ad tracking, IoT, mobile sync, and messaging systems.
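Key-based partitioning means an event's key determines which partition it lands on, so all events for one key stay in order. A minimal sketch of the idea (Kafka's default partitioner actually uses a murmur2 hash; CRC-32 stands in here purely for illustration):

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    """Route an event to a partition by hashing its key.

    Because the hash is deterministic, the same key always maps to the
    same partition, preserving per-key ordering.
    """
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Every event for "user-42" lands on one partition and is consumed in order.
p = partition_for("user-42", num_partitions=8)
```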

Build Modern Application Architectures

Application architectures like microservices require new approaches to coordination, scaling, and orchestration. Apache Kafka on Heroku's pull-based communication model reduces backpressure on key services under load, letting you add and scale new services independently. Apache Kafka on Heroku enables moving from actor-centric to channel-centric app development models, simplifying service discovery and reducing brittle RPC-style and many-to-many coordination between services.
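The backpressure claim can be made concrete with a toy append-only log (an assumption-laden stand-in for a Kafka topic, not Kafka's implementation): producers append and return immediately, while each consumer pulls at its own rate, so a slow consumer lags rather than blocking anyone.

```python
class Topic:
    """Toy append-only log illustrating pull-based consumption."""

    def __init__(self):
        self.log = []

    def append(self, event) -> int:
        # The producer appends and returns immediately; it never waits on
        # consumers, so a slow downstream service cannot back-pressure it.
        self.log.append(event)
        return len(self.log) - 1  # offset of the new event

    def poll(self, offset: int, max_records: int = 10) -> list:
        # Each consumer pulls from its own offset at its own pace; a slow
        # consumer simply falls behind instead of blocking the producer.
        return self.log[offset:offset + max_records]

topic = Topic()
for i in range(100):
    topic.append({"click": i})

fast_batch = topic.poll(0, max_records=50)  # fast service reads 50 at once
slow_batch = topic.poll(0, max_records=5)   # slow service reads 5; producer unaffected
```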

Elastic Queuing

Apache Kafka on Heroku acts as the edge of your system, durably accepting high volumes of inbound events — be it user click interactions, log events, mobile telemetry, ad tracking, or other events. This enables you to create new types of architectures for incremental processing of immutable event streams. You can add and remove downstream services seamlessly without impacting the ability to accept high-throughput inbound events, and Kafka's durability ensures events are available when services reconnect after failures, so no events are lost.
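The "no events are lost" property follows from combining a durable log with committed consumer offsets. A simplified sketch of that mechanism (an in-memory model, not Kafka's actual offset protocol): the log keeps accepting events while a consumer is down, and on reconnect the consumer resumes from its last committed offset.

```python
class DurableTopic:
    """Toy log with per-group committed offsets: events survive consumer downtime."""

    def __init__(self):
        self.log = []
        self.committed = {}  # consumer group -> next offset to read

    def append(self, event):
        self.log.append(event)

    def poll(self, group: str, max_records: int = 100):
        start = self.committed.get(group, 0)
        return start, self.log[start:start + max_records]

    def commit(self, group: str, next_offset: int):
        self.committed[group] = next_offset

t = DurableTopic()
for i in range(10):
    t.append(i)

start, batch = t.poll("billing")
t.commit("billing", start + len(batch))   # processed offsets 0-9

for i in range(10, 15):                   # events keep arriving while "billing" is down
    t.append(i)

start, recovered = t.poll("billing")      # on reconnect: resume where it left off
```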

Data Pipelines and Analytics

Kafka is an ideal transport for building data pipelines that transform stream data and compute aggregate metrics. Pipelines can help you build advanced data-centric applications and enable analytics teams to make better decisions. Kafka's distributed architecture and immutable event streams make it trivial to build pipelines for incremental, parallel processing of fast-moving data. You can integrate all the disparate sources and sinks of data in your organization.
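"Incremental processing" here means each batch of stream events folds into running state rather than recomputing from scratch. A minimal sketch, assuming hypothetical page-view events with a `"page"` field:

```python
from collections import defaultdict

def fold_counts(events, counts=None):
    """Incrementally fold a batch of stream events into running per-page counts."""
    counts = counts if counts is not None else defaultdict(int)
    for event in events:
        counts[event["page"]] += 1
    return counts

# Each micro-batch pulled from the stream updates the same running state,
# so the aggregate stays current without reprocessing old events.
counts = fold_counts([{"page": "/home"}, {"page": "/buy"}, {"page": "/home"}])
counts = fold_counts([{"page": "/home"}], counts)
```

In a real pipeline the batches would come from a Kafka consumer and the aggregates would be published to a downstream topic or store.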

Microservices Coordination

Kafka enables you to model your application as a collection of microservices that process events and exchange state over channel-like topics. Kafka becomes the backplane for service communication, allowing microservices to become loosely coupled. Bootstrapping microservices becomes order-independent, since all communication happens over topics. Service discovery is simply a matter of connecting to new topics. Consuming and producing services, as well as Kafka brokers, can be scaled independently, so your architecture is fully elastic. Kafka distributes topics and replicates messages across multiple servers for event durability, so if a broker fails for any reason, your event data will be safe. If a service fails, it can reconnect and start processing from the last known offset.
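Independent scaling of consuming services works because a topic's partitions are divided among the instances in a consumer group. The sketch below shows a simple round-robin assignment — a rough illustration of the idea, not Kafka's actual rebalancing protocol, and the service names are made up:

```python
def assign_partitions(num_partitions: int, consumers: list[str]) -> dict[str, list[int]]:
    """Round-robin assignment: spread a topic's partitions across the
    instances of a consumer group, roughly as a group rebalance does."""
    assignment = {c: [] for c in consumers}
    for p in range(num_partitions):
        assignment[consumers[p % len(consumers)]].append(p)
    return assignment

# Scaling the service from 2 to 3 instances just redistributes partitions;
# producers and brokers are untouched.
two = assign_partitions(8, ["billing-1", "billing-2"])
three = assign_partitions(8, ["billing-1", "billing-2", "billing-3"])
```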

Region Availability

The available application locations for this add-on are shown below and depend on whether the application is deployed to a Common Runtime region or a Private Space.

Common Runtime

  Region          Available
  United States   Available
  Europe          Available

Private Spaces

  Region          Available   Installable in Space
  Virginia        Available   Available
  Oregon          Available   Available
  Frankfurt       Available   Available
  Tokyo           Available   Available
  Sydney          Available   Available
  Dublin (Beta)   Available   Available

Plans & Pricing

All plans are dedicated clusters and include additional networking & configuration options.

  Maximum Data Retention   Capacity   Kafka Cluster Configuration
  2 weeks                  150 GB     3 Kafka brokers
  2 weeks                  300 GB     3 Kafka brokers
  2 weeks                  900 GB     3 Kafka brokers
  6 weeks                  400 GB     8 Kafka brokers
  6 weeks                  800 GB     8 Kafka brokers
  6 weeks                  2400 GB    8 Kafka brokers

Log in to provision this add-on on Heroku Elements.

Apache Kafka on Heroku Documentation