by brianvanburken


Note: This is a self-maintained fork. It contains changes for my own environment. The differences between the original and this fork are:

  • store image output on the server for caching, improving processing speed;
  • stronger ETags;
  • added caching headers to keep images for a year (this assumes that if an image changes on a site, the URL to the image also changes);
  • lowered the minimum image size for compression (this assumes the compression rate is really low; I use 20%);
  • enabled compression for images with the content type "application/octet-stream";
  • use the latest Node.js for a performance increase;
  • latest package versions.

Bandwidth Hero Data Compression Service

This data compression service is used by the Bandwidth Hero browser extension. It compresses a given image to a low-resolution WebP or JPEG image. Optionally, it also converts the image to greyscale to save even more data.

It downloads the original image and transforms it with Sharp on the fly, without saving images to disk.

This is NOT an anonymizing proxy — it downloads images on the user's behalf, passing cookies and the user's IP address through to the origin host.



You can deploy this service to Heroku:


Deploy to Heroku guide


The data compression service is a Node.js app which you can run on any server that supports Node.js. Check out this guide on how to set up Node.js on Ubuntu.

DigitalOcean also provides an easy way to set up a server ready to host Node.js apps.