Simplify file storage and processing for logs, media, and more. Starting at ~$0/hour.
Easily automate file processing workflows by specifying dynos to run after you save files to S3.
Job queues and workflow systems add cost and complexity to your applications. Get S3 data storage with these features built-in so you can create simpler, more reliable systems.
Define workflows using JSON instructions within or attached to your files. Minimize dependencies and continue using your favorite programming language(s).
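As a rough illustration, a workflow attached to a file might look like the sketch below. Every field name here (`workflow`, `dyno`, `command`) is hypothetical; consult the add-on's documentation for the actual instruction schema.

```json
{
  "workflow": [
    { "dyno": "worker", "command": "python process_upload.py" },
    { "dyno": "worker", "command": "python notify_done.py" }
  ]
}
```

Each step names a dyno type and the command to run after the file lands in S3, so the pipeline stays in plain JSON alongside your data rather than in a separate job-queue system.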
We provide you with a private S3 bucket designed to protect sensitive data. Access is provided through temporary AWS credentials that rotate automatically every 6 hours.
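Because the credentials rotate every 6 hours, a long-running dyno should re-read them from its environment immediately before creating an S3 client rather than caching them at boot. A minimal sketch, assuming the credentials are exposed as config vars (the variable names below are hypothetical; the add-on's actual names may differ):

```python
import os

# Hypothetical config var names -- check the add-on docs for the real ones.
CRED_VARS = (
    "BUCKET_AWS_ACCESS_KEY_ID",
    "BUCKET_AWS_SECRET_ACCESS_KEY",
    "BUCKET_AWS_SESSION_TOKEN",
)

def current_credentials():
    """Read the temporary AWS credentials fresh from the environment.

    Caching these at process start would eventually yield expired keys
    once the 6-hour rotation occurs, so call this right before building
    an S3 client (e.g. a boto3 client) instead of storing the values.
    """
    return {name: os.environ.get(name, "") for name in CRED_VARS}
```

A client built from `current_credentials()` on each use will always pick up the latest rotation without a restart.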
We’ve included several safeguards to maximize fault tolerance and ensure each workflow runs exactly once. S3 events are delivered through EventBridge (99.99% availability). If an intermittent outage occurs, tasks are retried automatically with exponential backoff.
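Exponential backoff simply spaces retries further and further apart so a brief outage is not hammered with immediate re-attempts. A minimal sketch of such a retry schedule, assuming an illustrative 1-second base delay and doubling factor (the add-on's actual parameters are not documented here):

```python
import random

def backoff_delays(base=1.0, factor=2.0, retries=5, jitter=0.0):
    """Return the delay in seconds before each retry attempt.

    Attempt n waits base * factor**n seconds, plus optional random
    jitter to avoid many clients retrying in lockstep.
    """
    return [
        base * factor ** attempt + random.uniform(0.0, jitter)
        for attempt in range(retries)
    ]
```

With the defaults above (`jitter=0.0`), the schedule is 1, 2, 4, 8, 16 seconds across five retries.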
Access your private S3 bucket and monitor your workflows in real time using the line^d dashboard.
The available application locations for this add-on are shown below and depend on whether the application is deployed to a Common Runtime region or a Private Space.
Common Runtime:

Region | Available
---|---
United States | Available
Europe |
Private Spaces:

Region | Available | Installable in Space
---|---|---
Dublin | |
Frankfurt | |
London | |
Montreal | |
Mumbai | |
Oregon | |
Singapore | |
Sydney | |
Tokyo | |
Virginia | Available |
To provision, copy the snippet into your CLI or use the install button above.