S3 Batch Operations
Manage tens to billions of objects at scale

How it works: S3 Batch Operations
S3 Batch Operations tutorial
Teespring
Teespring was founded in 2011 and enables users to create and sell custom on-demand products online. Because every piece of custom merchandise requires multiple assets, Teespring stores petabytes of data in Amazon S3.
"Amazon S3 Batch Operations helped us optimize our storage by utilizing Amazon S3’s Glacier storage class. We used our own storage metadata to create batches of objects that we could move to Amazon S3 Glacier. With Amazon S3 Glacier we saved more than 80% of our storage costs. We are always looking for opportunities to automate storage management, and with S3 Batch Operations, we can manage millions of objects in minutes."
James Brady, VP of Engineering - Teespring
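
The quote doesn't reveal Teespring's actual tooling, but a minimal sketch of this pattern with boto3 might look like the following: a CSV manifest built from your own storage metadata drives a Batch Operations copy job that rewrites each object into the Glacier storage class. The account ID, bucket names, role ARN, and manifest ETag are all placeholders.

import boto3

ACCOUNT_ID = "111122223333"  # placeholder account ID
ROLE_ARN = f"arn:aws:iam::{ACCOUNT_ID}:role/batch-operations-role"  # hypothetical role

s3control = boto3.client("s3control", region_name="us-east-1")

response = s3control.create_job(
    AccountId=ACCOUNT_ID,
    ConfirmationRequired=True,  # hold the job for manual confirmation before it runs
    Operation={
        # Copy each manifest entry over itself, changing its storage class.
        "S3PutObjectCopy": {
            "TargetResource": "arn:aws:s3:::example-assets-bucket",
            "StorageClass": "GLACIER",
        }
    },
    Manifest={
        # A CSV of bucket,key pairs generated from the team's own metadata.
        "Spec": {
            "Format": "S3BatchOperations_CSV_20180820",
            "Fields": ["Bucket", "Key"],
        },
        "Location": {
            "ObjectArn": "arn:aws:s3:::example-manifests/glacier-batch.csv",
            "ETag": "example-manifest-etag",  # ETag of the manifest object
        },
    },
    Report={
        # Ask for a completion report covering failed tasks only.
        "Bucket": "arn:aws:s3:::example-reports",
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "batch-reports",
        "ReportScope": "FailedTasksOnly",
    },
    Priority=10,
    RoleArn=ROLE_ARN,
    Description="Transition product assets to the Glacier storage class",
)
print("Created job:", response["JobId"])

A job created with ConfirmationRequired=True waits in the S3 console until it is confirmed, which gives a chance to review the manifest and operation before millions of objects are touched.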

Capital One
Capital One is a bank founded at the intersection of finance and technology and one of America’s most recognized brands. Capital One used Amazon S3 Batch Operations to copy data between two AWS Regions to increase the redundancy of its data and to standardize its data footprint across those two locations.
"With Amazon S3 Batch Operations we created a job to copy millions of objects in hours, work that had traditionally taken months to complete. We used Amazon S3’s inventory report, which gave a list of objects in our bucket, as the input to our Amazon S3 Batch Operations job. Amazon S3 was instrumental in copying the data, providing progress updates, and delivering an audit report when the job was complete. Having this feature saved our team weeks of manual effort and turned this large-scale data transfer into something routine. "
Franz Zemen, Vice President, Software Engineering - Capital One
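
The inventory-driven workflow in the quote maps onto the same create_job API. A hedged sketch, assuming hypothetical bucket names and an existing S3 Inventory configuration: the manifest points at the inventory's manifest.json instead of a hand-built CSV, the copy targets a bucket in another Region, and describe_job supplies the progress updates the quote mentions.

import boto3

ACCOUNT_ID = "111122223333"  # placeholder account ID

s3control = boto3.client("s3control", region_name="us-east-1")

response = s3control.create_job(
    AccountId=ACCOUNT_ID,
    ConfirmationRequired=False,
    Operation={
        # Copy every object listed in the inventory to a bucket in another Region.
        "S3PutObjectCopy": {
            "TargetResource": "arn:aws:s3:::example-replica-bucket-us-west-2",
            "MetadataDirective": "COPY",  # carry the source metadata over unchanged
        }
    },
    Manifest={
        # Use an S3 Inventory report directly as the job manifest.
        "Spec": {"Format": "S3InventoryReport_CSV_20161130"},
        "Location": {
            "ObjectArn": (
                "arn:aws:s3:::example-inventory/source-bucket/"
                "daily-inventory/2019-01-01T00-00Z/manifest.json"
            ),
            "ETag": "example-inventory-etag",
        },
    },
    Report={
        # Audit report delivered when the job completes.
        "Bucket": "arn:aws:s3:::example-reports",
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "cross-region-copy",
        "ReportScope": "AllTasks",
    },
    Priority=42,
    RoleArn=f"arn:aws:iam::{ACCOUNT_ID}:role/batch-operations-role",
    Description="Cross-region copy driven by an S3 Inventory report",
)

# Progress updates while the job runs: counts of succeeded and failed tasks.
job = s3control.describe_job(AccountId=ACCOUNT_ID, JobId=response["JobId"])["Job"]
print(job["Status"], job.get("ProgressSummary"))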

ePlus
ePlus, an AWS Advanced Consulting Partner, works with customers to optimize their IT environments and uses solutions like S3 Batch Operations to save clients time and money.
"S3 Batch Operations is simply amazing. Not only did it help one of our clients reduce the time, complexity, and painstaking chore of having to pull together a wide selection of S3 operations, scheduling jobs, then rendering the information in a simple to use dashboard, it also helped tackle some daunting use cases I don't think we would have been able to tackle in the fraction of the time it took S3 Batch Operations to do.
For example, S3 Batch Operations made quick work of copying over 2 million objects across regions within the same account while keeping the metadata intact. The solution worked seamlessly performing similar tasks across accounts, and most notably, generated a completion report that automatically sifted and separated successful versus failed operations amongst 400 million objects allowing for simpler handling of the failed operations in a single file."
David Lin, Senior Solutions Architect & AWS Certified Professional - ePlus
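
The completion report mentioned above arrives as CSV results files in the report bucket chosen at job creation. Below is a short sketch of pulling out the failed tasks for retry, assuming the documented row layout of bucket, key, version ID, and task status followed by error details; every bucket name and key here is hypothetical, so check the column order against your own report files.

import csv
import io

import boto3

s3 = boto3.client("s3")

# One results file from the completion report; large jobs may produce several.
body = s3.get_object(
    Bucket="example-reports",
    Key="batch-reports/job-0123/results/results-0.csv",
)["Body"].read()

failed = []
for row in csv.reader(io.StringIO(body.decode("utf-8"))):
    if len(row) < 4:
        continue  # skip blank or malformed rows
    bucket, key, version_id, task_status = row[:4]
    if task_status.lower() == "failed":
        failed.append((bucket, key, row[4:]))  # keep the error details for triage

print(f"{len(failed)} failed tasks")

# The failed rows can seed a retry: a fresh bucket,key manifest becomes the
# input to a follow-up Batch Operations job covering only those objects.
with open("retry-manifest.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for bucket, key, _ in failed:
        writer.writerow([bucket, key])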
