S3 - Batch Operations

 

Use S3 Batch Operations to perform large-scale operations on Amazon S3 objects. A batch job runs a single operation against every object in a list (the manifest) that you supply.

In this post, we will see how to copy data from one bucket to another using Batch Operations. The same steps can be used to replicate existing objects between buckets.

This is my primary bucket:



Using batch operations, we are going to copy the existing files to another bucket.

Let's start by creating a "Batch Job".

A batch job works by collecting information (metadata) about the existing objects, via the manifest, before triggering any actions.
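The manifest itself is a plain CSV file with one object per line: the bucket name and the object key (and optionally a version ID). As a minimal sketch, the following Python builds such a manifest locally; the bucket and key names are placeholders for illustration:

```python
import csv
import io

def build_manifest(bucket, keys):
    """Build an S3 Batch Operations CSV manifest: one 'bucket,key' row per object."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for key in keys:
        writer.writerow([bucket, key])
    return buf.getvalue()

# Placeholder bucket and keys for illustration.
manifest = build_manifest("my-primary-bucket", ["photos/cat.jpg", "docs/report.pdf"])
print(manifest)
```

Upload the resulting file to S3 and point the job at it (you can also use an S3 Inventory report as the manifest instead of a hand-built CSV).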

We point the job at the source bucket.

Next, we select the operation we want to perform: Copy.


Select the destination bucket to copy the objects to.



Review the job and save the completion report if needed. I usually save them to investigate job failures.
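The completion report is also a CSV, with one row per task. Assuming the documented report columns (Bucket, Key, VersionId, TaskStatus, ErrorCode, HTTPStatusCode, ResultMessage), a small sketch like this can pull out the failed tasks for investigation; the sample rows below are made up for illustration:

```python
import csv
import io

# Columns in an S3 Batch Operations completion report (per AWS documentation).
REPORT_FIELDS = ["Bucket", "Key", "VersionId", "TaskStatus",
                 "ErrorCode", "HTTPStatusCode", "ResultMessage"]

def failed_tasks(report_csv):
    """Return the report rows whose task did not succeed."""
    reader = csv.DictReader(io.StringIO(report_csv), fieldnames=REPORT_FIELDS)
    return [row for row in reader if row["TaskStatus"] != "succeeded"]

# Sample report content for illustration only.
sample = (
    "my-primary-bucket,photos/cat.jpg,,succeeded,,200,Successful\n"
    "my-primary-bucket,docs/report.pdf,,failed,PermanentFailure,403,Access Denied\n"
)
for row in failed_tasks(sample):
    print(row["Key"], row["ErrorCode"])
```

When configuring the report, you can choose to record all tasks or only the failed ones; failed-only keeps the report small for large jobs.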


Finally, create an IAM role that S3 Batch Operations will assume to perform the data copy.
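The role's trust policy must allow the service principal `batchoperations.s3.amazonaws.com` to assume it, and its permissions policy needs read access to the source bucket and manifest plus write access to the destination and report buckets. A minimal sketch of the permissions policy, with placeholder bucket names, might look like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:GetObjectVersion", "s3:GetObjectTagging"],
      "Resource": "arn:aws:s3:::my-primary-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:PutObjectTagging", "s3:PutObjectAcl"],
      "Resource": "arn:aws:s3:::my-destination-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::my-manifest-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::my-report-bucket/*"
    }
  ]
}
```

Trim this to your setup; for example, the manifest and report often live in the same bucket as the source or destination.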



Now the job is ready to run. Select the job and click "Run job".
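The same job can also be created outside the console with the S3 Control API. As a sketch, this is roughly what the boto3 `create_job` request for a copy job looks like; the account ID, ARNs, and ETag below are placeholders, and the actual AWS call is left commented out:

```python
# Sketch of creating the same copy job with boto3 (AWS SDK for Python).
# Account ID, ARNs, and ETag are placeholders for illustration.
job_request = {
    "AccountId": "111122223333",
    "ConfirmationRequired": True,  # job waits for an explicit "Run job"
    "Operation": {
        "S3PutObjectCopy": {"TargetResource": "arn:aws:s3:::my-destination-bucket"}
    },
    "Manifest": {
        "Spec": {
            "Format": "S3BatchOperations_CSV_20180820",
            "Fields": ["Bucket", "Key"],
        },
        "Location": {
            "ObjectArn": "arn:aws:s3:::my-manifest-bucket/manifest.csv",
            "ETag": "example-etag",
        },
    },
    "Report": {
        "Bucket": "arn:aws:s3:::my-report-bucket",
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "ReportScope": "FailedTasksOnly",
    },
    "Priority": 10,
    "RoleArn": "arn:aws:iam::111122223333:role/batch-copy-role",
}

# With credentials configured, the job would be created like this:
# import boto3
# s3control = boto3.client("s3control")
# response = s3control.create_job(**job_request)
# print(response["JobId"])
print(sorted(job_request))
```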

Our batch job has completed, and the objects have been copied to the destination bucket.





