S3 - Event Driven Action

 

In this blog, we will see how to use S3 with an event-driven architecture.

The flow is as follows:

1) An application running on an EC2 instance uploads an object under the prefix "raw_data" in an S3 bucket.

2) Once the object is uploaded, the S3 event is detected by an EventBridge rule.

3) The EventBridge rule is configured to listen for the "Object Created" event, with a Lambda function as the target.

4) The Lambda function reads the data from the "raw_data" folder and processes it.

5) The processed data is saved to a file under the folder "processed_data".

6) Both folders, "raw_data" and "processed_data", exist in the same bucket, "demo-event-application".

Here is the bucket.



The EC2 instance has an IAM instance profile attached so it can perform S3 actions. The role in this profile has full (admin) access to S3.
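As a rough sketch, a policy granting full S3 access (equivalent in effect to the AWS managed AmazonS3FullAccess policy) looks like this; in production you would scope the resource down to the specific bucket:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": "*"
    }
  ]
}
```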





Attaching the role to the EC2 instance.


Let's log in to the EC2 instance and verify the S3 actions.


It worked.

Here is the snippet to upload data to the S3 bucket.
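A minimal sketch of such an upload snippet, assuming Python with boto3 (preinstalled on most Amazon Linux AMIs) and a hypothetical local file name `orders.csv`; the bucket and prefix names are the ones used in this post:

```python
import os

BUCKET = "demo-event-application"   # bucket used in this post
RAW_PREFIX = "raw_data/"

def raw_key(filename):
    """Map a local file path to its S3 key under the raw_data/ prefix."""
    return RAW_PREFIX + os.path.basename(filename)

def upload(filename):
    # boto3 is imported lazily so the key-building helper above
    # can be exercised without AWS credentials.
    import boto3
    boto3.client("s3").upload_file(filename, BUCKET, raw_key(filename))

# Example (run on the EC2 instance):
# upload("orders.csv")  # -> s3://demo-event-application/raw_data/orders.csv
```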


This is the data that gets uploaded to the "raw_data" folder.

Let’s enable Amazon EventBridge for the bucket.

Use Amazon EventBridge to build event-driven applications at scale using S3 event notifications.

Let's create a Lambda function to process the files under "raw_data" and store the output in the same bucket under the folder "processed_data".
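A minimal sketch of what such a Lambda function might look like, assuming Python and a placeholder transformation (upper-casing the file contents); the event fields read here (`detail.bucket.name`, `detail.object.key`) follow the standard EventBridge "Object Created" payload for S3:

```python
import urllib.parse

RAW_PREFIX = "raw_data/"
PROCESSED_PREFIX = "processed_data/"

def processed_key(raw_key):
    """Map raw_data/<name> to processed_data/<name>."""
    return PROCESSED_PREFIX + raw_key[len(RAW_PREFIX):]

def handler(event, context):
    # EventBridge delivers the S3 event under "detail" (unlike the classic
    # S3 notification format, which nests it under "Records").
    bucket = event["detail"]["bucket"]["name"]
    key = urllib.parse.unquote_plus(event["detail"]["object"]["key"])

    import boto3  # lazy import keeps the pure helpers testable offline
    s3 = boto3.client("s3")
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

    # Placeholder "processing" step -- replace with real logic.
    processed = body.decode("utf-8").upper().encode("utf-8")

    s3.put_object(Bucket=bucket, Key=processed_key(key), Body=processed)
    return {"written": processed_key(key)}
```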


NOTE: Ensure the Lambda execution role has S3 permissions.

Let's create an EventBridge rule to drive the event-driven actions.
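For reference, an event pattern that matches "Object Created" events for this bucket might look like the following (the bucket name is from this post; the `key` prefix filter is an optional refinement so only uploads under "raw_data/" trigger the Lambda function):

```json
{
  "source": ["aws.s3"],
  "detail-type": ["Object Created"],
  "detail": {
    "bucket": { "name": ["demo-event-application"] },
    "object": { "key": [{ "prefix": "raw_data/" }] }
  }
}
```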







The rule has been created. Let's test it.


No files exist under the folders yet. Let's trigger the upload from the EC2 instance.


Now we can see that the data has been processed and saved under the "processed_data" folder.
