Posts

Agentic AI - Series 5

Hope you are all having fun with our Agentic AI series. In this blog, we will see how to design a MAS (Multi-Agent System) using CrewAI, a popular framework for building production-grade AI systems. We know that every agent needs a Role, a Goal, a Backstory, Tasks, and Tools. A MAS is nothing but completing a piece of work through the collaboration of one or more agents. Let's understand this with a real-life example. We are going to design an application that recommends food for diabetic patients based on the given ingredients. To accomplish the task, we need an agent called "Chef". Role - Experienced Chef. Goal - Prepare delicious food based on the given ingredients. Backstory - You are an experienced chef and received accolades for preparing ...
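As a rough sketch of how the Chef agent from this example might be wired up with CrewAI's Agent, Task, and Crew classes (the task description, ingredient list, and expected output below are illustrative, and an OpenAI API key is assumed to be set in the environment):

```python
from crewai import Agent, Task, Crew

# The "Chef" agent with the Role / Goal / Backstory discussed above.
chef = Agent(
    role="Experienced Chef",
    goal="Prepare delicious food based on the given ingredients",
    backstory=(
        "You are an experienced chef who has received accolades for "
        "preparing healthy, diabetic-friendly dishes."
    ),
)

# A Task describes the concrete work and the expected output.
recommend = Task(
    description="Recommend a diabetic-friendly dish using: {ingredients}",
    expected_output="A dish name with a short preparation summary",
    agent=chef,
)

# A Crew wires agents and tasks together; kickoff() runs the workflow.
crew = Crew(agents=[chef], tasks=[recommend])
result = crew.kickoff(inputs={"ingredients": "spinach, lentils, tomatoes"})
print(result)
```

This is agent configuration more than logic: the framework turns the role, goal, and backstory into the system prompt for the underlying LLM.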

Agentic AI - Series 4

In this blog, we will see how to create a simple agent using an AI framework like LangChain with a simple LLM model, gpt-4o-mini.

from langchain_openai import ChatOpenAI
from langchain_core.tools import tool
import os

# Creating a tool
@tool
def add_numbers(x: int, y: int) -> int:
    "Add two numbers"
    return x + y

@tool is a decorator to define a tool. This is a simple tool to add two numbers. The line below creates a LangChain LLM wrapper around the OpenAI chat model 'gpt-4o-mini' and binds the tool to it:

# Binding the tool to a model
llm = ChatOpenAI(model="gpt-4o-mini").bind_tools([add_numbers])
response = llm.invoke("What is 5 + 7?")

When we print response, is the expected outcome 12? NO...
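The model does not execute the tool itself; it replies with a tool call that our own code has to run. Below is a minimal, stdlib-only sketch of that dispatch step, using a plain dict shaped like one entry of LangChain's response.tool_calls list (the name and args fields); the TOOLS registry and dispatch helper are illustrative, not part of LangChain:

```python
# Mimics one entry of LangChain's `response.tool_calls` list: the model
# names the tool and supplies the arguments, but does not run it.
tool_call = {"name": "add_numbers", "args": {"x": 5, "y": 7}}

def add_numbers(x: int, y: int) -> int:
    "Add two numbers"
    return x + y

# Registry mapping tool names to the actual Python callables.
TOOLS = {"add_numbers": add_numbers}

def dispatch(call: dict) -> int:
    """Look up the requested tool and run it with the model's arguments."""
    return TOOLS[call["name"]](**call["args"])

print(dispatch(tool_call))  # -> 12
```

Only after this dispatch step (and, typically, sending the tool result back to the model) do we get the final answer of 12.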

S3 - Directory Bucket

Directory buckets organize data hierarchically into directories, as opposed to the flat storage structure of general purpose buckets. There are no prefix limits for directory buckets, and individual directories can scale horizontally. Directory buckets support bucket creation in the following bucket location types: Availability Zone or Local Zone. For low-latency use cases, you can create a directory bucket in a single Availability Zone to store data. Amazon S3 Express One Zone is a high-performance, single-zone Amazon S3 storage class purpose-built to deliver consistent, single-digit-millisecond data access for your most latency-sensitive applications. S3 Express One Zone is the lowest-latency cloud object storage class available today, with data access speeds up to 10x faster and request costs 50 percent lower than S3 Standard. With S3 Express One Zone, your data is redundantly stored on multiple devices within a single Availability Zone. You can ac...
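As a rough configuration sketch, creating a directory bucket from the AWS CLI might look like the following; note that directory bucket names must embed the Availability Zone ID as a suffix, and the bucket name and zone ID use1-az4 here are just placeholders:

```shell
# Directory bucket names follow the pattern: <base-name>--<zone-id>--x-s3
aws s3api create-bucket \
  --bucket my-express-bucket--use1-az4--x-s3 \
  --create-bucket-configuration \
    'Location={Type=AvailabilityZone,Name=use1-az4},Bucket={DataRedundancy=SingleAvailabilityZone,Type=Directory}'
```

The Location block pins the bucket to a single zone, which is what gives S3 Express One Zone its latency profile.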

S3 - Batch Operations

Use S3 Batch Operations to perform large-scale batch operations on Amazon S3 objects. Batch Operations can run a single operation on a list of Amazon S3 objects that you specify in a manifest. In this post, we will see how to copy data from one bucket to another using Batch Operations. The same steps can be used to replicate existing objects from one bucket to another. This is my primary bucket: Using Batch Operations, we are going to copy the existing files to another bucket. Let's start by creating a "Batch Job". Batch Operations work by collecting information (metadata) about existing objects before triggering any actions. We point the job at the source bucket. Next, we select the operation to perform: Copy. Select the destination bucket to copy the objects to. Review the job and save the completion report if needed; I usually save them to investigate job failures. Finally, create an IAM role to perform the data copy. Now, the batch is ready to ...
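The same copy job can be created from the AWS CLI with create-job; this is a sketch only, and every account ID, bucket name, ARN, and ETag below is a placeholder you would replace with your own values:

```shell
# Create an S3 Batch Operations copy job (all identifiers are placeholders).
aws s3control create-job \
  --account-id 111122223333 \
  --operation '{"S3PutObjectCopy":{"TargetResource":"arn:aws:s3:::destination-bucket"}}' \
  --manifest '{"Spec":{"Format":"S3BatchOperations_CSV_20180820","Fields":["Bucket","Key"]},
               "Location":{"ObjectArn":"arn:aws:s3:::source-bucket/manifest.csv","ETag":"example-etag"}}' \
  --report '{"Bucket":"arn:aws:s3:::report-bucket","Format":"Report_CSV_20180820",
             "Enabled":true,"ReportScope":"AllTasks"}' \
  --priority 10 \
  --role-arn arn:aws:iam::111122223333:role/batch-copy-role
```

The --report block corresponds to the completion report mentioned above, and --role-arn is the IAM role the job assumes to perform the copy.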

Agentic AI - Series 3

In this blog, we will see how to build a simple agent without using any LLM model. The simple flow of an agent: the agent takes input -> based on the input, it decides which tool to use -> it performs the operation and sends the output back to the user. An AI agent framework is a set of tools, libraries, and structures that simplifies building, deploying, and managing autonomous AI agents. But here we are going to build an agent without using any framework or LLMs. The agent's functionality is to perform "Addition" and "Subtraction". The agent uses two Python functions (tools) to perform the addition and subtraction operations. Finally, it sends the output to the user. We are going to implement the above-discussed functionality vi...
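The flow described above can be sketched in plain Python: two functions act as the tools, and a simple keyword check stands in for the "decide which tool to use" step (the keyword-based routing here is an illustrative stand-in for what an LLM would normally decide):

```python
def add(x, y):
    "Tool 1: addition"
    return x + y

def subtract(x, y):
    "Tool 2: subtraction"
    return x - y

def agent(user_input: str):
    """Take input -> pick a tool from keywords -> run it -> return the output."""
    words = user_input.lower().split()
    numbers = [int(w) for w in words if w.lstrip("-").isdigit()]
    if "add" in words or "addition" in words:
        return add(*numbers)
    if "subtract" in words or "subtraction" in words:
        return subtract(*numbers)
    return "Sorry, I only know addition and subtraction."

print(agent("add 5 7"))        # -> 12
print(agent("subtract 10 4"))  # -> 6
```

No framework, no model: the "reasoning" is just an if/else, which is exactly why real agents swap this routing step for an LLM.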

Agentic AI - Series 2

Now we know the brain behind agents is the LLM model. So, how do I access and use these models? That's where we rely on organizations like OpenAI, Google, Meta, Microsoft, Hugging Face, Nvidia, Groq, and others who primarily build LLM models that serve various purposes like text generation, image recognition, and more. I am sure everyone has been using ChatGPT, a service provided by OpenAI for interactive, chat-based conversation for our daily activities, from asking a riddle to solving a math problem and other tasks. This is my simple interaction with ChatGPT asking for the "Weather in California?" Let's imagine building an agent which needs to perform the same action as above; then it must be done programmatically. If it is done programmatically, you need API credentials. I have generated OpenAI API credentials via API keys - OpenAI API Let's ask the same question to ChatGPT programmatically: from openai import OpenAI from dotenv import loa...

S3 - Event Driven Action

In this blog, we will see how to use S3 with an event-driven architecture. The flow is as below: 1) An application running on an EC2 instance uploads an object to a prefix called "raw_data" in an S3 bucket. 2) Once the object is uploaded, the S3 event is detected by an EventBridge rule. 3) The EventBridge rule is configured to listen for the "S3:ObjectCreate" event with Lambda as the destination. 4) Lambda is configured to read data from the "raw_data" folder and process it. 5) The processed data is saved into a file under the folder "processed_data". 6) Both folders, "raw_data" and "processed_data", exist in the same bucket "demo-event-application". Here is the bucket. The EC2 instance has an IAM instance profile attached to perform S3 actions. This profile/role has admin-level S3 access. Attaching the role to the EC2 instance. Let's log in to the EC2 instance and verify the S3 actions. It worked. Here is the snippe...
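Steps 4 and 5 can be sketched as a Lambda handler. This is a minimal sketch, assuming the bucket/object layout of EventBridge "Object Created" events; the destination_key helper and the transform step are hypothetical names for this walkthrough, and the actual S3 reads/writes (which would use boto3) are left as comments:

```python
RAW_PREFIX = "raw_data/"
PROCESSED_PREFIX = "processed_data/"

def destination_key(source_key: str) -> str:
    """Map raw_data/<name> to processed_data/<name> in the same bucket."""
    if not source_key.startswith(RAW_PREFIX):
        raise ValueError(f"unexpected key: {source_key}")
    return PROCESSED_PREFIX + source_key[len(RAW_PREFIX):]

def lambda_handler(event, context):
    # EventBridge "Object Created" events carry bucket/object info in `detail`.
    bucket = event["detail"]["bucket"]["name"]
    key = event["detail"]["object"]["key"]
    out_key = destination_key(key)
    # Real processing would use boto3 here, e.g.:
    #   s3 = boto3.client("s3")
    #   body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    #   s3.put_object(Bucket=bucket, Key=out_key, Body=transform(body))
    return {"bucket": bucket, "processed_key": out_key}
```

Because both prefixes live in the same bucket, the handler only rewrites the key; make sure the EventBridge rule filters on the "raw_data/" prefix, or the Lambda's own writes to "processed_data/" could retrigger it.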