Sunday, April 6, 2025

AI agent using an LLM (OpenAI) model - Python agent

A customer wants to check the status of their food delivery order.

# pip install openai
from openai import OpenAI

class AI_Agent:
    def __init__(self, model):
        self.model = model

    def collect_input(self, user_input):
        print(f"Customer: {user_input}")
        return user_input

    def send_query_to_model(self, user_input):
        query = f"The customer wants to know: {user_input}"
        return self.model.process_query(query)

    def execute_action(self, model_response):
        print(f"AI Response: {model_response}")


class OpenAI_Model:
    def __init__(self, api_key):
        # The client object replaces the module-level openai.api_key
        # setting used by openai<1.0
        self.client = OpenAI(api_key=api_key)

    def process_query(self, query):
        try:
            # Use OpenAI's chat completions API to process the query
            response = self.client.chat.completions.create(
                model="gpt-3.5-turbo",  # use the specific model you want
                messages=[
                    {"role": "system", "content": "You are a helpful assistant."},
                    {"role": "user", "content": query},
                ],
            )
            return response.choices[0].message.content
        except Exception as e:
            return f"Error: {e}"


# Replace 'your_openai_api_key' with your actual OpenAI API key
api_key = "your_openai_api_key"
openai_model = OpenAI_Model(api_key)
ai_agent = AI_Agent(openai_model)

# Simulate customer interaction
user_input = "Where's my order?"
input_collected = ai_agent.collect_input(user_input)
response = ai_agent.send_query_to_model(input_collected)
ai_agent.execute_action(response)

Explanation:
OpenAI's GPT Model: The OpenAI_Model class wraps OpenAI's API and handles query processing.

Agent Query: The agent formulates the user's query and sends it to the GPT model via OpenAI's API.
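Because AI_Agent only depends on the model object exposing a process_query method, you can exercise the agent's flow without an API key or network call by swapping in a stub. The FakeModel below is a hypothetical stand-in for illustration, not part of the OpenAI library:

```python
class FakeModel:
    """Hypothetical stand-in for OpenAI_Model: no API key or network needed."""
    def process_query(self, query):
        # Return a canned reply so the agent's flow can be tested offline
        return f"Canned reply to: {query}"

# Any object with a process_query method works with AI_Agent
fake = FakeModel()
print(fake.process_query("The customer wants to know: Where's my order?"))
```

Passing FakeModel() to AI_Agent in place of OpenAI_Model lets you test collect_input, send_query_to_model, and execute_action end to end for free.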

API Key: Replace "your_openai_api_key" with your actual API key to run the code.
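Rather than hard-coding the key in source, a common pattern is to read it from an environment variable. A minimal sketch (OPENAI_API_KEY is also the variable the openai library checks by default; the helper name is mine):

```python
import os

def load_api_key(env_var="OPENAI_API_KEY"):
    """Return the API key from the environment, or fail with a clear error."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set the {env_var} environment variable first")
    return key
```

You can then construct the model with OpenAI_Model(load_api_key()), keeping the secret out of version control.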
