Hi Prasad S,
You can achieve this without Retrieval-Augmented Generation (RAG) by reading the file's content directly, sending it as part of the prompt to Azure OpenAI, and getting a response back without storing any history. Below is a Python script that does this programmatically with the Azure OpenAI API.
import openai

# Azure OpenAI API configuration
AZURE_OPENAI_ENDPOINT = "Endpoint"
AZURE_OPENAI_KEY = "Key"
DEPLOYMENT_NAME = "Name of your deployment"

# Initialize the Azure OpenAI client
client = openai.AzureOpenAI(
    azure_endpoint=AZURE_OPENAI_ENDPOINT,
    api_key=AZURE_OPENAI_KEY,
    api_version="2024-02-01"  # Use a valid API version
)

def read_txt_file(file_path):
    """Reads a text file and returns its content."""
    with open(file_path, "r", encoding="utf-8") as f:
        return f.read()

def ask_openai(question, file_content):
    """Sends the file content and a question to Azure OpenAI and returns the response."""
    messages = [
        {"role": "system", "content": "You are an AI that answers questions based on provided file content."},
        {"role": "user", "content": f"File content:\n{file_content}\n\nNow, answer this question:\n{question}"}
    ]
    response = client.chat.completions.create(
        model=DEPLOYMENT_NAME,  # For Azure OpenAI, pass your deployment name as the model
        messages=messages,
        temperature=0.5,
        max_tokens=500
    )
    return response.choices[0].message.content  # Text of the first returned choice

# Example usage
file_path = "cric.txt"  # Your file name
file_content = read_txt_file(file_path)
question = "What is the main topic discussed in the file?"
answer = ask_openai(question, file_content)
print("Answer:", answer)
I hope this information helps.