How can I call the Azure OpenAI API with my own data so it filters the results?

Koteshwara R
2024-07-31T10:48:56.02+00:00

Below is my Ruby code for the AI call. I am able to get answers to general questions, but I want to call Azure OpenAI with our own application data so it can filter the results.
For example:

I have 10,000 employee records and I want to find the highest employee salary. Could you please help me figure out how to solve this?

require 'net/http'
require 'uri'
require 'json'

question = 'Give me a list of AI technologies.'
url = 'https://test.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2024-02-15-preview'
uri = URI.parse(url)
request = Net::HTTP::Post.new(uri)
request.content_type = 'application/json'
request['api-key'] = 'ZDDEEE89999' # placeholder; keep real keys out of source code
request.body = {
  messages: [
    {
      # The end-user question belongs in a 'user' message;
      # 'system' is for instructions that set the assistant's behavior.
      role: 'user',
      content: question
    }
  ],
  max_tokens: 800,
  temperature: 0.7,
  frequency_penalty: 0,
  presence_penalty: 0,
  top_p: 0.95,
  stop: nil
}.to_json
response = Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == 'https') do |http|
  http.request(request)
end
puts response.body
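Once the response comes back, the JSON body has to be parsed to get at the assistant's reply. A minimal sketch of that step, using a hard-coded sample body in the shape the chat completions endpoint returns (abbreviated here for illustration):

```ruby
require 'json'

# Sample response body in the chat-completions shape; a real response
# contains additional fields (id, usage, finish_reason, etc.).
sample_body = {
  choices: [
    { message: { role: 'assistant', content: 'Here is a list of AI technologies...' } }
  ]
}.to_json

# dig walks the nested structure and returns nil if any key is missing.
reply = JSON.parse(sample_body).dig('choices', 0, 'message', 'content')
puts reply
```

In the real call, `sample_body` would be replaced with `response.body`.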

Azure OpenAI Service
An Azure service that provides access to OpenAI’s GPT-3 models with enterprise capabilities.

Accepted answer
  1. YutongTie-MSFT, Moderator
    2024-07-31T21:43:05.9733333+00:00

    Hello,

    Thanks for reaching out to us. To filter results using your own data while making a call to Azure OpenAI, you'll need to incorporate the data into your API call in a meaningful way. Azure OpenAI's GPT models are not designed to directly access external databases or files, so you need to include the data as part of your prompt or context.

    Incorporate Data in Prompt: For small datasets, you can include relevant portions of the data directly in the prompt. However, including large datasets like 10,000 employee records directly in the prompt might not be practical due to token limits.

    Preprocess Data: Preprocess the data on your server or application side to derive the necessary information (e.g., highest salary) before sending a prompt to the AI model. This involves querying your database or processing your dataset to get the results.

    Interactive Data Handling: If you need the AI model to interact with the data dynamically, you can build a system where you preprocess the data, then use AI to analyze or interpret the results.
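For the salary example, the preprocessing step can be as simple as a max_by over the records (or a SQL MAX(salary) query against your database). The employee data below is made up purely for illustration:

```ruby
# Hypothetical employee records; in practice these would come from your database.
employees = [
  { name: 'Asha',  salary: 62_000 },
  { name: 'Ravi',  salary: 75_000 },
  { name: 'Meena', salary: 58_000 }
]

# Derive the answer locally instead of sending all records to the model.
top = employees.max_by { |e| e[:salary] }
highest_salary = top[:salary]
puts "Highest salary: #{highest_salary} (#{top[:name]})"
```

The resulting `highest_salary` value is what gets interpolated into the prompt, instead of shipping all 10,000 records to the API.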

    Example Approach

    Here’s a Ruby script that takes the highest salary (computed beforehand from your employee records) and asks Azure OpenAI to generate a summary or interpretation of the result:

    Preprocess Data: First, preprocess the data to find the highest salary.

    require 'net/http'
    require 'uri'
    require 'json'

    highest_salary = 75000 # example value; compute this from your records first
    # Define your question
    question = "The highest salary among our employees is $#{highest_salary}. What are some insights or next steps we can take based on this information?"
    # Azure OpenAI API details
    url = 'https://test.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2024-02-15-preview'
    uri = URI.parse(url)
    request = Net::HTTP::Post.new(uri)
    request.content_type = 'application/json'
    request['api-key'] = 'ZDDEEE89999' # placeholder API key
    request.body = {
      messages: [
        {
          role: 'user', # send the question as user content
          content: question
        }
      ],
      max_tokens: 800,
      temperature: 0.7,
      frequency_penalty: 0,
      presence_penalty: 0,
      top_p: 0.95
    }.to_json
    response = Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == 'https') do |http|
      http.request(request)
    end
    puts response.body
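If you do decide to embed a subset of records directly in the prompt, the token limits mentioned above become the constraint. A rough sketch of budgeting the data portion of a prompt, using the common four-characters-per-token heuristic (only an approximation, not a real tokenizer):

```ruby
# Rough heuristic: ~4 characters per token for English text.
# For exact counts you would need a real tokenizer.
def approx_tokens(text)
  (text.length / 4.0).ceil
end

# Made-up records for illustration.
records = Array.new(50) { |i| "Employee #{i}: salary #{50_000 + i * 100}" }

# Greedily include records until an assumed token budget is used up.
budget = 100 # tokens allowed for the data portion of the prompt
included = []
used = 0
records.each do |line|
  cost = approx_tokens(line)
  break if used + cost > budget
  included << line
  used += cost
end

puts "Included #{included.size} of #{records.size} records (~#{used} tokens)"
```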

    I hope this helps.

    Regards,

    Yutong

    Please accept the answer if you found it helpful, to support the community. Thanks a lot.

