I am getting very strange completions for a prompt to generate a short story on a topic. Why does the model return logs, Python code samples, or random chat history?

Dmitry Kholodkov 0 Reputation points Microsoft Employee
2023-09-08T04:45:23.9233333+00:00
This prompt worked well with the older version of the model.

Input: write a scary horror short story about hoar frost? Reply with story text only.
  *****End of Prompt

Chatbot:  (300 word limit).", "Output: Hoar Frost Disguise.", "Output: You arrive home late one winter afternoon, and the world around you is suddenly covered in the most beautiful hoar frost you have ever seen. It shimmers and glistens in the low sun. You feel cheered for the fi...


Solution

```python
import openai

# Set up the OpenAI client.
secret = "YOUR SECRET KEY HERE"
model_name = "text-davinci-002"  # Any completions model works here; text-davinci-002 follows the context better.
openai.api_key = secret
prompt = 'Input: write a scary horror short story about hoar frost? \nReply with story text only. (300 word limit).\nOutput:'

def get_generated_story(prompt, max_tokens):
  response = openai.Completion.create(
    engine=model_name,
    prompt=prompt,
    max_tokens=max_tokens,
    n=1,
    stop=None,
    temperature=0.5,
  )
  story = response.choices[0].text
  return story

def run_on_bot(prompt, max_tokens):
  try:
    story = get_generated_story(prompt, max_tokens)
    if story.strip():
      print("GPT3 generated the following story: ", story)
      return story
    print("We couldn't generate any story for you.")
    return ""
  except Exception as e:
    print(e)
    return ""

bot_story = run_on_bot(prompt, 300)
```

Creating a bot using the GPT-3 API

OpenAI's GPT-3 is a cost-effective, prebuilt conversational AI tool that can add natural language processing to your application with a minimal amount of design and structure. When it comes to automation, chatbots are arguably the lowest-hanging fruit: they are easy, fast, and cheap to produce, since they don't require custom design or user interface (UI) development.

The first step in creating a bot is to determine the types of chats you'd like it to handle. Based on that objective, we can create a tailored chatbot that responds according to the nature of the input and the information necessary for a response. To integrate the GPT-3 API with the bot, we first create credentials on OpenAI and then write the API code. Once the API code is ready, we can implement a user interface for the chatbot. (I demonstrated the API code above; to create a bot, we add some basic responses for different types of inputs.)

Note: API credentials are hidden for security reasons.

Input:

"What are the services you provide?"

Output:

"We offer a range of IT services including legacy application migration, data management, cloud computing, security compliance, DevOps, and more."

Input:

"How long does it take to develop an app?"

Output:

"The time to develop an app depends on its complexity and feature requirements. Our developers work diligently to create quality apps in the least possible time and within budget"

Input:

"Can I change the requirements in the middle of a project?"

Output:

"Yes, you can change the requirements at any point in the project. Our team is flexible and will work with you to ensure that these changes do not affect the project's timeline"

Above are a few responses that I tailored for the bot.

```python
# API integration and bot creation with customized responses

import openai
import time

secret = "YOUR SECRET KEY HERE"
model_name = "text-davinci-002"
openai.api_key = secret
prompt = 'Input: '

def get_response(prompt, message):
  time.sleep(1)  # simple rate limiting between calls
  result = ""
  prompt += message + "\nOutput:"
  try:
    response = openai.Completion.create(
      engine=model_name,
      prompt=prompt,
      max_tokens=1024,
      n=1,
      stop=None,
      temperature=0.5,
    )
    result = response.choices[0].text.strip()
  except Exception as e:
    print(e)
  return result

def gpt_bot():
  print("Hi there! I'm your virtual assistant and I'm here to help you get information about our company, our services, and anything tech-related. Ask me anything and I'll do my best to answer you!")

  while True:
    user_input = input("You: ")
    if "bye" in user_input.lower():
      print("Bot: Bye! Have a great day")
      break

    result = ""
    if user_input.strip():
      result = get_response(prompt, user_input)

    # Canned responses for a few known question patterns; otherwise fall back to GPT-3.
    if "what" in user_input.lower() and "do you provide" in user_input.lower():
      print("\nBot: We offer a range of IT services including legacy application migration, data management, cloud computing, security compliance, DevOps, and more.\n")
    elif "app" in user_input.lower() and "develop" in user_input.lower():
      print("\nBot: The time to develop an app depends on its complexity and feature requirements. Our developers work diligently to create quality apps in the least possible time and within budget.\n")
    elif all(a in user_input.lower() for a in ["change", "project"]):
      print("\nBot: Yes, you can change the requirements at any point in the project. Our team is flexible and will work with you to ensure that these changes do not affect the project's timeline.\n")
    elif result.strip():
      print("Bot:", result)

gpt_bot()
```

Conclusion

In this post, we learned about the use cases of NLP and how we can use it to improve business decisions, operations, and customer satisfaction. We worked with different tools and APIs (Stanford's Named Entity Recognizer, TextBlob, and the OpenAI API) in Python to understand and analyze text data, and demonstrated how to integrate NLP tools into different business cases to build bots, extract insights from feedback, and generate contextual responses for better customer interactions.

If you have any questions, leave a comment! Thanks for reading.




Azure OpenAI Service
An Azure service that provides access to OpenAI’s GPT-3 models with enterprise capabilities.

1 answer
  1. Saurabh Sharma 23,816 Reputation points Microsoft Employee
    2023-09-14T04:33:05.02+00:00

    @Dmitry Kholodkov As per internal discussions, this erratic behavior was due to using regular prompts with the GPT-35-Turbo model. We started getting correct results once the prompts were changed to the Chat Completions format, as described in the documentation.

    C# sample is available at: https://learn.microsoft.com/en-us/azure/ai-services/openai/chatgpt-quickstart?tabs=command-line&pivots=programming-language-csharp
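For comparison, here is a minimal Python sketch of the Chat Completions request format that replaces the old single-string "Input:/Output:" completion prompt. This is an illustration only, assuming the `openai` v0.x Python SDK configured for Azure; the endpoint, deployment name, and API version below are placeholders, not values from this thread.

```python
def build_chat_messages(user_request):
    """Convert a plain instruction into the role-based message list
    that GPT-35-Turbo expects, instead of a raw completion prompt."""
    return [
        {"role": "system",
         "content": "You are a creative writing assistant. Reply with story text only."},
        {"role": "user", "content": user_request},
    ]

def get_story(user_request):
    # Assumes the openai v0.x SDK; endpoint/deployment/key are placeholders.
    import openai
    openai.api_type = "azure"
    openai.api_base = "https://YOUR-RESOURCE-NAME.openai.azure.com/"
    openai.api_version = "2023-05-15"
    openai.api_key = "YOUR SECRET KEY HERE"
    response = openai.ChatCompletion.create(
        engine="YOUR-GPT-35-TURBO-DEPLOYMENT",  # deployment name, not model name
        messages=build_chat_messages(user_request),
        max_tokens=300,
        temperature=0.5,
    )
    return response.choices[0].message["content"]
```

With this format, the model no longer treats the prompt as free-running text to continue, which is what produced the random blog-like continuations above.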

    However, you have faced another problem, which we are currently looking into. I will update this thread once we figure out the root cause and the fix.

    Thanks

    Saurabh

