Use Azure OpenAI in Fabric with the Python SDK and SynapseML (preview)

Important

This feature is in preview.

This article shows examples of how to use Azure OpenAI in Fabric with the OpenAI Python SDK and with SynapseML.

Prerequisites

The OpenAI Python SDK isn't installed in the default runtime; you need to install it first.

%pip install openai==0.28.1
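
After the installation completes, you can optionally confirm that the pinned version is the one active in your notebook session. This is a quick sanity check, not a required step:

# Optional sanity check: confirm the pinned SDK version is in use.
import importlib.metadata
print(importlib.metadata.version("openai"))  # expected: 0.28.1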

Chat

ChatGPT and GPT-4 are language models optimized for conversational interfaces. The example shown here demonstrates a simple chat completion operation and isn't intended to serve as a tutorial.

import openai

response = openai.ChatCompletion.create(
    deployment_id='gpt-35-turbo-0125', # deployment_id could be one of {gpt-35-turbo-0125 or gpt-4-32k}
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Knock knock."},
        {"role": "assistant", "content": "Who's there?"},
        {"role": "user", "content": "Orange."},
    ],
    temperature=0,
)

print(f"{response.choices[0].message.role}: {response.choices[0].message.content}")

Output

    assistant: Orange who?

We can also stream the response:

response = openai.ChatCompletion.create(
    deployment_id='gpt-35-turbo-0125', # deployment_id could be one of {gpt-35-turbo-0125 or gpt-4-32k}
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Knock knock."},
        {"role": "assistant", "content": "Who's there?"},
        {"role": "user", "content": "Orange."},
    ],
    temperature=0,
    stream=True
)

for chunk in response:
    delta = chunk.choices[0].delta

    if "role" in delta.keys():
        print(delta.role + ": ", end="", flush=True)
    if "content" in delta.keys():
        print(delta.content, end="", flush=True)

Output

    assistant: Orange who?

Embeddings

An embedding is a special format of data representation that machine learning models and algorithms can easily use. It contains the information-rich semantic meaning of a text, represented by a vector of floating-point numbers. The distance between two embeddings in the vector space is related to the semantic similarity between the two original inputs. For example, if two texts are similar, their vector representations should also be similar.

The example shown here demonstrates how to obtain embeddings and isn't intended to serve as a tutorial.

deployment_id = "text-embedding-ada-002" # set deployment_name as text-embedding-ada-002
embeddings = openai.Embedding.create(deployment_id=deployment_id,
                                     input="The food was delicious and the waiter...")
                                
print(embeddings)

Output

    {
      "object": "list",
      "data": [
        {
          "object": "embedding",
          "index": 0,
          "embedding": [
            0.002306425478309393,
            -0.009327292442321777,
            0.015797346830368042,
            ...
            0.014552861452102661,
            0.010463837534189224,
            -0.015327490866184235,
            -0.01937841810286045,
            -0.0028842221945524216
          ]
        }
      ],
      "model": "ada",
      "usage": {
        "prompt_tokens": 8,
        "total_tokens": 8
      }
    }
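
To illustrate the semantic-similarity property described above, the following minimal sketch compares the embeddings of a few sentences with cosine similarity. It reuses the same text-embedding-ada-002 deployment and the openai==0.28.1 API shown earlier; the helper functions and sample sentences are illustrative assumptions, not part of the service API.

import numpy as np
import openai

def get_embedding(text, deployment_id="text-embedding-ada-002"):
    # Illustrative helper: request a single embedding vector for the given text.
    response = openai.Embedding.create(deployment_id=deployment_id, input=text)
    return np.array(response["data"][0]["embedding"])

def cosine_similarity(a, b):
    # Cosine similarity: values closer to 1.0 indicate more semantically similar texts.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Illustrative sentences: the first two are semantically close, the third is unrelated.
e1 = get_embedding("The food was delicious and the waiter was friendly.")
e2 = get_embedding("The meal tasted great and the service was excellent.")
e3 = get_embedding("Quarterly revenue increased by three percent.")

print(cosine_similarity(e1, e2))  # expected to be relatively high
print(cosine_similarity(e1, e3))  # expected to be lower

Because all vectors from the same embedding model have comparable geometry, you can use these scores to rank candidate texts by relevance to a query.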