Important
This feature is in preview.
This document shows examples of how to use Azure OpenAI in Fabric with the REST API.
Initialization
from synapse.ml.mlflow import get_mlflow_env_config
from trident_token_library_wrapper import PyTridentTokenLibrary

# Read the workspace and artifact configuration for the current Fabric notebook session.
mlflow_env_configs = get_mlflow_env_config()

# Acquire a workload (MWC) token and build the Authorization header used by every request below.
mwc_token = PyTridentTokenLibrary.get_mwc_token(mlflow_env_configs.workspace_id, mlflow_env_configs.artifact_id, 2)
auth_headers = {
    "Authorization": "MwcToken {}".format(mwc_token)
}
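Both examples below build the request URL from mlflow_env_configs.workload_endpoint using the same cognitive/openai/openai/deployments/<deployment_name>/<operation>?api-version=... pattern. A small helper like the following (hypothetical, not part of any Fabric library) makes that pattern explicit; the examples keep the inline form:

# Hypothetical convenience helper: builds the Fabric-hosted Azure OpenAI URL for a
# given deployment and operation. The examples below construct this URL inline instead.
def build_openai_url(deployment_name, operation, api_version="2025-04-01-preview"):
    return (
        mlflow_env_configs.workload_endpoint
        + f"cognitive/openai/openai/deployments/{deployment_name}/{operation}"
        + f"?api-version={api_version}"
    )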
Chat
GPT-4o and GPT-4o-mini are language models optimized for conversational interfaces.
import requests

# Pretty-print the conversation that was sent and the model's reply (or the raw error body).
def print_chat_result(messages, response_code, response):
    print("==========================================================================================")
    print("| OpenAI Input |")
    for msg in messages:
        if msg["role"] == "system":
            print("[System] ", msg["content"])
        elif msg["role"] == "user":
            print("Q: ", msg["content"])
        else:
            print("A: ", msg["content"])
    print("------------------------------------------------------------------------------------------")
    print("| Response Status |", response_code)
    print("------------------------------------------------------------------------------------------")
    print("| OpenAI Output |")
    if response_code == 200:
        print(response.json()["choices"][0]["message"]["content"])
    else:
        print(response.content)
    print("==========================================================================================")
deployment_name = "gpt-4o"  # deployment_name can be one of {gpt-4o, gpt-4o-mini}
openai_url = mlflow_env_configs.workload_endpoint + f"cognitive/openai/openai/deployments/{deployment_name}/chat/completions?api-version=2025-04-01-preview"

payload = {
    "messages": [
        {"role": "system", "content": "You are an AI assistant that helps people find information."},
        {"role": "user", "content": "Does Azure OpenAI support customer managed keys?"}
    ]
}

response = requests.post(openai_url, headers=auth_headers, json=payload)
print_chat_result(payload["messages"], response.status_code, response)
Output
==========================================================================================
| OpenAI Input |
[System] You are an AI assistant that helps people find information.
Q: Does Azure OpenAI support customer managed keys?
------------------------------------------------------------------------------------------
| Response Status | 200
------------------------------------------------------------------------------------------
| OpenAI Output |
As of my last training cut-off in October 2023, Azure OpenAI Service did not natively support customer-managed keys (CMK) for encryption of data at rest. Data within Azure OpenAI is typically encrypted using Microsoft-managed keys.
However, you should verify this information on the official Azure documentation or by contacting Microsoft support, as cloud service features and capabilities are frequently updated.
==========================================================================================
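A follow-up turn reuses the same endpoint: append the assistant's reply and a new user message to the conversation, then post again. A minimal sketch, assuming the previous request returned HTTP 200 and that response, payload, openai_url, and auth_headers are still in scope:

# Continue the conversation (sketch only, no error handling): append the assistant's
# reply and a follow-up question, then call the same chat completions endpoint again.
assistant_reply = response.json()["choices"][0]["message"]["content"]
followup_messages = payload["messages"] + [
    {"role": "assistant", "content": assistant_reply},
    {"role": "user", "content": "Where in the Azure documentation can I verify this?"},
]
followup_payload = {"messages": followup_messages}
followup_response = requests.post(openai_url, headers=auth_headers, json=followup_payload)
print_chat_result(followup_messages, followup_response.status_code, followup_response)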
Embeddings
An embedding is a special data representation format that machine learning models and algorithms can easily consume. It captures the information-rich semantic meaning of a text, represented as a vector of floating-point numbers. The distance between two embeddings in vector space is related to the semantic similarity of the two original inputs. For example, if two texts are similar, their vector representations should also be similar.
To access the Azure OpenAI embeddings endpoint in Fabric, send an API request in the following format:
POST <url_prefix>/openai/deployments/<deployment_name>/embeddings?api-version=2025-04-01-preview
deployment_name can be text-embedding-ada-002.
import requests

# Pretty-print the input prompts and the first ten components of each returned embedding vector.
def print_embedding_result(prompts, response_code, response):
    print("==========================================================================================")
    print("| OpenAI Input |\n\t" + "\n\t".join(prompts))
    print("------------------------------------------------------------------------------------------")
    print("| Response Status |", response_code)
    print("------------------------------------------------------------------------------------------")
    print("| OpenAI Output |")
    if response_code == 200:
        for data in response.json()['data']:
            print("\t[" + ", ".join([f"{n:.8f}" for n in data["embedding"][:10]]) + ", ... ]")
    else:
        print(response.content)
    print("==========================================================================================")

deployment_name = "text-embedding-ada-002"
openai_url = mlflow_env_configs.workload_endpoint + f"cognitive/openai/openai/deployments/{deployment_name}/embeddings?api-version=2025-04-01-preview"

payload = {
    "input": [
        "empty prompt, need to fill in the content before the request",
        "Once upon a time"
    ]
}

response = requests.post(openai_url, headers=auth_headers, json=payload)
print_embedding_result(payload["input"], response.status_code, response)
Output:
==========================================================================================
| OpenAI Input |
empty prompt, need to fill in the content before the request
Once upon a time
------------------------------------------------------------------------------------------
| Response Status | 200
------------------------------------------------------------------------------------------
| OpenAI Output |
[-0.00258819, -0.00449802, -0.01700411, 0.00405622, -0.03064079, 0.01899395, -0.01295485, -0.01426286, -0.03512142, -0.01831212, ... ]
[0.02129045, -0.02013996, -0.00462094, -0.01146069, -0.01123944, 0.00199124, 0.00228992, -0.01370478, 0.00855917, -0.01470356, ... ]
==========================================================================================
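The distance relationship described above can be checked directly on the returned vectors. A minimal sketch computing the cosine similarity between the two embeddings, assuming the embeddings request above returned HTTP 200:

import math

# Cosine similarity between the two returned embeddings (sketch only; assumes the
# embeddings request succeeded and returned one vector per input string).
vectors = [item["embedding"] for item in response.json()["data"]]

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

print(f"cosine similarity: {cosine_similarity(vectors[0], vectors[1]):.4f}")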
Related content
- Use prebuilt Text Analytics in Fabric with REST API
- Use prebuilt Text Analytics in Fabric with SynapseML
- Use prebuilt Azure AI Translator in Fabric with REST API
- Use prebuilt Azure AI Translator in Fabric with SynapseML
- Use prebuilt Azure OpenAI in Fabric with the Python SDK
- Use prebuilt Azure OpenAI in Fabric with SynapseML