Use Azure OpenAI in Fabric with the Python SDK and SynapseML (preview)

Important

This feature is currently in preview.

This article shows how to use Azure OpenAI in Fabric with the OpenAI Python SDK and SynapseML.

Prerequisites

The OpenAI Python SDK isn't installed in the default runtime, so you need to install it first. Switch your environment to runtime version 1.3 or later.

%pip install -U openai
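If you want to confirm which SDK version is active in your session after the installation, a quick check like the following can help. This is a minimal sketch; the examples in this article assume the openai>=1.0 module-level call style.

import openai

# Print the installed OpenAI Python SDK version.
# The chat and embeddings examples below assume openai>=1.0.
print(openai.__version__)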

Chat

Create a new cell in your Fabric notebook for this code, separate from the cell that installs the OpenAI library in the previous step. GPT-4.1 and GPT-4.1-mini are language models optimized for conversational interfaces. The example presented here shows a simple chat completion operation and isn't intended to serve as a tutorial.

Note

Different versions of the OpenAI Python SDK can have different method names and parameters. Refer to the official documentation for the version you're using.

import openai

response = openai.chat.completions.create(
    model="gpt-4.1",  # Use the deployment name of your chat model
    messages=[
        {
            "role": "user",
            "content": """Analyze the following text and return a JSON array of issue insights.

Each item must include:
- issue_brief (1 sentence)
- scenario
- severity (high | medium | low)
- verbatim_quotes (list)
- recommended_fix

Text:
We booked the hotel room in advance for our family trip. The check-in was great however the room service was slow and pool was closed

Return JSON only.
"""
        }
    ],
)
print(f"{response.choices[0].message.role}: {response.choices[0].message.content}")

Output


assistant: [
  {
    "issue_brief": "Room service was slow during the stay.",
    "scenario": "Guests experienced delays in receiving room service after check-in.",
    "severity": "medium",
    "verbatim_quotes": [
      "the room service was slow"
    ],
    "recommended_fix": "Improve staffing or training for room service to ensure timely delivery of services."
  },
  {
    "issue_brief": "The hotel pool was unavailable during the stay.",
    "scenario": "Guests were unable to use the pool because it was closed.",
    "severity": "medium",
    "verbatim_quotes": [
      "pool was closed"
    ],
    "recommended_fix": "Notify guests in advance about facility closures and provide alternative amenities or compensation if possible."
  }
]
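Because the prompt instructs the model to return JSON only, you can parse the reply into Python objects for further processing. The following is a minimal sketch that reuses the response object from the previous cell; production code should also handle the case where the model returns malformed JSON.

import json

# Parse the model's reply (a JSON array of issue insights) into Python objects.
insights = json.loads(response.choices[0].message.content)

for issue in insights:
    print(f"[{issue['severity']}] {issue['issue_brief']}")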

Embeddings

Create a new cell in your Fabric notebook for this code, separate from the cell that installs the openai library in the previous step. An embedding is a special data representation format that machine learning models and algorithms can easily consume. It captures the information-rich semantics of a piece of text, represented as a vector of floating-point numbers. The distance between two embeddings in vector space is related to the semantic similarity of the two original inputs. For example, if two texts are similar, their vector representations should also be similar.

The example presented here shows how to retrieve embeddings and isn't intended to serve as a tutorial.

response = openai.embeddings.create(
    input="The food was delicious and the waiter...",
    model="text-embedding-ada-002"  # Or another embedding model
)

print(response)

Output

    CreateEmbeddingResponse(
        data=[
            Embedding(
                embedding=[
                    0.0022756962571293116,
                    -0.009305915795266628,
                    0.01574261300265789,
                    ...
                    -0.015387134626507759,
                    -0.019424352794885635,
                    -0.0028009789530187845
                ],
                index=0,
                object='embedding'
            )
        ],
        model='text-embedding-ada-002',
        object='list',
        usage=Usage(
            prompt_tokens=8,
            total_tokens=8
        )
    )
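To illustrate how the distance between embeddings relates to semantic similarity, you can compare two embeddings with cosine similarity. The following is a minimal sketch using NumPy; the example sentences and the resulting score are illustrative only.

import numpy as np
import openai

def cosine_similarity(a, b):
    # Cosine similarity of two embedding vectors; values closer to 1 indicate more similar meaning.
    a, b = np.array(a), np.array(b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Request embeddings for two sentences with related meaning.
first = openai.embeddings.create(
    input="The food was delicious and the waiter was friendly.",
    model="text-embedding-ada-002"
).data[0].embedding

second = openai.embeddings.create(
    input="The meal tasted great and the service was excellent.",
    model="text-embedding-ada-002"
).data[0].embedding

# Similar texts should produce a higher cosine similarity score.
print(cosine_similarity(first, second))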