Overview of AI and LLM configurations in Microsoft Cloud for Sovereignty (preview)

Important

This is a preview feature. This information relates to a prerelease feature that may be substantially modified before it's released. Microsoft makes no warranties, expressed or implied, with respect to the information provided here.

With the help of Microsoft Cloud for Sovereignty, public sector organizations can take advantage of the latest AI innovation in the public cloud while managing their data in accordance with their local policies and regulatory requirements.

Microsoft Cloud for Sovereignty offers agility and flexibility, advanced cybersecurity features, and access to the latest innovations, like Azure OpenAI, to accelerate digital transformation and the delivery of essential public services. It enables customers to build and digitally transform workloads in the Microsoft cloud while helping to meet many of their specific compliance, security, and policy requirements.

Azure OpenAI Service provides access to OpenAI's powerful language models, including the GPT-4, GPT-3.5 (ChatGPT), Codex, and Embeddings model series. These foundational language models are pretrained on vast amounts of data to perform tasks such as content generation, summarization, semantic search, and natural language to code translation. You can use Azure OpenAI Service to access the pretrained models and build AI-enabled applications more quickly and with minimal effort, while using Microsoft Cloud for Sovereignty to enforce compliance, security, and policy requirements with enterprise-scale sovereign controls and cloud architecture.

Benefits

You can use Azure OpenAI Service on your data to:

  • Increase employee productivity by reducing the time they need to find critical information in your organization's collective knowledge base.

  • Increase constituent satisfaction by simplifying complex regulation or program requirements.

Example use case

Sovereign use cases are best implemented on top of a Sovereign Landing Zone (SLZ). The SLZ consists of a management group hierarchy and common platform resources that facilitate networking, logging, and managed service identities. The following diagram shows the reference architecture of sovereign AI and LLM configurations.

Reference architecture of Sovereign AI and LLM configurations.

The root management group of an SLZ is commonly referred to as a landing zone, or enterprise-scale landing zone. Individual subscriptions residing in one of the child management groups underneath it are commonly referred to as application landing zones, or workload landing zones. Application workloads can be deployed into an SLZ environment in one of four default landing zones:

  • Corp (corporate) - Non-internet facing, nonconfidential workloads

  • Online - Internet facing, nonconfidential workloads

  • Confidential corp - Non-internet facing, confidential workloads (only allows confidential computing resources to be used)

  • Confidential online - Internet facing, confidential workloads (only allows confidential computing resources to be used)

The main difference between the Corp and Online management groups is how they handle public endpoints. The Online environment permits the use of public endpoints, whereas the Corp environment doesn't. Learn more about the architecture of the SLZ.
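The endpoint rules above can be sketched as a small policy model. This is an illustrative Python sketch only; the class and dictionary names are assumptions for this example, not part of any Microsoft SDK or the SLZ implementation.

```python
from dataclasses import dataclass

# Hypothetical model of the four default SLZ landing zones described above.
# Field names and zone keys are illustrative assumptions.
@dataclass(frozen=True)
class LandingZonePolicy:
    name: str
    allows_public_endpoints: bool
    confidential_computing_only: bool

SLZ_LANDING_ZONES = {
    "corp": LandingZonePolicy("Corp", False, False),
    "online": LandingZonePolicy("Online", True, False),
    "confidential-corp": LandingZonePolicy("Confidential corp", False, True),
    "confidential-online": LandingZonePolicy("Confidential online", True, True),
}

def endpoint_allowed(zone_key: str, endpoint_is_public: bool) -> bool:
    """Return True if an endpoint of the given visibility is permitted
    in the named landing zone (public endpoints only in Online zones)."""
    policy = SLZ_LANDING_ZONES[zone_key]
    return policy.allows_public_endpoints or not endpoint_is_public
```

For example, `endpoint_allowed("corp", endpoint_is_public=True)` returns `False`, reflecting that the Corp environment doesn't permit public endpoints.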

In an SLZ environment, you should deploy LLM-based solutions as dedicated workloads in their own subscriptions within the Corp or Online management group hierarchy.

We recommend using the Corp environment as the secure standard pattern for implementing LLM RAG-based applications, such as copilots for internal organizational use. You need ExpressRoute or VPN-based connections to access the front-end APIs or user interfaces that connect to Azure AI services and provide LLM capabilities to end users or consumers.

To offer LLM or RAG-based applications to the public, use workload landing zones in the Online management group hierarchy. However, all services required for the implementation must still be accessed securely through private endpoints in the virtual network. Expose only the API or front-end web application to end users or consumers through a public endpoint.

In this case, protect the public endpoint with a web application firewall, and apply and configure appropriate DDoS protection and other security services. Depending on your preferences, this configuration can be applied centrally in the hub virtual network or locally in the workload's virtual network.

If you need to integrate data from Confidential landing zones with LLM-based workloads, you must run the transformation processes that prepare and store the data for Azure AI services (such as Azure AI Search or Azure OpenAI) within a Confidential landing zone. Additionally, these processes must actively filter and manage the data so that confidential data requiring encryption in use is never sent to nonconfidential services and workloads. You need to implement this filtering in custom business logic on a case-by-case basis.
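The case-by-case filtering described above might look like the following minimal Python sketch. The record schema, the `classification` tag, and the label values are all assumptions for illustration; real workloads would use their own classification metadata and policies.

```python
# Labels that mark data as requiring encryption in use; an assumption
# for this sketch, not a standard taxonomy.
CONFIDENTIAL_LABELS = {"confidential", "secret"}

def filter_for_nonconfidential_target(records):
    """Split records into those safe to send to nonconfidential services
    (for example, an index in a Corp or Online landing zone) and those
    that must stay within a Confidential landing zone.

    Records without a classification are treated as confidential
    (fail closed) rather than assumed safe.
    """
    safe, withheld = [], []
    for record in records:
        label = record.get("classification", "confidential")
        (withheld if label in CONFIDENTIAL_LABELS else safe).append(record)
    return safe, withheld
```

The fail-closed default is the key design choice: unclassified data stays in the Confidential landing zone until it's explicitly labeled as safe to share.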

If LLM-based workloads need to ingest, transform, and consume data, we recommend that you deploy a data landing zone aligned to the data domains. A data landing zone has several layers that provide agility for servicing the data integrations and data products it contains.

You can deploy a new data landing zone with a standard set of services that let the data landing zone begin ingesting and analyzing data. You can connect the data landing zone to the LLM data landing zone and all other data landing zones with virtual network peering. This mechanism lets you share data securely through the Azure internal network while achieving lower latency and higher throughput than going through the hub.
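The latency benefit of direct peering can be illustrated with a toy hop count. This is a simplified sketch; the virtual network names and the two-hop assumption for hub transit are illustrative, not measured values.

```python
# Hypothetical topology: two data landing zones, each linked to the hub
# (standard hub-and-spoke), plus one explicit peering between them.
PEERINGS = {
    ("llm-dlz", "hub"),
    ("finance-dlz", "hub"),
    ("llm-dlz", "finance-dlz"),  # direct virtual network peering
}

def hops(src: str, dst: str) -> int:
    """Return the network hops between two virtual networks: one hop if
    they are directly peered, otherwise two (spoke -> hub -> spoke)."""
    if (src, dst) in PEERINGS or (dst, src) in PEERINGS:
        return 1
    return 2
```

With the direct peering in place, traffic between the two data landing zones takes one hop instead of transiting the hub, which is the source of the lower latency and higher throughput noted above.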

Some implementations might require sensitive or confidential data that needs encryption in use, which confidential computing provides. In this scenario, you can run virtual machine-based data solutions in landing zones under the Confidential management group. However, some PaaS data services can't run in confidential virtual machines.

Next steps