Phi Silica in the Windows App SDK

Use the Windows App SDK to connect your Windows app to on-device language models, including Phi Silica, our most powerful NPU-tuned local language model yet.

Windows App SDK 1.6 Experimental 2 will ship with several Artificial Intelligence (AI) APIs for accessing these models to enable local processing and generation of chat, math solving, code generation, reasoning over text, and more.

Important

The experimental channel includes APIs and features in early stages of development. All APIs in the experimental channel are subject to extensive revisions and breaking changes and may be removed from subsequent releases at any time. They are not supported for use in production environments, and apps that use experimental features cannot be published to the Microsoft Store.

Prerequisites

  • Device with a Neural Processing Unit (NPU).
  • Windows App SDK 1.6 Experimental 2.

What can I do with Phi Silica and the Windows App SDK?

With a local Phi Silica language model and the Windows App SDK you can generate text responses to user prompts.

Get a single, complete response based on a string prompt

This example shows how to generate a response to a Q&A prompt where the full response is generated before the result is returned.

  1. First, we check whether the language model is available by calling the IsAvailable method and, if it isn't, wait for the MakeAvailableAsync method to return successfully.
  2. Once the language model is available, we create a LanguageModel object to reference it.
  3. Finally, we submit a string prompt to the model using the GenerateResponseAsync method, which returns the complete result.
using Microsoft.Windows.AI.Generative;

// Ensure the language model is available before referencing it.
if (!LanguageModel.IsAvailable())
{
    var op = await LanguageModel.MakeAvailableAsync();
}

// Create the LanguageModel; the using declaration disposes of it
// when it goes out of scope.
using LanguageModel languageModel = await LanguageModel.CreateAsync();

string prompt = "Provide the molecular formula for glucose";

// Generate the complete response before returning the result.
var result = await languageModel.GenerateResponseAsync(prompt);

Console.WriteLine(result.Response);

The response generated by this example is:

The molecular formula for glucose is C6H12O6.

Get a stream of partial results based on a string prompt

This example shows how to generate a response to a Q&A prompt where the response is returned as a stream of partial results.

  1. First, we create a LanguageModel object to reference the local language model (we already confirmed the availability of the language model in the previous snippet).
  2. Then we asynchronously retrieve the LanguageModelResponse in a call to GenerateResponseWithProgressAsync, writing each partial result to the console as the response is generated.
using Microsoft.Windows.AI.Generative;
using Windows.Foundation;

using LanguageModel languageModel = await LanguageModel.CreateAsync();

string prompt = "Q: Provide the molecular formula for glucose.\nA:";

// Handler invoked with each partial result (delta) as the response is generated.
AsyncOperationProgressHandler<LanguageModelResponse, string>
progressHandler = (asyncInfo, delta) =>
{
    Console.WriteLine($"Progress: {delta}");
    Console.WriteLine($"Response so far: {asyncInfo.GetResults().Response}");
};

var asyncOp = languageModel.GenerateResponseWithProgressAsync(prompt);

// Subscribe to progress updates before awaiting the operation.
asyncOp.Progress = progressHandler;

var result = await asyncOp;

Console.WriteLine(result.Response);

Additional resources

Access files and folders with Windows App SDK and WinRT APIs