Chaining functions together

In previous articles, we showed how you could invoke a Semantic Kernel function (whether semantic or native) individually. Oftentimes, however, you may want to string multiple functions together into a single pipeline to simplify your code. In this article, we'll put this knowledge to use by demonstrating how you could refactor the code from the calling nested functions article to make it more readable and maintainable.

If you want to see the final solution to this article, you can check out the following samples in the public documentation repository. Use the link to the previous solution if you want to follow along.

Language | Link to previous solution | Link to final solution
C#       | Open solution in GitHub   | Open solution in GitHub
Python   | Open solution in GitHub   | Open solution in GitHub

Passing data to semantic functions with input

Semantic Kernel was designed in the spirit of UNIX's piping and filtering capabilities. To replicate this behavior, we've added a special variable called $input to the kernel's context object that allows you to stream the output of one semantic function into the next.

Passing data with $input in Semantic Kernel
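
Under the hood, $input is simply an entry in the kernel's collection of context variables. The following minimal sketch (using the same pre-1.0 ContextVariables API shown throughout this article) illustrates that the value you pass when creating the variables is stored under the input key and that each function in a pipeline overwrites it with its own output.

// Minimal sketch of how $input maps to the context variables collection
// (assumes the pre-1.0 Semantic Kernel ContextVariables API used in this article).
var variables = new ContextVariables("Charlie Brown");

// The constructor argument is stored under the "input" key...
Console.WriteLine(variables["input"]); // Charlie Brown

// ...and each function in a pipeline replaces it with its own output.
variables["input"] = "A joke about Charlie Brown";
Console.WriteLine(variables["input"]); // A joke about Charlie Brown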

For example, we can create three inline semantic functions and feed the output of each one into the next by adding the $input variable to each prompt.

Create and register the semantic functions.

string myJokePrompt = """
Tell a short joke about {{$input}}.
""";
string myPoemPrompt = """
Take this "{{$input}}" and convert it to a nursery rhyme.
""";
string myMenuPrompt = """
Make this poem "{{$input}}" influence the three items in a coffee shop menu. 
The menu reads in enumerated form:

""";

var myJokeFunction = kernel.CreateSemanticFunction(myJokePrompt, maxTokens: 500);
var myPoemFunction = kernel.CreateSemanticFunction(myPoemPrompt, maxTokens: 500);
var myMenuFunction = kernel.CreateSemanticFunction(myMenuPrompt, maxTokens: 500);

Run the functions sequentially. Notice how all of the functions share the same context.

var context = kernel.CreateNewContext("Charlie Brown");
await myJokeFunction.InvokeAsync(context);
await myPoemFunction.InvokeAsync(context);
await myMenuFunction.InvokeAsync(context);

Console.WriteLine(context);

This would result in something like:

1. Colossus of Memnon Latte - A creamy latte with a hint of sweetness, just like the awe-inspiring statue.

2. Gasp and Groan Mocha - A rich and indulgent mocha that will make you gasp and groan with delight.

3. Heart Skipping a Beat Frappuccino - A refreshing frappuccino with a hint of sweetness that will make your heart skip a beat.

Using the RunAsync method to simplify your code

Running each function individually can be verbose, so Semantic Kernel also provides the RunAsync method in C# (run_async in Python) that automatically calls a series of functions sequentially, all with the same context object.

var myOutput = await kernel.RunAsync(
    new ContextVariables("Charlie Brown"),
    myJokeFunction,
    myPoemFunction,
    myMenuFunction);

Console.WriteLine(myOutput);

Passing more than just input with native functions

In the previous articles, we've already seen how you can update and retrieve additional variables from the context object within native functions. We can use this same technique to pass additional data between functions within a pipeline.

We'll demonstrate this by updating the code written in the calling nested functions article to use the RunAsync method with multiple functions. Use the link to the previous completed solution at the top of the page if you want to follow along.

Adding a function that changes variables during runtime

In the previous example, the RouteRequest function called each of the Semantic Kernel functions individually, and we manually updated the context variables between calls before running the next function.

We can simplify this code by creating a new native function that performs the same context updates as part of a chain. We'll call this function ExtractNumbersFromJson; it takes the JSON string from the $input variable and extracts its numbers into individual context variables.

Add the following function to the Orchestrator class in your OrchestratorPlugin.

[SKFunction, Description("Extracts numbers from JSON")]
public static SKContext ExtractNumbersFromJson(SKContext context)
{
    // Requires Newtonsoft.Json (add "using Newtonsoft.Json.Linq;" to the top of the file)
    JObject numbers = JObject.Parse(context.Variables["input"]);

    // Loop through the numbers and add them to the context
    foreach (var number in numbers)
    {
        // number1 becomes the $input for the next function in the pipeline
        if (number.Key == "number1")
        {
            context.Variables["input"] = number.Value!.ToString();
            continue;
        }

        // Every other number becomes its own context variable (e.g., number2)
        context.Variables[number.Key] = number.Value!.ToString();
    }
    return context;
}
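
If you want to see what this function does to the context before wiring it into a pipeline, you can invoke it on its own. The following is a hypothetical quick check rather than part of the article's solution; it assumes the OrchestratorPlugin has already been imported into the kernel and that the GetNumbers function returns JSON of the form {"number1": ..., "number2": ...}.

// Hypothetical standalone check of ExtractNumbersFromJson (not part of the final solution).
// Assumes the OrchestratorPlugin is already imported and GetNumbers returned JSON like this.
var extractNumbersFromJson = kernel.Functions.GetFunction("OrchestratorPlugin", "ExtractNumbersFromJson");

var context = kernel.CreateNewContext("{\"number1\": 12.25, \"number2\": 17.33}");
await extractNumbersFromJson.InvokeAsync(context);

// number1 is now in $input; the remaining numbers become their own variables.
Console.WriteLine(context.Variables["input"]);   // 12.25
Console.WriteLine(context.Variables["number2"]); // 17.33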

Using the RunAsync method to chain our functions

Now that we have a function that can extract numbers, we can update our RouteRequest function to use the RunAsync method to call the functions in a pipeline. Update the RouteRequest function to the following. Notice how we can now call all of our functions in a single call to RunAsync.

[SKFunction, Description("Routes the request to the appropriate function.")]
public async Task<string> RouteRequestAsync(
    [Description("The user request")] string input
)
{
    // Save the original user request
    string request = input;

    // Retrieve the intent from the user request
    var getIntent = _kernel.Functions.GetFunction("OrchestratorPlugin", "GetIntent");
    var getIntentVariables = new ContextVariables
    {
        ["input"] = input,
        ["options"] = "Sqrt, Multiply"
    };
    string intent = (await _kernel.RunAsync(getIntentVariables, getIntent)).GetValue<string>()!.Trim();

    // Call the appropriate function
    ISKFunction mathFunction;
    switch (intent)
    {
        case "Sqrt":
            mathFunction = this._kernel.Functions.GetFunction("MathPlugin", "Sqrt");
            break;
        case "Multiply":
            mathFunction = this._kernel.Functions.GetFunction("MathPlugin", "Multiply");
            break;
        default:
            return "I'm sorry, I don't understand.";
    }

    // Get remaining functions
    var createResponse = this._kernel.Functions.GetFunction("OrchestratorPlugin", "CreateResponse");
    var getNumbers = this._kernel.Functions.GetFunction("OrchestratorPlugin", "GetNumbers");
    var extractNumbersFromJson = this._kernel.Functions.GetFunction("OrchestratorPlugin", "ExtractNumbersFromJson");

    // Run the pipeline
    var output = await this._kernel.RunAsync(
        request,
        getNumbers,
        extractNumbersFromJson,
        mathFunction,
        createResponse
    );
    return output.GetValue<string>()!;
}

After making these changes, you should be able to run the code again and see the same results as before. Only now, the RouteRequest function is easier to read, and you've created a new native function that can be reused in other pipelines.

Starting a pipeline with additional context variables

So far, we've only passed in a string to the RunAsync method. You can, however, also pass in a context object to start the pipeline with additional information.

This is helpful because it allows us to persist the initial $input variable across all functions in the pipeline without it being overwritten. For example, in our current pipeline, the user's original request is overwritten by the output of the GetNumbers function, which makes it difficult to retrieve the original request later in the pipeline to create a natural-sounding response. By storing the original request as a separate variable, we can retrieve it later in the pipeline.

Passing a context object to RunAsync

To pass a context object to RunAsync, create a new ContextVariables object and pass it as the first parameter. This starts the pipeline with the variables it contains. We'll create a new variable called original_request to store the original request. Later, we'll show where to add this code in the RouteRequest function.

var pipelineContext = new ContextVariables(request);
pipelineContext["original_request"] = request;

Creating a semantic function that uses the new context variables

Now that we have a variable with the original request, we can use it to create a more natural-sounding response. We'll create a new semantic function in the OrchestratorPlugin called CreateResponse that uses the original_request variable to create a response.

Start by creating a new folder called CreateResponse in your OrchestratorPlugin folder. Then create the config.json and skprompt.txt files and paste the following code into the config.json file. Notice how we now have two input variables, input and original_request.

{
     "schema": 1,
     "type": "completion",
     "description": "Creates a response based on the original request and the output of the pipeline",
     "completion": {
          "max_tokens": 256,
          "temperature": 0.0,
          "top_p": 0.0,
          "presence_penalty": 0.0,
          "frequency_penalty": 0.0
     },
     "input": {
          "parameters": [
               {
                    "name": "input",
                    "description": "The user's request.",
                    "defaultValue": ""
               },
               {
                    "name": "original_request",
                    "description": "The original request from the user.",
                    "defaultValue": ""
               }
          ]
     }
}

Next, copy and paste the following prompt into skprompt.txt.

The answer to the user's request is: {{$input}}
The bot should provide the answer back to the user.

User: {{$original_request}}
Bot:
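
Before adding CreateResponse to the pipeline, you can sanity-check the prompt by invoking it directly with both variables set. This is a hypothetical test rather than part of the article's final solution; it assumes the OrchestratorPlugin's semantic functions have already been imported into the kernel.

// Hypothetical standalone check of the CreateResponse prompt (not part of the final solution).
// Assumes the OrchestratorPlugin's semantic functions are already imported into the kernel.
var createResponse = kernel.Functions.GetFunction("OrchestratorPlugin", "CreateResponse");

// $input holds the math result; original_request holds the user's question.
var responseVariables = new ContextVariables("212.2925")
{
    ["original_request"] = "How many square feet would the room be if its length was 12.25 feet and its width was 17.33 feet?"
};

var response = await kernel.RunAsync(responseVariables, createResponse);
Console.WriteLine(response.GetValue<string>());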

You can now update the RouteRequest function so that the original request is passed through the pipeline to the CreateResponse function. Update the RouteRequest function to the following:

[SKFunction, Description("Routes the request to the appropriate function.")]
public async Task<string> RouteRequestAsync(
    [Description("The user request")] string input
)
{
    // Save the original user request
    string request = input;

    // Retrieve the intent from the user request
    var getIntent = _kernel.Functions.GetFunction("OrchestratorPlugin", "GetIntent");
    var getIntentVariables = new ContextVariables
    {
        ["input"] = input,
        ["options"] = "Sqrt, Multiply"
    };
    string intent = (await _kernel.RunAsync(getIntentVariables, getIntent)).GetValue<string>()!.Trim();

    // Call the appropriate function
    ISKFunction mathFunction;
    switch (intent)
    {
        case "Sqrt":
            mathFunction = this._kernel.Functions.GetFunction("MathPlugin", "Sqrt");
            break;
        case "Multiply":
            mathFunction = this._kernel.Functions.GetFunction("MathPlugin", "Multiply");
            break;
        default:
            return "I'm sorry, I don't understand.";
    }

    // Get remaining functions
    var createResponse = this._kernel.Functions.GetFunction("OrchestratorPlugin", "CreateResponse");
    var getNumbers = this._kernel.Functions.GetFunction("OrchestratorPlugin", "GetNumbers");
    var extractNumbersFromJson = this._kernel.Functions.GetFunction("OrchestratorPlugin", "ExtractNumbersFromJson");

    // Create a new context with the original request
    var pipelineContext = new ContextVariables(request);
    pipelineContext["original_request"] = request;

    // Run the pipeline
    var output = await this._kernel.RunAsync(
        pipelineContext,
        getNumbers,
        extractNumbersFromJson,
        mathFunction,
        createResponse
    );

    return output.GetValue<string>()!;
}

Testing the new pipeline

Now that we've updated the pipeline, we can test it out. Run the following code in your main file.

using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Plugins.Core;
using Plugins.OrchestratorPlugin;

IKernel kernel = new KernelBuilder()
    // Add a text or chat completion service using either:
    // .WithAzureTextCompletionService()
    // .WithAzureChatCompletionService()
    // .WithOpenAITextCompletionService()
    // .WithOpenAIChatCompletionService()
    .Build();

var pluginsDirectory = Path.Combine(System.IO.Directory.GetCurrentDirectory(), "plugins");

// Import the semantic functions
kernel.ImportSemanticFunctionsFromDirectory(pluginsDirectory, "OrchestratorPlugin");

// Import the native functions
var mathPlugin = kernel.ImportFunctions(new Plugins.MathPlugin.Math(), "MathPlugin");
var orchestratorPlugin = kernel.ImportFunctions(new Orchestrator(kernel), "OrchestratorPlugin");
var conversationSummaryPlugin = kernel.ImportFunctions(new ConversationSummaryPlugin(kernel), "ConversationSummaryPlugin");

// Make a request that runs the Sqrt function
var result1 = await kernel.RunAsync("What is the square root of 524?", orchestratorPlugin["RouteRequest"]);
Console.WriteLine(result1);

// Make a request that runs the Multiply function
var result2 = await kernel.RunAsync("How many square feet would the room be if its length was 12.25 feet and its width was 17.33 feet?", orchestratorPlugin["RouteRequest"]);
Console.WriteLine(result2);

You should get responses like the following. Notice how the responses now sound more natural.

The square root of 524 is 22.891046284519195.
The room would be approximately 212.2925 square feet.

Take the next step

You are now becoming familiar with orchestrating both semantic and native functions. Up until now, however, you've had to orchestrate the functions manually. In the next section, you'll learn how to use the planner to orchestrate functions automatically.