Integrate with LangChain

Prompt flow can also be used together with the LangChain Python library, a framework for developing applications powered by LLMs, agents, and dependency tools. In this document, we'll show you how to supercharge your LangChain development with prompt flow.

Screenshot of flows with the LangChain python library.

Benefits of LangChain integration

The integration of LangChain and prompt flow is a powerful combination that can help you build and test your custom language models with ease. In particular, you can use LangChain modules to build your flow initially, then use prompt flow to scale the experiments for bulk testing, evaluation, and eventual deployment.

  • For larger scale experiments - Convert existing LangChain development in seconds. If you have already developed a demo prompt flow based on LangChain code locally, the streamlined integration in prompt flow lets you easily convert it into a flow for further experimentation, for example to conduct larger scale experiments based on larger datasets.
  • For more familiar flow engineering - Build prompt flows with ease using the Python SDK you already know. If you're familiar with the LangChain SDK and prefer to use its classes and functions directly, the intuitive flow-building Python node enables you to easily build flows based on your custom Python code.

How to convert LangChain code into flow

Assume that you already have your own LangChain code available locally, properly tested and ready for deployment. To convert it into a runnable flow on our platform, follow the steps below.

Prerequisites for environment and runtime

You can customize the environment used to run this flow by adding packages to the requirements.txt file in the flow folder. Learn more about automatic runtime.
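For example, a requirements.txt in the flow folder might list the LangChain dependencies like this (package names are the typical ones for this scenario; pin versions as your code requires):

```text
langchain
openai
```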

Convert credentials to prompt flow connection

When developing your LangChain code, you might have defined environment variables to store your credentials, such as the Azure OpenAI API key, which is necessary for invoking the Azure OpenAI model.

Screenshot of Azure OpenAI example in LangChain.
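The Azure OpenAI example above typically reads credentials from environment variables, along these lines (variable names follow common OpenAI/LangChain conventions; values are placeholders):

```python
import os

# Typical LangChain-style setup: credentials exposed as environment
# variables before the model is invoked. Values here are placeholders.
os.environ["OPENAI_API_TYPE"] = "azure"
os.environ["OPENAI_API_BASE"] = "https://<your-endpoint>.openai.azure.com/"
os.environ["OPENAI_API_KEY"] = "<your-azure-openai-api-key>"

def read_credential(name: str) -> str:
    """Fetch a credential the way plain LangChain code usually does."""
    value = os.environ.get(name)
    if not value:
        raise KeyError(f"environment variable {name} is not set")
    return value
```

This is exactly the pattern that a prompt flow connection replaces.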

Instead of coding the credentials directly and exposing them as environment variables when running LangChain code in the cloud, we recommend converting the credentials from environment variables into a connection in prompt flow. Connections let you securely store and manage the credentials separately from your code.

Create a connection

Create a connection that securely stores your credentials, such as your LLM API KEY or other required credentials.

  1. Go to prompt flow in your workspace, then go to the Connections tab.
  2. Select Create and select a connection type to store your credentials (take custom connection as an example). Screenshot of flows on the connections tab highlighting the custom button in the create drop-down menu.
  3. In the right panel, define your connection name, and add multiple key-value pairs to store your credentials and keys by selecting Add key-value pairs. Screenshot of add custom connection pointing to the add key-value pairs button.

Note

  • You can set a key-value pair as secret by checking is secret; secret values are encrypted and stored in your key vault.
  • Make sure at least one key-value pair is set as secret; otherwise the connection won't be created successfully.

This custom connection is then used to replace the key and credential you explicitly defined in your LangChain code. If you already have a LangChain-integrated prompt flow, you can skip ahead to Configure connection, input and output.

LangChain code conversion to a runnable flow

All LangChain code can run directly in the Python tools in your flow, as long as your runtime environment contains the required dependency packages. You can easily convert your LangChain code into a flow by following the steps below.

Convert LangChain code to flow structure

Note

There are two ways to convert your LangChain code into a flow.

  • To simplify the conversion process, you can directly initialize the LLM model for invocation in a Python node by using the LangChain-integrated LLM library.
  • Alternatively, you can convert your LLM consumption from LangChain code to our LLM tools in the flow, for better experiment management going forward.
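As a sketch of the first approach, a Python node can initialize the LLM through LangChain's integrated LLM classes directly. The class and parameter names below are assumptions based on typical LangChain Azure OpenAI usage; the import fallbacks only exist so the sketch stays runnable where the packages aren't installed:

```python
# Sketch of approach 1: initialize the LLM inside a Python node using
# LangChain's integrated LLM library. Names are illustrative.
try:
    from promptflow import tool
except ImportError:
    def tool(func):  # stand-in for the promptflow @tool decorator
        return func

@tool
def ask_llm(question: str, api_key: str, api_base: str) -> str:
    try:
        from langchain.llms import AzureOpenAI
    except ImportError:
        # Offline fallback so the sketch stays runnable without langchain.
        return f"(no LLM available) {question}"
    llm = AzureOpenAI(
        openai_api_key=api_key,
        openai_api_base=api_base,
        deployment_name="<your-deployment-name>",  # illustrative
    )
    return llm(question)
```

With the second approach, this same LLM call would instead move into a dedicated LLM tool node, leaving the Python node to handle only the surrounding logic.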

For quick conversion of LangChain code into a flow, we recommend two types of flow structures, based on the use case:

| Type | Description | Use case |
| --- | --- | --- |
| Type A | A flow that includes both prompt nodes and Python nodes | Extract your prompt template from your code into a prompt node, then combine the remaining code in a single Python node or multiple Python tools. This structure is ideal for users who want to easily tune the prompt by running flow variants and then choose the optimal one based on evaluation results. |
| Type B | A flow that includes Python nodes only | Create a new flow with Python nodes only; all code, including the prompt definition, runs in Python nodes. This structure is suitable for users who don't need to explicitly tune the prompt in the workspace, but require faster batch testing based on larger scale datasets. |

For example, a type A flow from the table looks like this:

Screenshot of flows highlighting the prompt button and system template.

Screenshot of system template showing variant one and zero with the finish tuning button highlighted.
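The system template shown above is an ordinary prompt flow Jinja2 prompt node. Extracted from LangChain code, it might look like this sketch (wording and input names are illustrative):

```jinja2
system:
You are an assistant that answers questions about {{topic}}.

user:
{{question}}
```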

While the type B flow would look like:

Screenshot of flows showing the LangChain code node and graph.

To create a flow in Azure Machine Learning, go to your workspace, select Prompt flow in the left navigation, then select Create to create a new flow. More detailed guidance on how to create a flow is provided in Create a flow.

Configure connection, input and output

After you have a properly structured flow and are done moving the code to specific tool nodes, you need to replace the original environment variables with the corresponding key in the connection, and configure the input and output of the flow.

Configure connection

To use a connection in place of the environment variables you originally defined in your LangChain code, import the prompt flow connections library promptflow.connections in the Python node.

For example:

If your LangChain code consumes the Azure OpenAI model, you can replace the environment variables with the corresponding keys in the Azure OpenAI connection:

Import the library: from promptflow.connections import AzureOpenAIConnection

Screenshot of LangChain code in prompt flow.
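Put together, a Python node can accept the connection as a typed parameter and read the key from it instead of from os.environ. This is a sketch: the attribute names (api_key, api_base) follow the Azure OpenAI connection fields, and the import fallback only exists so the sketch runs where promptflow isn't installed:

```python
try:
    from promptflow import tool
    from promptflow.connections import AzureOpenAIConnection
except ImportError:
    # Stand-ins so the sketch runs without promptflow installed.
    def tool(func):
        return func

    class AzureOpenAIConnection:
        def __init__(self, api_key: str = "", api_base: str = ""):
            self.api_key = api_key
            self.api_base = api_base

@tool
def my_node(question: str, conn: AzureOpenAIConnection) -> str:
    # Previously: api_key = os.environ["OPENAI_API_KEY"]
    # Now the credential comes from the connection object instead.
    api_key = conn.api_key
    api_base = conn.api_base
    # ... pass api_key/api_base to your LangChain LLM here ...
    return f"would call Azure OpenAI at {api_base} with question: {question}"
```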

For custom connection, you need to follow the steps:

  1. Import the library (from promptflow.connections import CustomConnection) and define an input parameter of type CustomConnection in the tool function. Screenshot of doc search chain node highlighting the custom connection.
  2. Pass the input to the input section, then select your target custom connection in the value dropdown. Screenshot of the chain node highlighting the connection.
  3. Replace the environment variables that originally defined the key and credential with the corresponding key added in the connection.
  4. Save and return to authoring page, and configure the connection parameter in the node input.
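The steps above can be sketched as follows. The key name my_api_key is an example, and the fallback stub only exists so the sketch runs where promptflow isn't installed; in the real library, CustomConnection exposes your keys as attributes on the connection object:

```python
try:
    from promptflow import tool
    from promptflow.connections import CustomConnection
except ImportError:
    # Stand-ins so the sketch runs without promptflow installed.
    def tool(func):
        return func

    class CustomConnection:
        def __init__(self, secrets=None, configs=None):
            self.secrets = secrets or {}
            self.configs = configs or {}
            for key, value in {**self.configs, **self.secrets}.items():
                setattr(self, key, value)

@tool
def doc_search_chain(query: str, connection: CustomConnection) -> str:
    # Step 3: the key that used to come from an environment variable now
    # comes from the custom connection ("my_api_key" is an example name).
    api_key = connection.my_api_key
    # ... use api_key with your LangChain chain here ...
    return f"searching for {query!r} with a key of length {len(api_key)}"
```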

Configure input and output

Before running the flow, configure the node input and output, as well as the overall flow input and output. This step is crucial to ensure that all the required data is properly passed through the flow and that the desired results are obtained.
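As an illustration, the overall flow inputs and outputs end up declared in the flow definition (flow.dag.yaml); the names, types, and node name below are examples only:

```yaml
inputs:
  question:
    type: string
outputs:
  answer:
    type: string
    reference: ${my_python_node.output}
```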

Next steps