Kernel Class
The Kernel of Semantic Kernel.
This is the main entry point for Semantic Kernel. It provides the ability to run functions and manage filters, plugins, and AI services.
Initialize a new instance of the Kernel class.
Constructor
Kernel(plugins: KernelPlugin | dict[str, KernelPlugin] | list[KernelPlugin] | None = None, services: AI_SERVICE_CLIENT_TYPE | list[AI_SERVICE_CLIENT_TYPE] | dict[str, AI_SERVICE_CLIENT_TYPE] | None = None, ai_service_selector: AIServiceSelector | None = None, *, retry_mechanism: RetryMechanismBase = None, function_invocation_filters: list[tuple[int, Callable[[FILTER_CONTEXT_TYPE, Callable[[FILTER_CONTEXT_TYPE], Awaitable[None]]], Awaitable[None]]]] = None, prompt_rendering_filters: list[tuple[int, Callable[[FILTER_CONTEXT_TYPE, Callable[[FILTER_CONTEXT_TYPE], Awaitable[None]]], Awaitable[None]]]] = None, auto_function_invocation_filters: list[tuple[int, Callable[[FILTER_CONTEXT_TYPE, Callable[[FILTER_CONTEXT_TYPE], Awaitable[None]]], Awaitable[None]]]] = None)
Parameters
Name | Description |
---|---|
plugins | The plugins to be used by the kernel; these are rewritten into a dict keyed by plugin name. Default value: None |
services | The services to be used by the kernel; these are rewritten into a dict keyed by service_id. Default value: None |
ai_service_selector | The AI service selector to be used by the kernel; by default, selection is based on the order of execution settings. Default value: None |
**kwargs | Additional fields to be passed to the Kernel model; these are limited to filters. |
Keyword-Only Parameters
Name | Description |
---|---|
retry_mechanism | The retry mechanism to be used by the kernel (deprecated). Default value: None |
function_invocation_filters | Filters applied during function invocation. Default value: None |
prompt_rendering_filters | Filters applied during prompt rendering. Default value: None |
auto_function_invocation_filters | Filters applied during auto function invocation. Default value: None |
Name | Description |
---|---|
add_embedding_to_object | Gather all fields to embed, batch the embedding generation, and store. |
invoke | Execute a function and return the FunctionResult. |
invoke_function_call | Processes the provided FunctionCallContent and updates the chat history. |
invoke_prompt | Invoke a function from the provided prompt. |
invoke_prompt_stream | Invoke a function from the provided prompt and stream the results. |
invoke_stream | Execute one or more stream functions. When a list of functions is provided, they are executed in order as a pipeline; only the last function is streamed. |
add_embedding_to_object
Gather all fields to embed, batch the embedding generation and store.
async add_embedding_to_object(inputs: TDataModel | Sequence[TDataModel], field_to_embed: str, field_to_store: str, execution_settings: dict[str, PromptExecutionSettings], container_mode: bool = False, cast_function: Callable[[list[float]], Any] | None = None, **kwargs: Any)
Parameters
Name | Description |
---|---|
inputs | Required. |
field_to_embed | Required. |
field_to_store | Required. |
execution_settings | Required. |
container_mode | Default value: False |
cast_function | Default value: None |
invoke
Execute a function and return the FunctionResult.
async invoke(function: KernelFunction | None = None, arguments: KernelArguments | None = None, function_name: str | None = None, plugin_name: str | None = None, metadata: dict[str, Any] = {}, **kwargs: Any) -> FunctionResult | None
Parameters
Name | Description |
---|---|
function | The function or functions to execute. This value takes precedence when both it and function_name/plugin_name are supplied; if it is None, function_name and plugin_name are used and cannot be None. Default value: None |
arguments | The arguments to pass to the function(s), optional. Default value: None |
function_name | The name of the function to execute. Default value: None |
plugin_name | The name of the plugin to execute. Default value: None |
metadata | The metadata to pass to the function(s). Default value: {} |
kwargs | Arguments that can be used instead of supplying KernelArguments. |
Exceptions
Type | Description |
---|---|
| If an error occurs during function invocation |
invoke_function_call
Processes the provided FunctionCallContent and updates the chat history.
async invoke_function_call(function_call: FunctionCallContent, chat_history: ChatHistory, *, arguments: KernelArguments | None = None, execution_settings: PromptExecutionSettings | None = None, function_call_count: int | None = None, request_index: int | None = None, is_streaming: bool = False, function_behavior: FunctionChoiceBehavior = None) -> AutoFunctionInvocationContext | None
Parameters
Name | Description |
---|---|
function_call | Required. |
chat_history | Required. |
Keyword-Only Parameters
Name | Description |
---|---|
arguments | Default value: None |
execution_settings | Default value: None |
function_call_count | Default value: None |
request_index | Default value: None |
is_streaming | Default value: False |
function_behavior | Default value: None |
invoke_prompt
Invoke a function from the provided prompt.
async invoke_prompt(prompt: str, function_name: str | None = None, plugin_name: str | None = None, arguments: KernelArguments | None = None, template_format: Literal['semantic-kernel', 'handlebars', 'jinja2'] = 'semantic-kernel', **kwargs: Any) -> FunctionResult | None
Parameters
Name | Description |
---|---|
prompt | Required. The prompt to use. |
function_name | The name of the function, optional. Default value: None |
plugin_name | The name of the plugin, optional. Default value: None |
arguments | The arguments to pass to the function(s), optional. Default value: None |
template_format | The format of the prompt template. Default value: semantic-kernel |
kwargs | Arguments that can be used instead of supplying KernelArguments. |
Returns
Type | Description |
---|---|
FunctionResult or None | The result of the function(s) |
invoke_prompt_stream
Invoke a function from the provided prompt and stream the results.
async invoke_prompt_stream(prompt: str, function_name: str | None = None, plugin_name: str | None = None, arguments: KernelArguments | None = None, template_format: Literal['semantic-kernel', 'handlebars', 'jinja2'] = 'semantic-kernel', return_function_results: bool | None = False, **kwargs: Any) -> AsyncIterable[list[StreamingContentMixin] | FunctionResult | list[FunctionResult]]
Parameters
Name | Description |
---|---|
prompt | Required. The prompt to use. |
function_name | The name of the function, optional. Default value: None |
plugin_name | The name of the plugin, optional. Default value: None |
arguments | The arguments to pass to the function(s), optional. Default value: None |
template_format | The format of the prompt template. Default value: semantic-kernel |
return_function_results | If True, the function results are yielded as a list[FunctionResult]. Default value: False |
kwargs | Arguments that can be used instead of supplying KernelArguments. |
Returns
Type | Description |
---|---|
AsyncIterable of list[StreamingContentMixin], FunctionResult, or list[FunctionResult] | The content of the stream of the last function provided. |
invoke_stream
Execute one or more stream functions.
When a list of functions is provided, they are executed in the order given as a pipeline; only the last function is streamed, while the rest are executed as regular function calls.
async invoke_stream(function: KernelFunction | None = None, arguments: KernelArguments | None = None, function_name: str | None = None, plugin_name: str | None = None, metadata: dict[str, Any] = {}, return_function_results: bool = False, **kwargs: Any) -> AsyncGenerator[list[StreamingContentMixin] | FunctionResult | list[FunctionResult], Any]
Parameters
Name | Description |
---|---|
function | The function to execute. This value takes precedence when both it and function_name/plugin_name are supplied; if it is None, function_name and plugin_name are used and cannot be None. Default value: None |
arguments | The arguments to pass to the function(s), optional. Default value: None |
function_name | The name of the function to execute. Default value: None |
plugin_name | The name of the plugin to execute. Default value: None |
metadata | The metadata to pass to the function(s). Default value: {} |
return_function_results | If True, the function results are yielded as a list[FunctionResult] in addition to the streaming content; otherwise only the streaming content is yielded. Default value: False |
kwargs | Arguments that can be used instead of supplying KernelArguments. |
retry_mechanism
Data descriptor used to emit a runtime deprecation warning before accessing a deprecated field.
retry_mechanism: RetryMechanismBase
function_invocation_filters
Filters applied during function invocation, from KernelFilterExtension.
function_invocation_filters: list[tuple[int, Callable[[FILTER_CONTEXT_TYPE, Callable[[FILTER_CONTEXT_TYPE], Awaitable[None]]], Awaitable[None]]]]
prompt_rendering_filters
Filters applied during prompt rendering, from KernelFilterExtension.
prompt_rendering_filters: list[tuple[int, Callable[[FILTER_CONTEXT_TYPE, Callable[[FILTER_CONTEXT_TYPE], Awaitable[None]]], Awaitable[None]]]]
auto_function_invocation_filters
Filters applied during auto function invocation, from KernelFilterExtension.
auto_function_invocation_filters: list[tuple[int, Callable[[FILTER_CONTEXT_TYPE, Callable[[FILTER_CONTEXT_TYPE], Awaitable[None]]], Awaitable[None]]]]
plugins
A dict with the plugins registered with the Kernel, from KernelFunctionExtension.
plugins: dict[str, KernelPlugin]
services
A dict with the services registered with the Kernel, from KernelServicesExtension.
services: dict[str, AIServiceClientBase]
ai_service_selector
The AI service selector to be used by the kernel, from KernelServicesExtension.
ai_service_selector: AIServiceSelector
msg
The deprecation message to be emitted.
wrapped_property
The property instance if the deprecated field is a computed field, or None.
field_name
The name of the field being deprecated.