
Repo sync for protected branch #276

Open · wants to merge 9 commits into main
15 changes: 15 additions & 0 deletions semantic-kernel/Frameworks/agent/agent-orchestration/index.md
@@ -97,6 +97,21 @@ await runtime.stop_when_idle()

::: zone-end

::: zone pivot="programming-language-csharp"

## Preparing Your Development Environment

Add the following packages to your project before you proceed:

```pwsh
dotnet add package Microsoft.SemanticKernel.Agents.Orchestration --prerelease
dotnet add package Microsoft.SemanticKernel.Agents.Runtime.InProcess --prerelease
```

Depending on the agent types you use, you may also need to add the corresponding agent packages. Refer to the [Agents Overview](../agent-architecture.md#agent-types-in-semantic-kernel) for more details.

::: zone-end

## Next steps

> [!div class="nextstepaction"]
@@ -290,7 +290,7 @@ await thread.delete() if thread else None
## Deleting an `OpenAIAssistantAgent`

Since the assistant's definition is stored remotely, it will persist if not deleted.
Deleting an assistant definition may be performed directly with the `AssistantClient`.
Deleting an assistant definition may be performed directly with the client.

> Note: Attempting to use an agent instance after it has been deleted will result in a service exception.

@@ -308,9 +308,7 @@ await client.DeleteAssistantAsync("<assistant id>");
::: zone pivot="programming-language-python"

```python
await agent.delete()

is_deleted = agent._is_deleted
await client.beta.assistants.delete(agent.id)
```

::: zone-end
8 changes: 8 additions & 0 deletions semantic-kernel/Frameworks/process/process-best-practices.md
@@ -22,6 +22,14 @@ Organizing your project files in a logical and maintainable structure is crucial

An organized structure not only simplifies navigation within the project but also enhances code reusability and facilitates collaboration among team members.

### Kernel Instance Isolation

> [!IMPORTANT]
> Do not share a single Kernel instance between the main Process Framework and any of its dependencies (such as agents, tools, or external services).

Sharing a Kernel across these components can result in unexpected recursive invocation patterns, including infinite loops, as functions registered in the Kernel may inadvertently invoke each other. For example, a Step may call a function that triggers an agent, which then re-invokes the same function, creating a non-terminating loop.

To avoid this, instantiate separate Kernel objects for each independent agent, tool, or service used within your process. This ensures isolation between the Process Framework’s own functions and those required by dependencies, and prevents cross-invocation that could destabilize your workflow. This requirement reflects a current architectural constraint and may be revisited as the framework evolves.
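
As an illustration, here is a minimal Python sketch of the isolation pattern described above. It is not code from this guide: the `ChatCompletionAgent` usage, the service IDs, and the assumption that Azure OpenAI settings come from environment variables are all illustrative.

```python
from semantic_kernel import Kernel
from semantic_kernel.agents import ChatCompletionAgent
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion

# Kernel dedicated to the process itself; only process-level functions are registered here.
process_kernel = Kernel()
process_kernel.add_service(AzureChatCompletion(service_id="process"))

# A separate kernel for an agent used inside a step, so the agent's
# function-calling loop can never re-enter the process's own functions.
agent_kernel = Kernel()
agent_kernel.add_service(AzureChatCompletion(service_id="agent"))

reviewer_agent = ChatCompletionAgent(
    kernel=agent_kernel,
    name="Reviewer",
    instructions="Review the output produced by the current process step.",
)
```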

### Common Pitfalls
To ensure smooth implementation and operation of the Process Framework, be mindful of these common pitfalls:
40 changes: 19 additions & 21 deletions semantic-kernel/concepts/planning.md
@@ -74,6 +74,9 @@ To use automatic function calling in Semantic Kernel, you need to do the following
2. Create an execution settings object that tells the AI to automatically call functions
3. Invoke the chat completion service with the chat history and the kernel

> [!TIP]
> The following code sample uses the `LightsPlugin` defined [here](./plugins/adding-native-plugins.md#defining-a-plugin-using-a-class).

::: zone pivot="programming-language-csharp"

```csharp
@@ -127,31 +130,25 @@ do {
import asyncio

from semantic_kernel import Kernel
from semantic_kernel.functions import kernel_function
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.chat_completion_client_base import ChatCompletionClientBase
from semantic_kernel.contents.chat_history import ChatHistory
from semantic_kernel.functions.kernel_arguments import KernelArguments

from semantic_kernel.connectors.ai.open_ai.prompt_execution_settings.azure_chat_prompt_execution_settings import (
from semantic_kernel.connectors.ai.open_ai import (
AzureChatCompletion,
AzureChatPromptExecutionSettings,
)
from semantic_kernel.contents import ChatHistory
from semantic_kernel.functions import kernel_function

async def main():
# 1. Create the kernel with the Lights plugin
kernel = Kernel()
kernel.add_service(AzureChatCompletion(
deployment_name="your_models_deployment_name",
api_key="your_api_key",
base_url="your_base_url",
))
kernel.add_service(AzureChatCompletion())
kernel.add_plugin(
LightsPlugin(),
plugin_name="Lights",
)

chat_completion : AzureChatCompletion = kernel.get_service(type=ChatCompletionClientBase)
chat_completion: AzureChatCompletion = kernel.get_service(type=ChatCompletionClientBase)

# 2. Enable automatic function calling
execution_settings = AzureChatPromptExecutionSettings()
@@ -173,12 +170,11 @@ async def main():
history.add_user_message(userInput)

# 3. Get the response from the AI with automatic function calling
result = (await chat_completion.get_chat_message_contents(
result = await chat_completion.get_chat_message_content(
chat_history=history,
settings=execution_settings,
kernel=kernel,
arguments=KernelArguments(),
))[0]
)

# Print the results
print("Assistant > " + str(result))
@@ -263,14 +259,16 @@ if __name__ == "__main__":

When you use automatic function calling, all of the steps in the automatic planning loop are handled for you and added to the `ChatHistory` object. After the function calling loop is complete, you can inspect the `ChatHistory` object to see all of the function calls made and results provided by Semantic Kernel.
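
For example, in Python you can walk the history and print the recorded calls. This is a minimal sketch that assumes the `FunctionCallContent` and `FunctionResultContent` item types from `semantic_kernel.contents`; adapt it to your own logging needs.

```python
from semantic_kernel.contents import ChatHistory, FunctionCallContent, FunctionResultContent

def print_function_calls(history: ChatHistory) -> None:
    """Print every function call and result that the automatic planning loop added to the history."""
    for message in history.messages:
        for item in message.items:
            if isinstance(item, FunctionCallContent):
                print(f"Call: {item.plugin_name}-{item.function_name}({item.arguments})")
            elif isinstance(item, FunctionResultContent):
                print(f"Result: {item.result}")
```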

## What about the Function Calling Stepwise and Handlebars planners?
## What happened to the Stepwise and Handlebars planners?

The Stepwise and Handlebars planners have been deprecated and removed from the Semantic Kernel package. These planners are no longer supported in Python, .NET, or Java.

The Stepwise and Handlebars planners are still available in Semantic Kernel. However, we recommend using function calling for most tasks as it is more powerful and easier to use. Both the Stepwise and Handlebars planners will be deprecated in a future release of Semantic Kernel.
We recommend using **function calling**, which is both more powerful and easier to use for most scenarios.

Learn how to [migrate Stepwise Planner to Auto Function Calling](../support/migration/stepwise-planner-migration-guide.md).
To update existing solutions, follow our [Stepwise Planner Migration Guide](../support/migration/stepwise-planner-migration-guide.md).

> [!CAUTION]
> If you are building a new AI agent, we recommend that you _not_ use the Stepwise or Handlebars planners. Instead, use function calling as it is more powerful and easier to use.
> [!TIP]
> For new AI agents, use function calling instead of the deprecated planners. It offers better flexibility, built-in tool support, and a simpler development experience.

## Next steps
Now that you understand how planners work in Semantic Kernel, you can learn more about how to influence your AI agents so that they best plan and execute tasks on behalf of your users.
23 changes: 15 additions & 8 deletions semantic-kernel/concepts/plugins/adding-native-plugins.md
@@ -34,6 +34,9 @@ Below, we'll walk through the two different ways of providing your AI agent with

The easiest way to create a native plugin is to start with a class and then add methods annotated with the `KernelFunction` attribute. It is also recommended to liberally use the `Description` annotation to provide the AI agent with the necessary information to understand the function.

> [!TIP]
> The following `LightsPlugin` uses the `LightModel` defined [here](./index.md#1-define-your-plugin).

::: zone pivot="programming-language-csharp"

```csharp
@@ -81,22 +84,24 @@ public class LightsPlugin
::: zone pivot="programming-language-python"

```python
from typing import List, Optional, Annotated
from typing import Annotated

from semantic_kernel.functions import kernel_function

class LightsPlugin:
def __init__(self, lights: List[LightModel]):
def __init__(self, lights: list[LightModel]):
self._lights = lights

@kernel_function
async def get_lights(self) -> List[LightModel]:
async def get_lights(self) -> list[LightModel]:
"""Gets a list of lights and their current state."""
return self._lights

@kernel_function
async def change_state(
self,
change_state: LightModel
) -> Optional[LightModel]:
) -> LightModel | None:
"""Changes the state of the light."""
for light in self._lights:
if light["id"] == change_state["id"]:
@@ -538,22 +543,24 @@ This approach eliminates the need to manually provide and update the return type
When creating a plugin in Python, you can provide additional information about the functions in the `kernel_function` decorator. This information will be used by the AI agent to understand the functions better.

```python
from typing import List, Optional, Annotated
from typing import Annotated

from semantic_kernel.functions import kernel_function

class LightsPlugin:
def __init__(self, lights: List[LightModel]):
def __init__(self, lights: list[LightModel]):
self._lights = lights

@kernel_function(name="GetLights", description="Gets a list of lights and their current state")
async def get_lights(self) -> List[LightModel]:
async def get_lights(self) -> list[LightModel]:
"""Gets a list of lights and their current state."""
return self._lights

@kernel_function(name="ChangeState", description="Changes the state of the light")
async def change_state(
self,
change_state: LightModel
) -> Optional[LightModel]:
) -> LightModel | None:
"""Changes the state of the light."""
for light in self._lights:
if light["id"] == change_state["id"]: