Update plugins index.md #246

Open · wants to merge 2 commits into base: main
17 changes: 11 additions & 6 deletions semantic-kernel/concepts/plugins/index.md
@@ -156,7 +156,7 @@ public class LightModel

::: zone pivot="programming-language-python"
```python
-from typing import TypedDict, Annotated
+from typing import TypedDict, Annotated, List, Optional

class LightModel(TypedDict):
id: int
@@ -302,8 +302,8 @@ from semantic_kernel.functions import kernel_function
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion
from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
from semantic_kernel.connectors.ai.chat_completion_client_base import ChatCompletionClientBase
-from semantic_kernel.contents.chat_history import ChatHistory
-from semantic_kernel.functions.kernel_arguments import KernelArguments
+from semantic_kernel.contents.chat_history import ChatHistory, ChatMessageContent
+from semantic_kernel.contents.utils.author_role import AuthorRole

from semantic_kernel.connectors.ai.open_ai.prompt_execution_settings.azure_chat_prompt_execution_settings import (
AzureChatPromptExecutionSettings,
@@ -317,7 +317,7 @@ async def main():
chat_completion = AzureChatCompletion(
deployment_name="your_models_deployment_name",
api_key="your_api_key",
-        base_url="your_base_url",
+        endpoint="your_base_url",
)
kernel.add_service(chat_completion)

@@ -333,7 +333,12 @@ async def main():

# Create a history of the conversation
history = ChatHistory()
-    history.add_message("Please turn on the lamp")
+    history.add_message(
+        ChatMessageContent(
+            role=AuthorRole.USER,
+            content="Please turn on the lamp"
+        )
+    )

# Get the response from the AI
result = await chat_completion.get_chat_message_content(
@@ -437,4 +442,4 @@ By storing data locally, you can keep the information private and secure while a
Use one of the techniques described in the [Providing functions return type schema to LLM](./adding-native-plugins.md#provide-function-return-type-information-in-function-description) section to provide the function's return type schema to the AI model.

By utilizing a well-defined return type schema, the AI model can accurately identify the intended properties, eliminating potential inaccuracies that may arise when the model makes assumptions based on incomplete or ambiguous information in the absence of the schema.
-Consequently, this enhances the accuracy of function calls, leading to more reliable and precise outcomes.
+Consequently, this enhances the accuracy of function calls, leading to more reliable and precise outcomes.
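
The diff above replaces a bare-string `history.add_message("...")` call with an explicit `ChatMessageContent` carrying an `AuthorRole`. The pattern can be sketched without Semantic Kernel installed; the classes below are simplified stand-ins for the library's `ChatHistory`, `ChatMessageContent`, and `AuthorRole`, illustrating why a typed message object is preferable to a raw string:

```python
from dataclasses import dataclass
from enum import Enum
from typing import List

class AuthorRole(Enum):
    # Simplified stand-in for semantic_kernel's AuthorRole
    USER = "user"
    ASSISTANT = "assistant"

@dataclass
class ChatMessageContent:
    # Stand-in for semantic_kernel's ChatMessageContent: every message
    # carries an explicit role alongside its text content.
    role: AuthorRole
    content: str

class ChatHistory:
    # Minimal stand-in: stores typed messages rather than bare strings.
    def __init__(self) -> None:
        self.messages: List[ChatMessageContent] = []

    def add_message(self, message: ChatMessageContent) -> None:
        self.messages.append(message)

history = ChatHistory()
history.add_message(
    ChatMessageContent(
        role=AuthorRole.USER,
        content="Please turn on the lamp",
    )
)
print(history.messages[0].role.value)  # → user
```

With the role explicit, downstream code (or the chat service) never has to guess who authored a message, which is what the raw-string form left ambiguous.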
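
The closing paragraphs describe handing the function's return type schema to the AI model. As a rough sketch of what such a schema can contain, the hypothetical helper below (not a Semantic Kernel API) derives a JSON-schema-like description from a `TypedDict` return type; the `name` field is an assumption, since the diff only shows `id: int` of `LightModel`:

```python
from typing import TypedDict, get_type_hints

class LightModel(TypedDict):
    # Mirrors the LightModel shape from the diff; fields beyond `id` are assumed
    id: int
    name: str

def return_type_schema(td: type) -> dict:
    # Hypothetical helper: map a TypedDict's fields to a simple
    # JSON-schema-like object the model can use to interpret results.
    type_names = {int: "integer", str: "string", bool: "boolean"}
    return {
        "type": "object",
        "properties": {
            field: {"type": type_names.get(hint, "object")}
            for field, hint in get_type_hints(td).items()
        },
    }

print(return_type_schema(LightModel))
# {'type': 'object', 'properties': {'id': {'type': 'integer'}, 'name': {'type': 'string'}}}
```

Given a schema like this, the model can reference `id` and `name` by their declared names and types instead of guessing at the structure of the function's output.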