Update Azure.AI.OpenAI to 2.1.0 and add tests #150

Merged · 22 commits · Apr 30, 2025
11 changes: 10 additions & 1 deletion CHANGELOG.md
@@ -11,9 +11,18 @@ Starting v0.1.0 for Microsoft.Azure.WebJobs.Extensions.OpenAI.AzureAISearch, it

## v0.19.0 - Unreleased

### Breaking

- Renamed model properties to `chatModel` and `embeddingsModel` in the AssistantPost, Embeddings, and TextCompletion bindings.
- Renamed `connectionName` to `searchConnectionName` in the SemanticSearch binding.
- Renamed `connectionName` to `storeConnectionName` in the EmbeddingsStore binding.
- Renamed the ChatMessage entity to AssistantMessage.
- Added managed identity support through a configuration section and the binding parameter `aiConnectionName` in the AssistantPost, Embeddings, EmbeddingsStore, SemanticSearch, and TextCompletion bindings.

### Changed

- Updated Azure.Data.Tables from 12.9.1 to 12.10.0, Azure.Identity from 1.12.1 to 1.13.2, Microsoft.Extensions.Azure from 1.7.5 to 1.8.0
- Updated Azure.AI.OpenAI from 1.0.0-beta.15 to 2.1.0
- Updated Azure.Data.Tables from 12.9.1 to 12.10.0, Azure.Identity from 1.12.1 to 1.13.2, Microsoft.Extensions.Azure from 1.7.5 to 1.10.0

## v0.18.0 - 2024/10/08

7 changes: 7 additions & 0 deletions OpenAI-Extension.sln
@@ -118,6 +118,8 @@ Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "KustoSearchLegacy", "sample
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "TextCompletionLegacy", "samples\textcompletion\csharp-legacy\TextCompletionLegacy.csproj", "{73C26271-7B8D-4B38-B454-D9902B92270F}"
EndProject
Project("{FAE04EC0-301F-11D3-BF4B-00C04F79EFBC}") = "WebJobsOpenAIUnitTests", "tests\UnitTests\WebJobsOpenAIUnitTests.csproj", "{52337999-5676-27FD-AE3D-CAAE406AE337}"
EndProject
Global
GlobalSection(SolutionConfigurationPlatforms) = preSolution
Debug|Any CPU = Debug|Any CPU
@@ -216,6 +218,10 @@ Global
{73C26271-7B8D-4B38-B454-D9902B92270F}.Debug|Any CPU.Build.0 = Debug|Any CPU
{73C26271-7B8D-4B38-B454-D9902B92270F}.Release|Any CPU.ActiveCfg = Release|Any CPU
{73C26271-7B8D-4B38-B454-D9902B92270F}.Release|Any CPU.Build.0 = Release|Any CPU
{52337999-5676-27FD-AE3D-CAAE406AE337}.Debug|Any CPU.ActiveCfg = Debug|Any CPU
{52337999-5676-27FD-AE3D-CAAE406AE337}.Debug|Any CPU.Build.0 = Debug|Any CPU
{52337999-5676-27FD-AE3D-CAAE406AE337}.Release|Any CPU.ActiveCfg = Release|Any CPU
{52337999-5676-27FD-AE3D-CAAE406AE337}.Release|Any CPU.Build.0 = Release|Any CPU
EndGlobalSection
GlobalSection(SolutionProperties) = preSolution
HideSolutionNode = FALSE
@@ -258,6 +264,7 @@ Global
{BB20B2C0-35AE-CC75-61CE-C2713C83E4F6} = {4F208B03-C74C-458B-8300-9FF82F3C6325}
{C5ACB61F-CDAC-E93F-895E-4D46F086DE61} = {B87CBFFB-8221-45E8-8631-4BB1685D50F0}
{73C26271-7B8D-4B38-B454-D9902B92270F} = {5315AE81-BC33-49F2-935B-69287BB44CBD}
{52337999-5676-27FD-AE3D-CAAE406AE337} = {B99948E2-9853-4D9C-B784-19C2503CDA8B}
EndGlobalSection
GlobalSection(ExtensibilityGlobals) = postSolution
SolutionGuid = {05A7A679-3A53-45B5-AE93-4313655E127D}
75 changes: 66 additions & 9 deletions README.md
@@ -29,18 +29,75 @@ Add following section to `host.json` of the function app for non dotnet languages

## Requirements

* [.NET 6 SDK](https://dotnet.microsoft.com/download/dotnet/6.0) or greater (Visual Studio 2022 recommended)
* [.NET 8 SDK](https://dotnet.microsoft.com/download/dotnet/8.0) or greater (Visual Studio 2022 recommended)
* [Azure Functions Core Tools v4.x](https://learn.microsoft.com/azure/azure-functions/functions-run-local?tabs=v4%2Cwindows%2Cnode%2Cportal%2Cbash)
* Azure Storage emulator such as [Azurite](https://learn.microsoft.com/azure/storage/common/storage-use-azurite) running in the background
* The target language runtime (e.g. dotnet, nodejs, powershell, python, java) installed on your machine. Refer to the official supported versions.
* Update settings in Azure Function or the `local.settings.json` file for local development with the following keys:
1. For Azure, `AZURE_OPENAI_ENDPOINT` - set to the [Azure OpenAI resource](https://learn.microsoft.com/azure/ai-services/openai/how-to/create-resource?pivots=web-portal) endpoint (e.g. `https://***.openai.azure.com/`).
1. For Azure, assign the user or function app managed identity the `Cognitive Services OpenAI User` role on the Azure OpenAI resource. It is strongly recommended to use managed identity to avoid the overhead of secrets maintenance; however, if key-based authentication is needed, add the setting `AZURE_OPENAI_KEY` and its value in the settings.
1. For non-Azure, `OPENAI_API_KEY` - an OpenAI account and an [API key](https://platform.openai.com/account/api-keys) saved into a setting.
If using environment variables, learn more in the [.env readme](./env/README.md).
1. Update the `CHAT_MODEL_DEPLOYMENT_NAME` and `EMBEDDING_MODEL_DEPLOYMENT_NAME` keys to Azure deployment names, or override the default OpenAI model names.
1. If using a user-assigned managed identity, add `AZURE_CLIENT_ID` to the environment variable settings with the client ID of the managed identity as its value.
1. Visit the binding-specific samples README for additional settings that might be required for each binding.
* Azure Storage emulator such as [Azurite](https://learn.microsoft.com/azure/storage/common/storage-use-azurite) running in the background
* The target language runtime (e.g. dotnet, nodejs, powershell, python, java) installed on your machine. Refer to the official supported versions.
1. Refer to [Configuring AI Service Connections](#configuring-ai-service-connections) for the required settings.

## Configuring AI Service Connections

The Azure Functions OpenAI Extension provides flexible options for configuring connections to AI services through the `AIConnectionName` property in the AssistantPost, TextCompletion, SemanticSearch, EmbeddingsStore, and Embeddings bindings.

### Managed Identity Role

It is strongly recommended to use managed identity; ensure the user or the function app's managed identity has the `Cognitive Services OpenAI User` role on the Azure OpenAI resource.

### AIConnectionName Property

The optional `AIConnectionName` property specifies the name of a configuration section that contains connection details for the AI service:

#### For Azure OpenAI Service

* If specified, the extension looks for `Endpoint` and `Key` values in the named configuration section.
* If not specified, or if the configuration section doesn't exist, the extension falls back to environment variables:
  * `AZURE_OPENAI_ENDPOINT` and/or
  * `AZURE_OPENAI_KEY`
* For user-assigned managed identity authentication, a configuration section is required, for example:

```json
"<ConnectionNamePrefix>__endpoint": "Placeholder for the Azure OpenAI endpoint value",
"<ConnectionNamePrefix>__credential": "managedidentity",
"<ConnectionNamePrefix>__managedIdentityResourceId": "Resource Id of managed identity",
"<ConnectionNamePrefix>__managedIdentityClientId": "Client Id of managed identity"
```

* Only one of `managedIdentityResourceId` or `managedIdentityClientId` should be specified, not both.
* If no resource ID or client ID is specified, the system-assigned managed identity is used by default.
* Pass the configured `ConnectionNamePrefix` value (for example, `AzureOpenAI`) to the `AIConnectionName` property; see the sketch after this list.
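
For illustration, a connection that uses a user-assigned managed identity with the prefix `AzureOpenAI` might look like the following (the endpoint and client ID values are placeholders):

```json
"AzureOpenAI__endpoint": "Placeholder for the Azure OpenAI endpoint value",
"AzureOpenAI__credential": "managedidentity",
"AzureOpenAI__managedIdentityClientId": "Placeholder for the client Id of the user-assigned managed identity"
```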

#### For OpenAI Service (non-Azure)

* Set the `OPENAI_API_KEY` environment variable
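
As a minimal sketch for local development (the values are placeholders; the non-OpenAI entries are the standard Azure Functions local settings, including the Azurite storage connection mentioned above), `local.settings.json` might look like:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "OPENAI_API_KEY": "Placeholder for the OpenAI API key value",
    "CHAT_MODEL_DEPLOYMENT_NAME": "Placeholder for the chat model name"
  }
}
```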

### Configuration Examples

#### Example: Using a configuration section

In `local.settings.json` or app environment variables:

```json
"AzureOpenAI__endpoint": "Placeholder for the Azure OpenAI endpoint value",
"AzureOpenAI__credential": "managedidentity",
```

Specifying `credential` is optional for a system-assigned managed identity.

Function usage example:

```csharp
[Function(nameof(PostUserResponse))]
public static IActionResult PostUserResponse(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = "chats/{chatId}")] HttpRequestData req,
    string chatId,
    [AssistantPostInput("{chatId}", "{Query.message}", AIConnectionName = "AzureOpenAI", ChatModel = "%CHAT_MODEL_DEPLOYMENT_NAME%", ChatStorageConnectionSetting = DefaultChatStorageConnectionSetting, CollectionName = DefaultCollectionName)] AssistantState state)
{
    return new OkObjectResult(state.RecentMessages.LastOrDefault()?.Content ?? "No response returned.");
}
```
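
Because `AIConnectionName` is optional, omitting it in the attribute above makes the extension fall back to the `AZURE_OPENAI_ENDPOINT`/`AZURE_OPENAI_KEY` environment variables described earlier.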

## Features

@@ -270,7 +327,7 @@ public static async Task<EmbeddingsStoreOutputResponse> IngestFile(

public class EmbeddingsStoreOutputResponse
{
    [EmbeddingsStoreOutput("{url}", InputType.Url, "AISearchEndpoint", "openai-index", Model = "%EMBEDDING_MODEL_DEPLOYMENT_NAME%")]
    [EmbeddingsStoreOutput("{url}", InputType.Url, "AISearchEndpoint", "openai-index", EmbeddingsModel = "%EMBEDDING_MODEL_DEPLOYMENT_NAME%")]
    public required SearchableDocument SearchableDocument { get; init; }

    public IActionResult? HttpResponse { get; set; }
5 changes: 5 additions & 0 deletions eng/ci/templates/build-local.yml
@@ -20,8 +20,13 @@ jobs:
      dotnet build $(System.DefaultWorkingDirectory)/src/WebJobs.Extensions.OpenAI.Kusto/WebJobs.Extensions.OpenAI.Kusto.csproj --configuration $(config) -p:Version=$(fakeWebJobsPackageVersion) -p:AzureAISearchVersion=$(fakeWebJobsPackageVersion) -p:KustoVersion=$(fakeWebJobsPackageVersion)
      dotnet build $(System.DefaultWorkingDirectory)/src/WebJobs.Extensions.OpenAI.AzureAISearch/WebJobs.Extensions.OpenAI.AzureAISearch.csproj --configuration $(config) -p:Version=$(fakeWebJobsPackageVersion) -p:AzureAISearchVersion=$(fakeWebJobsPackageVersion)
      dotnet build $(System.DefaultWorkingDirectory)/src/WebJobs.Extensions.OpenAI.CosmosDBSearch/WebJobs.Extensions.OpenAI.CosmosDBSearch.csproj --configuration $(config) -p:Version=$(fakeWebJobsPackageVersion) -p:CosmosDBSearchVersion=$(fakeWebJobsPackageVersion)
      dotnet build $(System.DefaultWorkingDirectory)/tests/UnitTests/WebJobsOpenAIUnitTests.csproj --configuration $(config) -p:WebJobsVersion=$(fakeWebJobsPackageVersion) -p:Version=$(fakeWebJobsPackageVersion)
    displayName: Dotnet Build WebJobs.Extensions.OpenAI

  - script: |
      dotnet test $(System.DefaultWorkingDirectory)/tests/UnitTests/WebJobsOpenAIUnitTests.csproj --configuration $(config) --collect "Code Coverage" --no-build
    displayName: Dotnet Test WebJobsOpenAIUnitTests

- task: CopyFiles@2
displayName: 'Copy NuGet WebJobs.Extensions.OpenAI to local directory'
inputs:
10 changes: 9 additions & 1 deletion java-library/CHANGELOG.md
@@ -7,9 +7,17 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## v0.5.0 - Unreleased

### Breaking

- Renamed model properties to `chatModel` and `embeddingsModel` in the assistantPost, embeddings, and textCompletion bindings.
- Renamed `connectionName` to `searchConnectionName` in the semanticSearch binding.
- Renamed `connectionName` to `storeConnectionName` in the embeddingsStore binding.
- Renamed ChatMessage to `AssistantMessage`.
- Added managed identity support through a configuration section and the binding parameter `aiConnectionName` in the assistantPost, embeddings, embeddingsStore, semanticSearch, and textCompletion bindings.

### Changed

- Update azure-ai-openai from 1.0.0-beta.11 to 1.0.0-beta.14
- Update azure-ai-openai from 1.0.0-beta.11 to 1.0.0-beta.16

## v0.4.0 - 2024/10/08

2 changes: 1 addition & 1 deletion java-library/pom.xml
@@ -88,7 +88,7 @@
<dependency>
<groupId>com.azure</groupId>
<artifactId>azure-ai-openai</artifactId>
<version>1.0.0-beta.14</version>
<version>1.0.0-beta.16</version>
<scope>compile</scope>
</dependency>

@@ -11,35 +11,23 @@
* Chat Message Entity which contains the content of the message, the role of the chat agent, and the name of the calling function if applicable.
* </p>
*/
public class ChatMessage {
public class AssistantMessage {

private String content;
private String role;
private String name;
private String toolCalls;

/**
* Initializes a new instance of the ChatMessage class.
* Initializes a new instance of the AssistantMessage class.
*
* @param content The content of the message.
* @param role The role of the chat agent.
*/
public ChatMessage(String content, String role) {
this.content = content;
this.role = role;
}


/**
* Initializes a new instance of the ChatMessage class.
*
* @param content The content of the message.
* @param role The role of the chat agent.
* @param name The name of the calling function if applicable.
* @param content The content of the message.
* @param role The role of the chat agent.
* @param toolCalls The toolCalls of the calling function if applicable.
*/
public ChatMessage(String content, String role, String name) {
public AssistantMessage(String content, String role, String toolCalls) {
this.content = content;
this.role = role;
this.name = name;
this.toolCalls = toolCalls;
}

/**
@@ -79,21 +67,20 @@ public void setRole(String role) {
}

/**
* Gets the name of the calling function if applicable.
* Gets the toolCalls of the calling function if applicable.
*
* @return The name of the calling function if applicable.
* @return The toolCalls of the calling function if applicable.
*/
public String getName() {
return name;
public String getToolCalls() {
return toolCalls;
}

/**
* Sets the name of the calling function if applicable.
* Sets the toolCalls of the calling function if applicable.
*
* @param name The name of the calling function if applicable.
* @param toolCalls The toolCalls of the calling function if applicable.
*/
public void setName(String name) {
this.name = name;
public void setToolCalls(String toolCalls) {
this.toolCalls = toolCalls;
}
}

@@ -61,7 +61,7 @@
*
* @return The OpenAI chat model to use.
*/
String model();
String chatModel();

/**
* The user message that user has entered for assistant to respond to.
@@ -70,6 +70,14 @@
*/
String userMessage();

/**
* The name of the configuration section for AI service connectivity settings.
*
* @return The name of the configuration section for AI service connectivity
* settings.
*/
String aiConnectionName() default "";

/**
* The configuration section name for the table settings for assistant chat
* storage.
@@ -86,4 +94,39 @@
* returns {@code DEFAULT_COLLECTION}.
*/
String collectionName() default DEFAULT_COLLECTION;

/**
* The sampling temperature to use, between 0 and 2. Higher values like 0.8 will
* make the output
* more random, while lower values like 0.2 will make it more focused and
* deterministic.
* It's generally recommended to use this or {@link #topP()} but not both.
*
* @return The sampling temperature value.
*/
String temperature() default "0.5";

/**
* An alternative to sampling with temperature, called nucleus sampling, where
* the model considers
* the results of the tokens with top_p probability mass. So 0.1 means only the
* tokens comprising the top 10%
* probability mass are considered.
* It's generally recommended to use this or {@link #temperature()} but not
* both.
*
* @return The topP value.
*/
String topP() default "";

/**
* The maximum number of tokens to generate in the completion.
* The token count of your prompt plus max_tokens cannot exceed the model's
* context length.
* Most models have a context length of 2048 tokens (except for the newest
* models, which support 4096).
*
* @return The maxTokens value.
*/
String maxTokens() default "100";
}