Reputation: 111
Here is my code
response = openai.ChatCompletion.create(
    engine="XXX",  # The deployment name you chose when you deployed the ChatGPT or GPT-4 model.
    messages=[
        {"role": "system", "content": "Assistant is a large language model trained by OpenAI."},
        {"role": "user", "content": "Calculate the circumference of circle with radius 5?"},
    ],
    functions=[
        {
            "name": "circumference_calc",
            "description": "Calculate the circumference of circle given the radius",
            "parameters": {
                "type": "object",
                "properties": {
                    "radius": {"description": "The radius of the circle", "type": "number"}
                },
            },
            "required": ["radius"],
        }
    ],
)
Running the above code results in:
InvalidRequestError: Unrecognized request argument supplied: functions
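Separately from the API-version problem the answers below address, note that in the OpenAI function-calling schema the `required` key belongs inside the `parameters` object, not alongside it. A corrected definition might look like this (untested against a live deployment; the deployment name remains a placeholder):

```python
# Corrected function schema: "required" sits inside "parameters".
functions = [
    {
        "name": "circumference_calc",
        "description": "Calculate the circumference of a circle given the radius",
        "parameters": {
            "type": "object",
            "properties": {
                "radius": {"type": "number", "description": "The radius of the circle"}
            },
            "required": ["radius"],  # moved inside "parameters"
        },
    }
]
```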
Upvotes: 10
Views: 10298
Reputation: 3075
I posted an example using the API here: Recursive Azure OpenAI Function Calling
// *** Define the Functions ***
string fnTodosPOSTDescription = "Creates a new TODO item. ";
fnTodosPOSTDescription += "Use this function to add ";
fnTodosPOSTDescription += "a new TODO item to the list";
string fnTodosGETDescription = "Retrieves the TODO list. ";
fnTodosGETDescription += "Use this function to view the TODO list.";
string fnTodosDELETEDescription = "Deletes a specific TODO item ";
fnTodosDELETEDescription += "from the list. Use this function to ";
fnTodosDELETEDescription += "remove a TODO item from the list.";

var fnTodosPOST = new FunctionDefinition();
fnTodosPOST.Name = "Todos_POST";
fnTodosPOST.Description = fnTodosPOSTDescription;
fnTodosPOST.Parameters = BinaryData.FromObjectAsJson(new JsonObject
{
    ["type"] = "object",
    ["properties"] = new JsonObject
    {
        ["TodoRequest"] = new JsonObject
        {
            ["type"] = "object",
            ["properties"] = new JsonObject
            {
                ["todo"] = new JsonObject
                {
                    ["type"] = "string",
                    ["description"] = "The TODO item to be added."
                }
            },
            ["required"] = new JsonArray { "todo" }
        }
    },
    ["required"] = new JsonArray { "TodoRequest" }
});
var fnTodosGET = new FunctionDefinition();
fnTodosGET.Name = "Todos_GET";
fnTodosGET.Description = fnTodosGETDescription;
fnTodosGET.Parameters = BinaryData.FromObjectAsJson(new JsonObject
{
    ["type"] = "object",
    ["properties"] = new JsonObject { }
});
var fnTodosDELETE = new FunctionDefinition();
fnTodosDELETE.Name = "Todos_DELETE";
fnTodosDELETE.Description = fnTodosDELETEDescription;
fnTodosDELETE.Parameters = BinaryData.FromObjectAsJson(new JsonObject
{
    ["type"] = "object",
    ["properties"] = new JsonObject
    {
        ["TodoIndexRequest"] = new JsonObject
        {
            ["type"] = "object",
            ["properties"] = new JsonObject
            {
                ["todoIdx"] = new JsonObject
                {
                    ["type"] = "integer",
                    ["description"] = "The index of the TODO item to be deleted."
                }
            },
            ["required"] = new JsonArray { "todoIdx" }
        }
    },
    ["required"] = new JsonArray { "TodoIndexRequest" }
});
// Create a new list of FunctionDefinition objects
List<FunctionDefinition> DefinedFunctions = new List<FunctionDefinition>();
// Add the FunctionDefinition objects to the list
DefinedFunctions.Add(fnTodosPOST);
DefinedFunctions.Add(fnTodosGET);
DefinedFunctions.Add(fnTodosDELETE);
Call Azure OpenAI Service
// Create a new ChatCompletionsOptions object
var chatCompletionsOptions = new ChatCompletionsOptions()
{
    Temperature = (float)0.7,
    MaxTokens = 2000,
    NucleusSamplingFactor = (float)0.95,
    FrequencyPenalty = 0,
    PresencePenalty = 0,
};
chatCompletionsOptions.Functions = DefinedFunctions;
chatCompletionsOptions.FunctionCall = FunctionDefinition.Auto;
// Add the prompt to the chatCompletionsOptions object
foreach (var message in ChatMessages)
{
    chatCompletionsOptions.Messages.Add(message);
}
// Call the GetChatCompletionsAsync method
Response<ChatCompletions> responseWithoutStream =
    await client.GetChatCompletionsAsync(
        DeploymentOrModelName,
        chatCompletionsOptions);
// Get the ChatCompletions object from the response
ChatCompletions result = responseWithoutStream.Value;
// Create a new Message object with the response and other details
// and add it to the messages list
var choice = result.Choices.FirstOrDefault();
if (choice != null && choice.Message != null)
{
    ChatMessages.Add(choice.Message);
}
// Update the total number of tokens used by the API
TotalTokens = TotalTokens + result.Usage.TotalTokens;
Check whether ChatGPT wants to call a function in its response
if (result.Choices.FirstOrDefault().FinishReason == "function_call")
{
    // ChatGPT wants to call a function.
    // To allow ChatGPT to call multiple functions,
    // we need to start a while loop.
    bool FunctionCallingComplete = false;
    while (!FunctionCallingComplete)
    {
        // Call the function
        ChatMessages = ExecuteFunction(result, ChatMessages);
        // *** Call Azure OpenAI Service ***
        // Get a response from ChatGPT
        // (now that it has the results of the function)
        // Create a new ChatCompletionsOptions object
        chatCompletionsOptions = new ChatCompletionsOptions()
        {
            Temperature = (float)0.7,
            MaxTokens = 2000,
            NucleusSamplingFactor = (float)0.95,
            FrequencyPenalty = 0,
            PresencePenalty = 0,
        };
        chatCompletionsOptions.Functions = DefinedFunctions;
        chatCompletionsOptions.FunctionCall = FunctionDefinition.Auto;
        // Add the prompt to the chatCompletionsOptions object
        foreach (var message in ChatMessages)
        {
            chatCompletionsOptions.Messages.Add(message);
        }
        // Call the GetChatCompletionsAsync method
        Response<ChatCompletions> responseWithoutStreamFn =
            await client.GetChatCompletionsAsync(
                DeploymentOrModelName,
                chatCompletionsOptions);
        // Get the ChatCompletions object from the response
        result = responseWithoutStreamFn.Value;
        var FunctionResult = result.Choices.FirstOrDefault();
        // Create a new Message object with the response and other details
        // and add it to the messages list
        if (FunctionResult.Message != null)
        {
            ChatMessages.Add(FunctionResult.Message);
        }
        if (FunctionResult.FinishReason == "function_call")
        {
            // Keep looping
            FunctionCallingComplete = false;
        }
        else
        {
            // Break out of the loop
            FunctionCallingComplete = true;
        }
    }
}
}
catch (Exception ex)
{
    // Set ErrorMessage to the exception message if an error occurs
    ErrorMessage = ex.Message;
}
finally
{
    // Clear the prompt variable
    prompt = "";
    // Set Processing to false to indicate
    // that the method is done processing
    Processing = false;
    // Call StateHasChanged to refresh the UI
    StateHasChanged();
}
Execute a function
private List<ChatMessage> ExecuteFunction(
    ChatCompletions ChatCompletionResult, List<ChatMessage> ParamChatPrompts)
{
    // Get the arguments
    var functionArgs =
        ChatCompletionResult.Choices.FirstOrDefault()
        .Message.FunctionCall.Arguments.ToString();
    // Get the function name
    var functionName =
        ChatCompletionResult.Choices.FirstOrDefault()
        .Message.FunctionCall.Name;
    // Variable to hold the function result
    string functionResult = "";
    // Use a switch statement to call the requested function
    switch (functionName)
    {
        case "Todos_POST":
            var NewTODO =
                JsonSerializer.Deserialize<ToDoAddRequest>(functionArgs);
            if (NewTODO != null)
            {
                functionResult = AddTodo(NewTODO.TodoRequest.todo);
            }
            break;
        case "Todos_GET":
            functionResult = GetTodos();
            break;
        case "Todos_DELETE":
            var DeleteTODO =
                JsonSerializer.Deserialize<ToDoRemoveRequest>(functionArgs);
            if (DeleteTODO != null)
            {
                functionResult =
                    DeleteTodo(DeleteTODO.TodoIndexRequest.todoIdx);
            }
            break;
        default:
            break;
    }
    // Return the results of the function as a "function" role message
    var ChatFunctionMessage = new ChatMessage();
    ChatFunctionMessage.Role = ChatRole.Function;
    ChatFunctionMessage.Content = functionResult;
    ChatFunctionMessage.Name = functionName;
    ParamChatPrompts.Add(ChatFunctionMessage);
    return ParamChatPrompts;
}
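For readers on the Python openai SDK (which the question uses), the same keep-calling-until-the-model-stops loop can be sketched as below. The two callables are hypothetical adapters you would supply yourself (e.g. one wrapping openai.ChatCompletion.create and one dispatching to your local functions); they are not part of either SDK:

```python
def run_with_functions(create_completion, messages, functions, execute_function):
    """Loop until the model stops requesting function calls.

    create_completion(messages, functions) -> response dict
    execute_function(name, arguments_json) -> result string
    Both are caller-supplied adapters (hypothetical, not SDK APIs).
    """
    while True:
        response = create_completion(messages, functions)
        choice = response["choices"][0]
        # Keep the assistant's message (possibly a function_call) in history
        messages.append(choice["message"])
        if choice["finish_reason"] != "function_call":
            return choice["message"]
        call = choice["message"]["function_call"]
        result = execute_function(call["name"], call["arguments"])
        # Feed the function result back with role "function"
        messages.append(
            {"role": "function", "name": call["name"], "content": result}
        )
```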
Upvotes: 0
Reputation: 31
This has started working now. Make sure that your model version is 0613. I have tested with gpt-35-turbo.
import openai
openai.api_type = "azure"
openai.api_key = "XXX"
openai.api_base = "https://XXXX.openai.azure.com/"
openai.api_version = "2023-07-01-preview"
response = openai.ChatCompletion.create(
    engine="XXX",
    messages=[{"role": "user", "content": "what is current date?"}],
    functions=[
        {
            "name": "get_current_date",
            "description": "Get the current date in YYYY-MM-DD format",
            "parameters": {"type": "object", "properties": {}, "required": []},
        }
    ],
)
print(response["choices"][0])
Response:
<OpenAIObject at 0x10d2c1790> JSON: {
  "index": 0,
  "finish_reason": "function_call",
  "message": {
    "role": "assistant",
    "function_call": {
      "name": "get_current_date",
      "arguments": "{}"
    }
  },
  "content_filter_results": {}
}
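After finish_reason comes back as "function_call", you execute the function locally and send its result back to the model with role "function". A minimal sketch of that step (the helper and dispatch table here are illustrative, not part of the SDK):

```python
import json
from datetime import date

def get_current_date():
    # Local implementation backing the function declared to the model
    return date.today().isoformat()

# Hypothetical dispatch table mapping declared names to callables
AVAILABLE_FUNCTIONS = {"get_current_date": get_current_date}

def build_followup_messages(assistant_message):
    """Execute the requested function and build the messages to send back."""
    call = assistant_message["function_call"]
    args = json.loads(call["arguments"] or "{}")
    result = AVAILABLE_FUNCTIONS[call["name"]](**args)
    return [
        assistant_message,  # echo the assistant's function_call message
        {"role": "function", "name": call["name"], "content": result},
    ]
```

These messages are then appended to the conversation and passed to a second openai.ChatCompletion.create call so the model can phrase the final answer.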
Upvotes: 3
Reputation: 767
UPDATE (13. July 2023): Azure now supports function calling in API version 2023-07-01-preview on these models:
gpt-4-0613
gpt-4-32k-0613
gpt-35-turbo-0613
gpt-35-turbo-16k-0613 (thanks Joel)
Only versions gpt-3.5-turbo-0613 and gpt-4-0613 have function calling (ref), and Azure OpenAI Service only supports these for now (ref):
gpt-3.5-turbo-0301
gpt-4-0314
gpt-4-32k-0314
Hopefully, the Azure OpenAI Service team will support the 0613 versions soon.
UPDATE (3. July 2023): Even though 0613 is out now, it seems they still don't support functions as a parameter, as others have pointed out here. I have tested using 2023-06-01-preview, which is supposed to be the newest API interface (according to these specs), with no luck. I also tried version 2023-07-01-preview, which is accepted now, but no luck there either. (This has since changed; see the update at the top.)
Upvotes: 25
Reputation: 1
The SDK reference on Azure's GitHub was updated to reflect the inclusion of function calling in API version 2023-07-01-preview.
From this thread https://github.com/Azure/azure-rest-api-specs/pull/24534 it's clear that, for now, the Azure OpenAI 0613 version does not follow that updated API yet, but it should be available very shortly from what I understand.
Upvotes: 0
Reputation: 33
We are waiting attentively for the new version of the NuGet package: https://github.com/Azure/azure-sdk-for-net/blob/Azure.AI.OpenAI_1.0.0-beta.5/sdk/openai/Azure.AI.OpenAI/README.md
I suppose Azure.AI.OpenAI_1.0.0-beta.6 will include function calling.
Upvotes: 0
Reputation: 11
Although model version 0613 is available now, the API does not support function calling with API version 2023-06-01-preview. Refer to: https://github.com/Azure/azure-rest-api-specs/blob/main/specification/cognitiveservices/data-plane/AzureOpenAI/inference/preview/2023-06-01-preview/inference.json
Upvotes: 1