Reputation: 581
Why does OpenAI have 20 different code samples for 20 different models? In any case, I've tried to mimic existing code found on the internet as closely as possible, but I keep getting errors. I realize that there are different endpoints (not that I know what an endpoint is) and that you have to write code that fits a specific endpoint, but I think I've found the correct code for the gpt-4 model and I still cannot get it to work:
model = 'gpt-4'
prompt=f"Translate the following from Ancient Greek into English: {txt}\n",
messages = [{'role':'user','content': prompt}]
obj = client.chat.completions.create(model=model, messages=messages)
I've also tried:
messages = [{'role':'system','content': prompt}]
The error message I'm getting is:
openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid type for 'messages[0].content[0]': expected an object, but got a string instead.", 'type': 'invalid_request_error', 'param': 'messages[0].content[0]', 'code': 'invalid_type'}}
I see very little Python code on the ChatGPT website about what you have to do with different endpoints. I was able to get some other OpenAI code to work using the Babbage model, but it produced bad translations.
Upvotes: 1
Views: 1266
Reputation: 880
Try this:
from openai import OpenAI

# Create the client with your API key
client = OpenAI(
    api_key=API_KEY
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {
            "role": "system",
            "content": "You are a helpful assistant"
        },
        {
            "role": "user",
            "content": "Test"
        }
    ],
    temperature=0.5,
    max_tokens=64,
)

print(response.choices[0].message.content)
This gives me:
Hello! How can I assist you today?
And it should work with all chat completion models.
On further testing of your code, I found the culprit: the trailing comma at the end of the line:
prompt=f"Translate the following from Ancient Greek into English: {txt}\n",
That comma makes prompt a one-element tuple rather than a string. The API then treats content as a list of content parts and expects each part to be an object, but finds a plain string instead, which is exactly what the error message says. If you remove the comma, the error disappears.
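A minimal sketch to illustrate, assuming client and txt are defined as in your code:

# With the trailing comma, Python makes prompt a one-element tuple:
prompt = f"Translate the following from Ancient Greek into English: {txt}\n",
print(type(prompt))   # <class 'tuple'>

# Without the comma, prompt is an ordinary string and the request goes through:
prompt = f"Translate the following from Ancient Greek into English: {txt}\n"
messages = [{'role': 'user', 'content': prompt}]
obj = client.chat.completions.create(model='gpt-4', messages=messages)
print(obj.choices[0].message.content)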
Upvotes: 2