Exploring

Reputation: 3399

Context window length in OpenAI API Codex models

Is the completion window length included in the context window length for OpenAI Codex models?

For da-vinci, the context window length is set to 4000 tokens.

From what I understand, if the prompt length is 3500 tokens, for example, then the remaining 500 tokens are for the completion, and there is no way to use the whole 4000 tokens as the prompt.

I am fairly confident in my understanding, but it would be helpful to have it confirmed by someone knowledgeable.

Upvotes: 0

Views: 1765

Answers (1)

Kane Hooper

Reputation: 1909

The context length for da-vinci is 4096 tokens. The number of prompt tokens plus the max_tokens requested for the response cannot exceed the context length.

This is from the OpenAI API docs:

The token count of your prompt plus max_tokens cannot exceed the model's context length. Most models have a context length of 2048 tokens (except for the newest models, which support 4096).

Ref: https://platform.openai.com/docs/api-reference/completions/create
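In other words, the completion budget is simply whatever is left of the context window after the prompt. A minimal sketch of that arithmetic (the function name and the 4096 default are illustrative, not part of the OpenAI API):

```python
def max_completion_tokens(prompt_tokens: int, context_length: int = 4096) -> int:
    """Return the largest max_tokens value that still fits in the context window.

    prompt_tokens + max_tokens must not exceed context_length, so the
    completion can use at most the remaining tokens.
    """
    remaining = context_length - prompt_tokens
    if remaining <= 0:
        raise ValueError("Prompt alone fills or exceeds the context window")
    return remaining

# With a 3500-token prompt and a 4096-token context window,
# at most 596 tokens remain for the completion:
print(max_completion_tokens(3500))  # → 596
```

So in the asker's example, a 3500-token prompt against a 4096-token window leaves 596 tokens for the completion, and setting max_tokens higher than that will cause the request to be rejected.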

Upvotes: 0
