Aydin Abiar

Reputation: 374

OpenAI: generate longer text with GPT-3

I'm playing with OpenAI's GPT-3 API, but I'm struggling to make the generated text long enough.

Here is my code:

import os
import openai

# export OPENAI_API_KEY='get_key_from_openai'

openai.api_key = os.getenv("OPENAI_API_KEY")

response = openai.Completion.create(
  model="text-davinci-002",
  prompt="How to choose a student loan",
  temperature=0.6,
  max_tokens=512,
  top_p=1,
  frequency_penalty=1,
  presence_penalty=1,
  n=10
)

print(response['choices'][0]['text'])

An example output I have is

"There are a few things to consider when choosing a student loan, including the interest rate, repayment options, and whether the loan is federal or private. You should also compare loans to see which one will cost you the least amount of money in the long run"

However, the output is only ~50 words, which should be close to 80-100 tokens — far short of the 512 I requested. I also thought the n parameter was supposed to produce n generated texts?
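For what it's worth, my understanding is that n returns n independent completions, all stored in response['choices'], so printing choices[0] only shows the first one. A minimal sketch of iterating over them — using a hand-made dict in place of a real API response, which (as far as I know) has the same structure:

```python
# Mock of the response returned by openai.Completion.create with n=2.
# A real call returns an object with the same 'choices' structure;
# this dict is only for illustration.
response = {
    "choices": [
        {"index": 0, "text": "First completion..."},
        {"index": 1, "text": "Second completion..."},
    ]
}

# Each entry in 'choices' is a separate, independent completion.
for choice in response["choices"]:
    print(choice["index"], choice["text"])
```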

Can someone explain how to make the generated text longer (ideally ~1000 tokens)? Some Hugging Face models have a min_tokens parameter, but I couldn't find one here.

Thanks a lot

Upvotes: 4

Views: 6025

Answers (1)

Aydin Abiar

Reputation: 374

From the OpenAI docs:

Note: There is not currently a way to set a minimum number of tokens.

Source: https://help.openai.com/en/articles/5072518-controlling-the-length-of-completions

A way I found is to create a while loop until the generated text is long enough...

Let's say I want at least 1000 characters; then my loop would be:

full_text = "How to choose a student loan ?"

# Keep requesting completions until the accumulated text is long enough
while len(full_text) < 1000:
  response = openai.Completion.create(
    model="text-ada-001",
    prompt=full_text,
    temperature=0.6,
    max_tokens=300,
    top_p=1,
    frequency_penalty=1,
    presence_penalty=1,
  )

  # Append the new completion and feed the whole text back as the next prompt
  full_text = full_text + response['choices'][0]['text']

print(full_text)

No need to add '\n' between the texts, since the API already adds them conveniently in the response.
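If you'd rather stop on a token count than a character count, the same loop can be driven by an approximate token counter. The sketch below stubs out the API call so the control flow is self-contained and runnable — fake_completion is a placeholder I made up, not a real OpenAI function, and the one-token-per-word heuristic is only a rough approximation (a real tokenizer such as tiktoken would be more accurate):

```python
def fake_completion(prompt):
    # Placeholder standing in for openai.Completion.create; a real call
    # would return the model's continuation of `prompt`.
    return " Lorem ipsum dolor sit amet." * 5

def rough_token_count(text):
    # Crude heuristic: ~1 token per whitespace-separated word.
    return len(text.split())

full_text = "How to choose a student loan ?"

# Keep appending completions until we have roughly 1000 tokens of text.
while rough_token_count(full_text) < 1000:
    full_text += fake_completion(full_text)
```

Swapping fake_completion for the real API call gives the same loop as above, just with a token-based stopping condition instead of a character-based one.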

Upvotes: 2
