James

Reputation: 526

OpenAI API: Can I remove the line break from the response with a parameter?

I've started using the OpenAI API in R. I downloaded the openai package. I keep getting a double line break in the text response. Here's an example of my code:


library(openai)

vector = create_completion(
  model = "text-davinci-003",
  prompt = "Tell me what the weather is like in London, UK, in Celsius in 5 words.",
  max_tokens = 20,
  temperature = 0,
  echo = FALSE
)


vector_2 = vector$choices[1]

vector_2$text


[1] "\n\nRainy, mild, cool, humid."

Is there a way to get rid of this without 'correcting' the response text using other functions?

Upvotes: 4

Views: 6165

Answers (4)

Avirup Ghosh

Reputation: 31

Save the response in a variable and split it on the line break. For example, if you save the GPT response in an "answer" variable:

answer = chain.run("France")
print(answer.split("\n\n")[-1])
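
In R, a similar post-processing approach (a minimal sketch, reusing the question's vector_2$text value) would be:

# Split on the double line break and keep the last piece (base R)
answer <- vector_2$text
tail(strsplit(answer, "\n\n", fixed = TRUE)[[1]], 1)
#> [1] "Rainy, mild, cool, humid."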

Upvotes: 0

Biology Blogger

Reputation: 1

You should add a restart sequence so the language model knows better how to begin its answer. The restart sequence may be a single escaped line break (\n) or, at the very least, a white space.

Because the model is doing text completion, it does not want to place its text immediately after the input text, as an answer is not usually appended to the same line as the question.
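
A minimal sketch of what this could look like in R, assuming you simply append a restart sequence such as "Answer:" to the prompt yourself (whether this actually suppresses the leading line breaks is not guaranteed):

# Prompt ends with a restart sequence so the model is nudged to
# continue on the same line instead of opening with "\n\n"
vector = create_completion(
  model = "text-davinci-003",
  prompt = "Tell me what the weather is like in London, UK, in Celsius in 5 words.\nAnswer:",
  max_tokens = 20,
  temperature = 0,
  echo = FALSE
)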

Upvotes: 0

CryptoDevWill

Reputation: 349

There is a way to take out the line breaks in the OpenAI response. Add this parameter to your function call:

stop=["\n"]

completion = openai.Completion.create(
    engine="text-davinci-002",
    prompt=prompt,
    max_tokens=1024,
    n=1,
    stop=["\n"],
    temperature=0.7,
)

This will take out the line breaks that come with the response. Hope that helps!
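
If you want to try the same idea from the question's R setup, here is a sketch that assumes the openai package's create_completion() forwards a stop argument to the API (check the package documentation to confirm):

# Sketch: pass a stop sequence through the R wrapper
vector = create_completion(
  model = "text-davinci-003",
  prompt = "Tell me what the weather is like in London, UK, in Celsius in 5 words.",
  max_tokens = 20,
  temperature = 0,
  stop = "\n"
)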

Upvotes: 0

Rok Benko

Reputation: 22920

No, it's not possible.

The OpenAI API returns the completion with a starting \n\n by default. There's no parameter for the Completions endpoint to control this.

You need to remove the line breaks manually.
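
In R, for example, a single base function call on the question's vector_2 object is enough (a sketch, no extra packages needed):

# Strip leading/trailing whitespace, including "\n", from the completion text
trimws(vector_2$text)
#> [1] "Rainy, mild, cool, humid."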

An example response looks like this:

{
  "id": "cmpl-uqkvlQyYK7bGYrRHQ0eXlWi7",
  "object": "text_completion",
  "created": 1589478378,
  "model": "text-davinci-003",
  "choices": [
    {
      "text": "\n\nThis is indeed a test",
      "index": 0,
      "logprobs": null,
      "finish_reason": "length"
    }
  ],
  "usage": {
    "prompt_tokens": 5,
    "completion_tokens": 7,
    "total_tokens": 12
  }
}

Upvotes: 4
