Joseph J

Reputation: 21

Understanding FewShotPromptTemplate

I want to understand what happens when we run a FewShotPromptTemplate with a list of examples (10 in my real code; two are shown here):

examples = [
    {
        "query": "What's the weather like?",
        "answer": "It's raining cats and dogs, better bring an umbrella!"
    }, {
        "query": "How old are you?",
        "answer": "Age is just a number, but I'm timeless."
    }
]

And then we call:

few_shot_prompt_template = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    prefix=prefix,
    suffix=suffix,
    input_variables=["query"],
    example_separator="\n\n"
)
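
For reference, example_prompt, prefix and suffix are defined along these lines (a simplified sketch; the exact template text is just illustrative):

from langchain.prompts import FewShotPromptTemplate, PromptTemplate

# formats a single example as a user/AI exchange
example_prompt = PromptTemplate(
    input_variables=["query", "answer"],
    template="User: {query}\nAI: {answer}"
)

# prefix goes before the examples; suffix holds the new query at the end
prefix = "The following are excerpts from conversations with an AI assistant:"
suffix = "User: {query}\nAI: "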

and pass it to a chain like this:

from langchain.chat_models import ChatOpenAI
from langchain import LLMChain

# load the model
chat = ChatOpenAI(model_name="gpt-4", temperature=0.0)

# build a chain that sends the rendered few-shot prompt to the model
chain = LLMChain(llm=chat, prompt=few_shot_prompt_template)
chain.run("What's the meaning of life?")

My question is: would it send both examples along with the new prompt, "What's the meaning of life?", every time I run the chain?

If yes, isn't this inefficient, and doesn't it keep increasing our cost per call?
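
To see what is actually being sent on each call, I assume I could render the template directly and print it, something like this:

# render the full prompt for one input to inspect what the chain sends
full_prompt = few_shot_prompt_template.format(query="What's the meaning of life?")
print(full_prompt)  # prefix + both examples + suffix with the new query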

I am learning this and trying to understand how the function works, so any help would be appreciated.

Upvotes: 0

Views: 75

Answers (0)
