Reputation: 9
import google.generativeai as genai

def query_gemini(model_name="models/chat-bison-001", temperature=0.7, top_k=40, top_p=0.95, max_output_tokens=1024):
    """Configures parameters for the Gemini model."""
    return {
        'model': model_name,
        'temperature': temperature,
        'max_output_tokens': max_output_tokens,
        'top_p': top_p,
        'top_k': top_k
    }

def get_completion(params, messages):
    """Generates text using the Gemini API."""
    model_name = params['model']
    # Create model instance from the provided model name
    model = genai.GenerativeModel(model_name=model_name)
    # Create a request object and populate it
    request = {
        "temperature": params['temperature'],
        "candidate_count": 1,
        "top_k": params['top_k'],
        "top_p": params['top_p'],
        "max_output_tokens": params['max_output_tokens'],
    }
    # Generate text using the model and request object, with messages as the only positional argument
    response = model.generate_content(messages, **request)
    # Check if response is not None before trying to access it
    if response:
        return response.candidates[0].output  # Extract text from response
    else:
        return None

params = query_gemini()
messages = [
    {"role": "user", "content": "What are the main benefits of renewable energy?"}
]
response_text = get_completion(params, messages)
print(response_text)
I am trying to pass arguments to the generate_content() function, but it keeps returning the same error. Can someone help? The following line is what raises the error: response = model.generate_content(messages, **request)
Upvotes: 0
Views: 865
Reputation: 126
The way you call PaLM/Bison family models is different from the way you call Gemini family models.
Here is an example of how to make a request to Bison: https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/text#sample_request
And here is an example of how to do the same with Gemini: https://cloud.google.com/vertex-ai/generative-ai/docs/reference/python/latest#chat
Tip: Gemini family models have better accuracy than PaLM/Bison.
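If you stay with the google.generativeai SDK from your snippet, the sampling parameters are not unpacked as keyword arguments of generate_content(); they go into a GenerationConfig passed via generation_config, and Gemini messages use a "parts" field rather than "content". A minimal sketch, assuming a Gemini model such as gemini-pro and a hypothetical API key placeholder:

import google.generativeai as genai

# Hypothetical placeholder; replace with your own API key
genai.configure(api_key="YOUR_API_KEY")

# Use a Gemini family model; generate_content() is not for chat-bison models
model = genai.GenerativeModel(model_name="gemini-pro")

# Sampling parameters are wrapped in a GenerationConfig, not passed as **kwargs
generation_config = genai.types.GenerationConfig(
    temperature=0.7,
    top_k=40,
    top_p=0.95,
    max_output_tokens=1024,
    candidate_count=1,
)

# Gemini messages use "parts" instead of "content"
messages = [
    {"role": "user", "parts": ["What are the main benefits of renewable energy?"]}
]

response = model.generate_content(messages, generation_config=generation_config)
print(response.text)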
Upvotes: 0