cem

Reputation: 331

Can't set custom OpenAI model with langchain on nodejs

I am trying to set the "gpt-3.5-turbo" model on my OpenAI instance using LangChain in Node.js, but the code below sends my requests with the default text-davinci model.

const { OpenAI } = require("langchain/llms");
const { ConversationChain } = require("langchain/chains");
const { BufferMemory } = require("langchain/memory");

const model = new OpenAI({ model:"gpt-3.5-turbo", openAIApiKey: "###", temperature: 0.9 });
const memory = new BufferMemory();

const chain = new ConversationChain({llm:model, memory: memory});

async function x() {
  const res = await chain.call({ input: "Hello this is xyz!" });
  const res2 = await chain.call({ input: "Hello what was my name?" });
  console.log(res);
  console.log(res2);
}

x();

In the documentation, I found how to set the model with Python: it is set with the model_name attribute on the instance. But this doesn't work in Node.js. Is there any way to set custom models with LangChain in Node.js?

Upvotes: 1

Views: 2253

Answers (1)

cmgchess

Reputation: 10287

I looked at the codebase, and it seems like modelName is the parameter you should use.

It's strange that the search function in the docs gives no results for modelName. I guess the parameters are the same as in the Python API, but in camel case.

Relationship with Python LangChain

Edit: the docs have it here

Upvotes: 2
