aestheticsData

Reputation: 747

Next.js 14 architecture to call an LLM

In a Next.js 14 application, I need to call a proxy API that interacts with an LLM. The API returns an NDJSON response, which needs to be processed using the ndjsonStream function from the can-ndjson-stream npm library.

I have a client component with a form that triggers a server action with the question for the LLM, and another client component that displays the result.

I am unsure how to integrate these components effectively.

My server action calls the LLM proxy API, but I am uncertain where to place the ndjsonStream processing. Additionally, how do I send the processed stream back to the client component?
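For reference, here is a simplified version of my server action (askLlm and LLM_PROXY_URL are placeholder names, not my real code):

```ts
// app/actions.ts -- simplified; askLlm and LLM_PROXY_URL are placeholders
'use server';

export async function askLlm(formData: FormData) {
  const question = formData.get('question') as string;

  // The proxy answers with NDJSON: one JSON object per line
  const response = await fetch(process.env.LLM_PROXY_URL!, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ question }),
  });

  // Unsure: should ndjsonStream(response.body) be applied here, and if so,
  // how does the resulting stream reach the client component that renders the answer?
}
```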

Upvotes: 0

Views: 358

Answers (1)

aestheticsData

Reputation: 747

A solution to this problem is available in the Next.js GitHub discussions: https://github.com/vercel/next.js/discussions/67501
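Independent of the exact code in that discussion, one pattern that works is to forward the NDJSON through a Route Handler instead of returning the stream from a server action, and parse it on the client with ndjsonStream. A rough sketch follows; the /api/llm route, the LLM_PROXY_URL env var, and the token field are assumptions you would adapt to your proxy:

```tsx
// app/api/llm/route.ts -- Route Handler (placeholder path) that forwards the NDJSON body
export async function POST(req: Request) {
  const { question } = await req.json();

  const upstream = await fetch(process.env.LLM_PROXY_URL!, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ question }),
  });

  // Pass the NDJSON bytes through untouched; parsing happens on the client
  return new Response(upstream.body, {
    headers: { 'Content-Type': 'application/x-ndjson' },
  });
}
```

```tsx
// app/answer.tsx -- client component that consumes the stream with ndjsonStream
'use client';

import { useState } from 'react';
import ndjsonStream from 'can-ndjson-stream';

export default function Answer() {
  const [text, setText] = useState('');

  async function ask(question: string) {
    const res = await fetch('/api/llm', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ question }),
    });

    // ndjsonStream turns the raw byte stream into a stream of parsed JSON objects
    const reader = ndjsonStream(res.body!).getReader();
    let chunk = await reader.read();
    while (!chunk.done) {
      // assumes each NDJSON line has a `token` field; adjust to your proxy's shape
      setText((prev) => prev + (chunk.value.token ?? ''));
      chunk = await reader.read();
    }
  }

  return (
    <div>
      <button onClick={() => ask('Hello?')}>Ask</button>
      <p>{text}</p>
    </div>
  );
}
```

The Route Handler is used here because a server action's return value comes back as a single serialized payload, which makes it awkward to forward an incremental NDJSON stream that way.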

Upvotes: 0
