Andrew Nguyen

Reputation: 11

API endpoint for AI chat completions worked on local but 500 on Cloudflare production

I am having problems deploying my Next.js app, which uses an API route to get chat completions from OpenRouter. The route below works fine on my local machine, but once I deploy the app to Cloudflare, every request fails with Internal Server Error 500.

// app/api/ai/route.ts
import { NextResponse } from "next/server";

// Specify Edge runtime
export const runtime = "edge";

export async function POST(req: Request) {
  try {
    const { message } = await req.json();
    
    const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${process.env.NEXT_PUBLIC_OPENROUTER_API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "meta-llama/llama-3.3-70b-instruct:free",
        messages: [
          {
            role: "user",
            content: systemPrompt, // I have a custom prompt here
          },
        ],
      }),
    });
    
    if (!response.ok) {
      throw new Error(`OpenRouter API responded with status: ${response.status}`);
    }
    
    const result = await response.json();
    return NextResponse.json({ content: result.choices[0].message.content });
    
  } catch (error: any) {
    console.error("AI API error:", error.message);
    return NextResponse.json({ error: error.message }, { status: 500 });
  }
}

The client code calling the API is basic:

try {
  const res = await fetch("/api/ai", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ messages: transformedData }),
  });
  toast.dismiss();
  if (!res.ok) {
    const errorText = await res.text();
    toast.error(`AI failed to think..., ${errorText}`);
    throw new Error(errorText);
  }

  const data = await res.json();

  const clean = DOMPurify.sanitize(data.content, {
    USE_PROFILES: { html: true },
  });
  setResponse(clean);
  toast.success("AI has responded!");
} finally {
  setProcessing(false);
}

I suspect an Edge runtime compatibility problem between Cloudflare and Next.js, but I have been struggling with this for a while and still can't figure it out.
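To narrow it down, I think the next step is to surface the real upstream failure in Cloudflare's logs (for example via wrangler pages deployment tail). A sketch of the error branch with extra logging, not yet verified on my side:

// Sketch only: same route as above, but log the upstream body before throwing
// so the real cause shows up in the deployment logs.
if (!response.ok) {
  const body = await response.text(); // read OpenRouter's error payload once
  console.error("OpenRouter error:", response.status, body);
  throw new Error(`OpenRouter API responded with status: ${response.status} - ${body}`);
}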

I have also tried deploying a bare-bones app containing only the UI and this API route to Vercel, and the Internal Server Error 500 still shows up there.

I also tried routing the API call through Cloudflare's AI Gateway as the base URL (https://developers.cloudflare.com/ai-gateway/providers/openrouter/), but it did not help.
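For reference, the gateway attempt only changed the base URL. Following the URL pattern in those docs, the call looked roughly like this ({account_id} and {gateway_id} are placeholders for your own values):

// Sketch of the AI Gateway variant; {account_id} and {gateway_id} are
// placeholders, and apiKey is the same OpenRouter key used above.
const response = await fetch(
  "https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_id}/openrouter/v1/chat/completions",
  {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "meta-llama/llama-3.3-70b-instruct:free",
      messages: [{ role: "user", content: systemPrompt }],
    }),
  }
);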

In addition, I tried using the OpenAI SDK to make the chat completions; that also worked on local but not on Cloudflare.
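That variant followed OpenRouter's documented OpenAI-compatible setup and looked roughly like this:

// Sketch of the OpenAI SDK variant (openai v4): point the client at
// OpenRouter's OpenAI-compatible base URL and reuse the same key.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: process.env.NEXT_PUBLIC_OPENROUTER_API_KEY, // same env variable as above
});

const completion = await client.chat.completions.create({
  model: "meta-llama/llama-3.3-70b-instruct:free",
  messages: [{ role: "user", content: systemPrompt }],
});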

Packages:

"openai": "^4.85.1",
"next": "^14.2.3",
"@cloudflare/next-on-pages": "^1.13.8",

What am I missing here?

Upvotes: 1

Views: 61

Answers (1)

X8inez

Reputation: 1885

You are experiencing an Internal Server Error (status 500).

From the code above, you are using NEXT_PUBLIC_OPENROUTER_API_KEY. I'd suggest storing the API key as OPENROUTER_API_KEY instead: in Next.js, attaching NEXT_PUBLIC_ to an environment variable exposes it to the browser and inlines it into the bundle at build time, so it is the wrong choice for a server-side secret. It also means the value may simply be undefined in your deployed Edge function if it was never set in the Cloudflare project; the Authorization header would then be sent as "Bearer undefined", OpenRouter would reject the request, and your catch block would turn that into the 500 you are seeing.


// app/api/ai/route.ts
import { NextResponse } from "next/server";

// Specify Edge runtime
export const runtime = "edge";

export async function POST(req: Request) {
  try {
    const { message } = await req.json();
    
    const response = await fetch("https://openrouter.ai/api/v1/chat/completions", {
      method: "POST",
      headers: {
        // server-only secret: no NEXT_PUBLIC_ prefix
        "Authorization": `Bearer ${process.env.OPENROUTER_API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "meta-llama/llama-3.3-70b-instruct:free",
        messages: [
          {
            role: "user",
            content: systemPrompt, // I have a custom prompt here
          },
        ],
      }),
    });
    
    if (!response.ok) {
      throw new Error(`OpenRouter API responded with status: ${response.status}`);
    }
    
    const result = await response.json();
    return NextResponse.json({ content: result.choices[0].message.content });
    
  } catch (error: any) {
    console.error("AI API error:", error.message);
    return NextResponse.json({ error: error.message }, { status: 500 });
  }
}
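Also confirm the variable actually exists where the deployed code runs: set OPENROUTER_API_KEY in the Cloudflare Pages project settings (Settings > Environment variables) and redeploy, and add it to a .dev.vars file for local wrangler previews. As a sketch, a guard at the top of the handler turns the silent "Bearer undefined" failure into an explicit message:

// Sketch: fail fast with a clear message when the key is missing, instead of
// sending "Bearer undefined" to OpenRouter and getting an opaque 500.
const apiKey = process.env.OPENROUTER_API_KEY;
if (!apiKey) {
  return NextResponse.json(
    { error: "OPENROUTER_API_KEY is not set in this environment" },
    { status: 500 }
  );
}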

Upvotes: 0
