user3102556

Reputation: 79

Ollama + OpenWebUI on VPS: which endpoint for curl requests to an embedding model?

I have a VPS with Ollama and OpenWebUI running. I want to use "granite-embedding:278m" as the embedding model for curl requests. I have the same setup running locally on my Mac, where the endpoint is http://localhost:11434/v1/embeddings. But I can't find any similar endpoint in the docs for the OpenWebUI API.

So I guess I need to use the Ollama API directly and install a Flask server in front of it. Are there any potential conflicts or incompatibilities I should be aware of?

Any better advice? Thanks in advance.
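For context, this is the kind of request that works against the local Mac setup, sketched in Python rather than raw curl so the payload shape is explicit. The URL assumes Ollama's OpenAI-compatible API on the default port 11434; `build_embedding_request` is just a helper name I made up, and on the VPS the host would be the server's address instead of localhost:

```python
import json
import urllib.request

# Assumed: Ollama's OpenAI-compatible embeddings endpoint, same as on the
# local Mac. On the VPS, replace localhost with the server's host name.
OLLAMA_EMBEDDINGS_URL = "http://localhost:11434/v1/embeddings"


def build_embedding_request(text, model="granite-embedding:278m",
                            url=OLLAMA_EMBEDDINGS_URL):
    """Build a POST request matching the OpenAI-style embeddings API."""
    payload = json.dumps({"model": model, "input": text}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    # Only attempt the network call when run directly, since it needs a
    # reachable Ollama instance.
    req = build_embedding_request("hello world")
    with urllib.request.urlopen(req) as resp:
        result = json.loads(resp.read())
        print(len(result["data"][0]["embedding"]))  # embedding dimension
```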

Upvotes: 0

Views: 56

Answers (0)
