YOU WANG

Reputation: 41

Integrate a Hugging Face Inference Endpoint with Flowise

I am trying to integrate the model mistralai/Mixtral-8x7B-Instruct-v0.1 from Hugging Face, which I have already deployed as an Inference Endpoint, and I got a URL that I can put into Flowise's "Add Endpoint URL" field.

But it just doesn't work as expected. When I try to run the model, it shows an error, so I set max_token to 250. Now I can generate text, but the model still behaves randomly, with no expected output: (screenshot: model output)

It is supposed to be a translator that translates everything into English. This is how the model behaves when I use the free Inference API from Hugging Face: (screenshot: behaved correctly)

There's a limit on the free Inference API, which is why I paid for the Inference Endpoint, but it behaves so differently. I'm so confused; please help me out if anyone knows. Thanks so much!
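One thing worth checking (this is an assumption, not confirmed from the question): Mixtral-8x7B-Instruct is trained on the `<s>[INST] ... [/INST]` chat format, and a raw Inference Endpoint generates from exactly the text it receives, so if the prompt reaches the endpoint without that wrapping, the model tends to free-associate instead of translating. Below is a minimal sketch of the JSON payload such an endpoint typically expects; the endpoint URL and the translation instruction are placeholders, not values from the original post.

```python
import json

# Hypothetical placeholder -- replace with the URL from your deployed endpoint
ENDPOINT_URL = "https://your-endpoint.endpoints.huggingface.cloud"

def build_payload(text: str, max_new_tokens: int = 250) -> dict:
    """Wrap the user text in Mixtral's instruct format and build the
    request body for a text-generation Inference Endpoint."""
    # Without the [INST] ... [/INST] wrapper the model just continues
    # the input text rather than following the instruction.
    prompt = f"<s>[INST] Translate the following into English: {text} [/INST]"
    return {
        "inputs": prompt,
        "parameters": {
            "max_new_tokens": max_new_tokens,  # cap output length
            "return_full_text": False,         # only return the generation
        },
    }

payload = build_payload("Bonjour tout le monde")
print(json.dumps(payload, indent=2))
```

You can verify the endpoint itself this way (e.g. with `curl` or `requests.post(ENDPOINT_URL, json=payload, headers={"Authorization": "Bearer hf_..."})`) before wiring it into Flowise; if the raw call translates correctly but Flowise does not, the problem is in how Flowise formats the prompt rather than in the endpoint.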

Upvotes: 0

Views: 99

Answers (0)
