Using Flowise with infinity as embeddings server #77
SQLExceptionPhil started this conversation in General

Hey, I'm trying to set up https://github.com/FlowiseAI/Flowise with Infinity as an embeddings server in the backend. While getting everything working, I ran into an error: Flowise uses LangChain's OpenAIEmbeddings, and it doesn't send the input as a list of strings when sending the prompt; it sends the input as a plain string. With huggingface/text-embeddings-inference it was working, possibly because they implemented both request types. Does anyone know a workaround for this kind of issue?
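To make the mismatch concrete, here is a minimal sketch (not Flowise or LangChain code) that posts both request shapes to an OpenAI-compatible embeddings route. The base URL, port 7997, the /embeddings path, and the model name are all assumptions for a default local Infinity deployment; adjust them to whatever your server actually exposes.

```python
# Minimal sketch: compare a bare-string "input" with a list-of-strings "input"
# against an OpenAI-compatible embeddings endpoint. Assumes a local Infinity
# server on port 7997 serving "BAAI/bge-small-en-v1.5"; both values are guesses.
import requests

EMBEDDINGS_URL = "http://localhost:7997/embeddings"   # assumed route
MODEL = "BAAI/bge-small-en-v1.5"                       # assumed model name

payloads = {
    "string input": {"model": MODEL, "input": "hello world"},
    "list input": {"model": MODEL, "input": ["hello world", "a second sentence"]},
}

for name, payload in payloads.items():
    resp = requests.post(EMBEDDINGS_URL, json=payload, timeout=30)
    # A non-2xx status for the string variant would reproduce the error above.
    print(f"{name}: HTTP {resp.status_code}")
```

If the string variant fails while the list variant succeeds, one possible workaround is a thin proxy in front of Infinity that wraps a bare string into a one-element list before forwarding the request.

Replies: 1 comment · 3 replies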
- I implemented a LangChain integration using Infinity embeddings. It's been in the langchain packages since around ~0.0.330.
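Presumably this refers to the InfinityEmbeddings class; below is a minimal sketch assuming the current langchain-community import path (older langchain releases exposed it under langchain.embeddings), a local Infinity server at http://localhost:7997, and BAAI/bge-small-en-v1.5 as the served model.

```python
# Minimal sketch of using the LangChain Infinity integration directly.
# Assumptions: langchain-community is installed, an Infinity server is running
# at http://localhost:7997, and it serves the model "BAAI/bge-small-en-v1.5".
from langchain_community.embeddings import InfinityEmbeddings

embeddings = InfinityEmbeddings(
    model="BAAI/bge-small-en-v1.5",            # assumed served model name
    infinity_api_url="http://localhost:7997",  # assumed Infinity base URL
)

# The integration handles the string-vs-list distinction on the client side:
# embed_documents() takes a list of strings, embed_query() a single string.
doc_vectors = embeddings.embed_documents(["first document", "second document"])
query_vector = embeddings.embed_query("a single query string")
print(len(doc_vectors), len(query_vector))
```

Swapping Flowise's OpenAIEmbeddings node for an Infinity-specific embeddings component (or a custom node wrapping this class) could sidestep the string-versus-list issue entirely.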