[FEATURE] Add support for llama.cpp and Weaviate #41
Comments
Thanks for the suggestion! As this project is built on top of LangChainJS, llama.cpp is not yet supported. It has also been requested upstream: langchain-ai/langchainjs#710. We can definitely add Weaviate soon. For now, Supabase and Chroma are open source, so you can run those locally as well; a minimal sketch of the Chroma option is shown below.
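A rough sketch of pointing LangChainJS at a locally running Chroma server; the import paths, the `collectionName`/`url` options, and the localhost URL are assumptions that can vary between langchainjs versions.

```ts
// Sketch: index a few texts into a locally running Chroma server and query it.
// Assumes `npm install langchain chromadb` and Chroma listening on localhost:8000.
import { Chroma } from "langchain/vectorstores/chroma";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";

const store = await Chroma.fromTexts(
  ["Flowise is a visual LLM builder", "Weaviate is a vector database"],
  [{ source: "note-1" }, { source: "note-2" }],
  new OpenAIEmbeddings({ openAIApiKey: process.env.OPENAI_API_KEY }),
  { collectionName: "demo", url: "http://localhost:8000" } // local Chroma server
);

const results = await store.similaritySearch("What is Weaviate?", 1);
console.log(results);
```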
Weaviate was recently added to LangChainJS, with the same interface as the other VectorStores. Also, perhaps have a look at https://github.com/go-skynet/LocalAI: it is a way of running open models locally behind an API that is compatible with OpenAI's API. It should be a great stopgap while there is no llama.cpp support.
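As a sketch of what that OpenAI-compatible surface looks like, a client can send the same request shape to a local LocalAI server as it would to OpenAI; the port and model name below are placeholders for whatever your instance actually serves.

```ts
// Sketch: call a locally running LocalAI server through its OpenAI-compatible
// /v1/chat/completions endpoint (Node 18+ has a global fetch).
// The port (8080) and model name are placeholders for your local setup.
const response = await fetch("http://localhost:8080/v1/chat/completions", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "ggml-gpt4all-j",
    messages: [{ role: "user", content: "Summarise what a vector database does." }],
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content);
```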
Weaviate has now been added to Flowise: #45
Closing this, as you can now use LocalAI (#123) to run local LLMs and embeddings. So far this is the only viable solution for a JS/TS-based project.
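For the embeddings side, the same OpenAI-compatible surface exposes `/v1/embeddings`; this sketch assumes a LocalAI instance on localhost:8080 serving an embedding-capable model under a placeholder name.

```ts
// Sketch: request embeddings from a local LocalAI server via the
// OpenAI-compatible /v1/embeddings endpoint. The model name is a placeholder.
const res = await fetch("http://localhost:8080/v1/embeddings", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "bert-embeddings",
    input: "Enterprise knowledge base article about onboarding.",
  }),
});

const { data } = await res.json();
console.log(data[0].embedding.length); // dimensionality of the returned vector
```

Depending on the langchainjs version, OpenAIEmbeddings can usually be pointed at such an endpoint by overriding its base path in the client configuration, so the resulting vectors can be fed into any VectorStore.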
I'm desperately in need of a tutorial on using it with Flowise.
It still seems not to be listed in the docs. Any particular reason, or just an oversight?
Describe the feature you'd like
Could you please add support for llama.cpp and Weaviate?
Additional context
I got LangChain + OpenAI + Pinecone working for conversational Q&A retrieval against an enterprise knowledge base, but I would like to use open-source, locally run alternatives (llama.cpp for embeddings and the LLM, Weaviate for the vector DB) so that my enterprise data stays on premises. Thank you.
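For context, a rough sketch of the kind of on-premises retrieval setup being requested, using the LangChainJS Weaviate integration mentioned earlier in the thread. The client setup, the "Docs" index name, and the use of OpenAI-backed embeddings/LLM as stand-ins for llama.cpp-backed ones are all assumptions, and exact import paths differ across langchainjs versions.

```ts
// Sketch: conversational Q&A retrieval over a local Weaviate instance.
// Assumes `npm install langchain weaviate-ts-client`, Weaviate on localhost:8080,
// and an existing class ("Docs") holding the enterprise documents.
import weaviate from "weaviate-ts-client";
import { WeaviateStore } from "langchain/vectorstores/weaviate";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { ConversationalRetrievalQAChain } from "langchain/chains";

const client = (weaviate as any).client({ scheme: "http", host: "localhost:8080" });

// Stand-in embeddings/LLM; in the requested setup these would be backed by
// llama.cpp (e.g. through an OpenAI-compatible local API) instead of OpenAI.
const embeddings = new OpenAIEmbeddings();
const model = new ChatOpenAI({ temperature: 0 });

const store = await WeaviateStore.fromExistingIndex(embeddings, {
  client,
  indexName: "Docs",
  textKey: "text",
});

const chain = ConversationalRetrievalQAChain.fromLLM(model, store.asRetriever());
const answer = await chain.call({
  question: "What is our VPN policy?",
  chat_history: "", // no prior turns
});
console.log(answer.text);
```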