LocalRAGify: Zero-Cost Local RAG Chatbot

LocalRAGify is a free, local Retrieval-Augmented Generation (RAG) system that combines open-source tools like OpenSearch and Ollama with a user-friendly chat interface powered by Streamlit. This project enables you to set up a cost-effective chatbot capable of answering queries using both retrieval and generative AI techniques.


Features

  • Zero-Cost: Fully local deployment using free and open-source tools.
  • Generative AI: Powered by the Ollama LLM for interactive responses.
  • Custom Retrieval: Integrates with OpenSearch for document retrieval.
  • User-Friendly Interface: Built with Streamlit for seamless interaction.
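The core RAG pattern behind these features is straightforward: retrieve relevant passages, then stuff them into the LLM prompt. A minimal sketch of that prompt-assembly step (function and document contents below are illustrative, not taken from `app.py`):

```python
# Hypothetical sketch of the retrieve-then-generate flow; the function name
# and example documents are illustrative, not read from app.py.

def build_rag_prompt(question, retrieved_docs):
    """Combine passages returned by vector search with the user question."""
    context = "\n\n".join(f"- {doc}" for doc in retrieved_docs)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

# Example: passages a vector search against OpenSearch might return.
docs = [
    "LocalRAGify runs entirely on the local machine.",
    "Ollama serves the language model over a local HTTP API.",
]
prompt = build_rag_prompt("Where does LocalRAGify run?", docs)
print(prompt)
```

In the real app, `retrieved_docs` would come from an OpenSearch query and the resulting prompt would be sent to Ollama for generation.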

Getting Started

Follow these steps to set up and run the application locally using Docker Compose.

Prerequisites

  1. Docker and Docker Compose installed on your machine.
  2. Python 3.6+ installed (for local edits or testing).

File Structure

Ensure the following files are present in your project directory:

.
├── dockerfile
├── compose.yml
├── app.py
├── llama.py
├── requirements.txt
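For orientation, a `compose.yml` for this stack typically looks something like the sketch below. This is an illustrative example only; service names, image tags, and environment settings are assumptions, and the real file in the repository may differ (the ports match the ones listed later in this README, and 11434 is Ollama's default port):

```yaml
# Illustrative compose.yml sketch -- the actual file in the repo may differ.
# (OpenSearch security/password settings are omitted for brevity.)
services:
  opensearch:
    image: opensearchproject/opensearch:latest
    environment:
      - discovery.type=single-node        # single local node, no cluster
    ports:
      - "9200:9200"                       # REST API used by the app

  ollama:
    image: ollama/ollama:latest
    ports:
      - "11434:11434"                     # Ollama's default HTTP port

  app:
    build: .                              # built from the local dockerfile
    ports:
      - "8501:8501"                       # Streamlit UI
    depends_on:
      - opensearch
      - ollama
```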

Steps to Run the App

  1. Clone the Repository

    git clone https://github.com/Toluhunter/LocalRAGify.git
    cd LocalRAGify
  2. Build and Start Services
    Use Docker Compose to build and run the application:

    docker-compose up --build

    (On newer Docker installations, Compose V2 is invoked as docker compose up --build.)

    This command:

    • Pulls the required images for OpenSearch and Ollama.
    • Builds the LocalRAGify app Docker image.
    • Starts both the OpenSearch vector store and the chatbot application.
  3. Access the Chat Interface
    Once the containers are up and running, access the chatbot via your browser at:

    http://localhost:8501
    
  4. Verify OpenSearch Service
    The OpenSearch vector store is available at:

    http://localhost:9200
    

Contributing

I welcome contributions! Feel free to open issues or submit pull requests for improvements.


License

This project is licensed under the MIT License.


Acknowledgments

  • Ollama for the local LLM framework.
  • OpenSearch for the vector database functionality.
  • Streamlit for the interactive web interface.

Enjoy building your zero-cost local RAG chatbot with LocalRAGify! 🚀
