A Flask web application that uses the Groq and Hugging Face APIs to generate human-like responses to user input.
This project uses the Groq API to power a conversational chatbot. The Groq API generates text from a given prompt or question and offers a range of pre-trained models to choose from. The project also uses the Hugging Face API to access additional models and refine responses.
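As a sketch of how the backend might call Groq (the function names and the system prompt below are illustrative, not the repository's actual code), the request uses the OpenAI-style messages schema that the Groq Python client expects:

```python
import os

def build_chat_request(prompt: str, model: str = "llama3-8b-8192") -> dict:
    """Build the keyword arguments for a Groq chat-completions call.

    Messages follow the OpenAI-style schema used by the Groq client:
    a list of {"role": ..., "content": ...} dicts.
    """
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
    }

def ask_groq(prompt: str, model: str = "llama3-8b-8192") -> str:
    # Requires `pip install groq` and GROQ_API_KEY in the environment.
    from groq import Groq

    client = Groq(api_key=os.environ["GROQ_API_KEY"])
    response = client.chat.completions.create(**build_chat_request(prompt, model))
    return response.choices[0].message.content
```

Swapping the model argument for any of the Groq model IDs listed below changes which model answers without touching the rest of the call.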
Requirements:
- Python 3.8+
- Flask (pip install flask)
- Groq Python client (pip install groq)
- Hugging Face Transformers (pip install transformers)

You can install all the dependencies at once using pip:
pip install -r requirements.txt
The requirements.txt file contains the following dependencies:
- load-dotenv==0.1.0
- groq~=0.11.0
- requests~=2.32.3
- flask~=3.0.3
- python-dotenv~=1.0.1
- ollama~=0.3.3
- pandas~=2.2.3
- spacy~=3.8.2
- spacytextblob~=4.0.0
- nltk~=3.9.1
- html2text~=2024.2.26
- beautifulsoup4~=4.12.3
- transformers
- torch
- protobuf
1. Clone the repository: git clone https://github.com/your-username/groq-chatbot.git
2. Copy .env.example to .env and fill in the variables
3. Run the application: python app.py
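The authoritative variable names are in .env.example in the repository; as a sketch, the file holds the API credentials that the app loads via python-dotenv (the names below are assumptions, not the actual keys):

```
# .env -- API credentials (variable names are illustrative; see .env.example)
GROQ_API_KEY=your_groq_api_key
HUGGINGFACE_API_KEY=your_huggingface_api_key
```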
1. Open a web browser and navigate to http://localhost:5000/
2. Select a model from the dropdown menu
3. Enter your question or prompt in the text field
4. Click "Ask Question" to generate a response
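Behind the "Ask Question" button, the Flask app presumably exposes a route that receives the selected model and the question; a minimal sketch (the route path, form field names, and the generate_response helper are assumptions, not the repository's actual code):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def generate_response(model: str, question: str) -> str:
    # Placeholder: the real app would dispatch to the Groq or
    # Hugging Face API depending on the selected model.
    return f"[{model}] echo: {question}"

@app.route("/ask", methods=["POST"])
def ask():
    # Read the dropdown selection and the text field from the form post.
    model = request.form.get("model", "llama3-8b-8192")
    question = request.form.get("question", "")
    return jsonify({"answer": generate_response(model, question)})

if __name__ == "__main__":
    app.run(debug=True, port=5000)
```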
The following models are currently available:
- Meta Llama 3 8B (llama3-8b-8192)
- Gemma 2 9B (gemma2-9b-it)
- GPT-2 (openai-community/gpt2)
- Zephyr-7B (HuggingFaceH4/zephyr-7b-beta)
- Gemma-7B (google/gemma-7b)
- DialoGPT (microsoft/DialoGPT-medium)
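One way the dropdown could map display names to a backend and model ID is a simple registry; this is a sketch based on the list above (the labels, the groq/huggingface split, and backend_for are illustrative, not the app's actual data structure):

```python
# Map each dropdown label to (backend, model identifier).
# Groq models use short IDs; Hugging Face models use org/name repo IDs.
MODELS = {
    "Meta Llama 3 8B": ("groq", "llama3-8b-8192"),
    "Gemma 2 9B": ("groq", "gemma2-9b-it"),
    "GPT-2": ("huggingface", "openai-community/gpt2"),
    "Zephyr-7B": ("huggingface", "HuggingFaceH4/zephyr-7b-beta"),
    "Gemma-7B": ("huggingface", "google/gemma-7b"),
    "DialoGPT": ("huggingface", "microsoft/DialoGPT-medium"),
}

def backend_for(label: str) -> str:
    """Return which API serves the selected model."""
    backend, _model_id = MODELS[label]
    return backend
```

A dispatcher can then route the question to the Groq client or a Hugging Face call based on backend_for(selection).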
Pull requests are welcome! Please fork the repository and submit a pull request with your changes.
This project is licensed under the MIT License. See LICENSE for details.