
Compare and validate QA tasks using 3 local (Ollama) or cloud (Groq API) LLMs side-by-side. Designed for QA automation, test case generation, and bug triage.


QA Prompt Comparison (Local or Cloud LLMs)

Compare outputs from 3 locally hosted LLMs (via Ollama) or 3 cloud-hosted LLMs (via Groq API). This tool helps QA professionals validate AI outputs for test automation, bug triage, and more.


🚀 Features

  • Compare 3 LLMs at once (local or cloud)
  • Toggle between local (Ollama) and cloud (Groq API) models in the UI
  • Fully offline with Ollama, or use Groq for cloud LLMs
  • Fast, lightweight, and private
  • Simple, responsive UI with QA-focused tools
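The "compare 3 LLMs at once" feature is, at its core, a fan-out: the same prompt goes to each model and the responses are collected side by side. A minimal sketch in Node.js, where `askModel` is a hypothetical transport function standing in for whichever backend is active (Ollama or the Groq proxy), not this project's actual code:

```javascript
// Send the same QA prompt to several models and pair each model
// with its answer. `askModel(model, prompt)` is a placeholder for
// the real transport (local Ollama call or Groq proxy call).
async function compareModels(prompt, models, askModel) {
  const answers = await Promise.all(
    models.map((model) => askModel(model, prompt))
  );
  return models.map((model, i) => ({ model, answer: answers[i] }));
}

// Stub transport for demonstration only:
const stubAsk = async (model, prompt) => `[${model}] answer to: ${prompt}`;
```

Running all three requests with `Promise.all` keeps the comparison fast: total wait time is the slowest model, not the sum of all three.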

🧱 Requirements

For Local LLMs:

  • Ollama (for running models locally)

For Cloud LLMs:

  • Groq API Key (free tier available)
  • Node.js (for the API proxy server)

For Both:

  • VS Code with the Live Server extension (to serve index.html)

🎯 QA-Specific Tasks Supported

  • Generate test cases from requirements
  • Convert scenarios to Gherkin
  • Summarize test failures or logs
  • Generate assertion code from plain language
  • Compare coverage across models
  • Explain automation errors (stack traces, type errors)
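Each task above boils down to wrapping the user's input in a task-specific prompt template. A sketch of that idea, with illustrative template wording (these are not the tool's built-in prompts):

```javascript
// Map each QA task to a prompt template. Wording is illustrative,
// not the tool's actual built-in prompts.
const TEMPLATES = {
  testCases: (req) => `Generate test cases for this requirement:\n${req}`,
  gherkin: (scenario) => `Convert this scenario to Gherkin syntax:\n${scenario}`,
  triage: (log) => `Summarize this test failure log and suggest a likely cause:\n${log}`,
};

function buildPrompt(task, input) {
  const template = TEMPLATES[task];
  if (!template) throw new Error(`Unknown task: ${task}`);
  return template(input);
}
```

Because every task is just a template, the same comparison UI works for all of them without per-task backend changes.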

📦 Setup Instructions

1. Clone the Repo

git clone https://github.com/your-username/llm-qa-assistant.git
cd llm-qa-assistant

▶️ Local LLMs (Ollama)

ollama pull phi3:medium-128k
ollama pull deepseek-r1:8b
ollama pull qwen:1.8b
ollama serve

  1. Open index.html with Live Server in VS Code.
  2. Type a QA prompt and click "Compare Models".

☁️ Cloud LLMs (Groq API)
  1. Create a .env file in the project root:
    GROQ_API_KEY=your-groq-api-key-here
  2. Install dependencies:
    npm install
  3. Start the Groq API proxy server:
    npm start
  4. Open index.html with Live Server in VS Code.
  5. Check the "Use Groq-hosted LLMs" box in the UI before comparing.
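The proxy server exists so the Groq API key stays out of the browser: the UI talks to the local proxy, and the proxy forwards requests to Groq's OpenAI-compatible chat-completions endpoint. A sketch of the request the proxy might build (the model name and helper are illustrative; the real proxy code may differ):

```javascript
// Build a request for Groq's OpenAI-compatible chat-completions
// endpoint. The API key comes from the server-side .env file and
// never reaches the browser. Illustrative sketch only.
const GROQ_URL = "https://api.groq.com/openai/v1/chat/completions";

function groqRequest(apiKey, model, prompt) {
  return {
    url: GROQ_URL,
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}
```

Keeping the `Authorization` header on the server side is the whole point of the proxy: a key embedded in index.html would be visible to anyone who opens the page source.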


🖼️ Screenshots

Local LLMs (Ollama)
QA LLM Comparator UI - Local
Cloud LLMs (Groq API)
QA LLM Comparator UI - Cloud
