SumQuick

[Badges: License · Version · Streamlit · OpenAI · Requests · BeautifulSoup]

A GPU is highly recommended for running SumQuick (or any Ollama model); without one, inference can be very slow. A list of compatible GPUs is available at https://github.com/ollama/ollama/blob/main/docs/gpu.md

About

An open-source AI summarization tool that runs locally, using Ollama!

Features

  • Local execution for privacy and security
  • AI-powered text summarization with local Ollama models
  • User-friendly Streamlit web interface (see the sketch below)
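
For illustration, here is a minimal sketch of how a Streamlit summarizer in this spirit can be wired up. It is not the contents of main.py; the model name, the endpoint, and the prompt are assumptions based on Ollama's OpenAI-compatible API.

# Illustrative sketch only, not SumQuick's actual main.py.
# Assumes Ollama is serving its OpenAI-compatible API on the default port.
import streamlit as st
from openai import OpenAI

# Ollama accepts any non-empty API key; only the base_url matters.
client = OpenAI(base_url='http://localhost:11434/v1', api_key='ollama')

st.title('Summarizer')
text = st.text_area('Paste text to summarize')

if st.button('Summarize') and text:
    response = client.chat.completions.create(
        model='llama3:latest',  # assumed model:tag; see the Tweaks section
        messages=[{'role': 'user', 'content': f'Summarize this:\n\n{text}'}],
    )
    st.write(response.choices[0].message.content)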

Installation

Prerequisites

  • Ollama installed and running (see https://ollama.com)
  • Python 3 with pip

Clone the Repository and Run the App

# Clone the repository and enter it
git clone https://github.com/aravhawk/SumQuick.git
cd SumQuick

# Install Python dependencies
pip3 install -r requirements.txt

# Launch the Streamlit app
streamlit run main.py
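
If the model configured in main.py has not been downloaded yet, Ollama can fetch it ahead of time (llama3 is an assumption here, matching the examples in the Tweaks section below):

ollama pull llama3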

Use the App

On the host computer

  • Visit localhost:8501 or 127.0.0.1:8501 in a web browser

On another device within the network

  • Visit [host-computer-ip]:8501 in a web browser

Tweaks & Using Other Models

  • To use (or experiment with) other Ollama-compatible models in SumQuick, edit lines 8 and 9 of main.py, setting your chosen model and one of its corresponding tags.
  • For example:
model = 'llama3'
tag = 'latest'

or

model = 'llama3'
tag = '8b-instruct-q8_0'

A full catalog of all models, tags, etc. is available at https://ollama.com/library.
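
To make the role of these two values concrete, here is a minimal sketch of how a model/tag pair can be sent to Ollama's native REST API using requests (one of SumQuick's listed dependencies). The endpoint and payload follow Ollama's documented API; this is an illustration, not a copy of main.py.

# Sketch: sending a model:tag pair to Ollama's native generate endpoint.
import requests

model = 'llama3'
tag = '8b-instruct-q8_0'

resp = requests.post(
    'http://localhost:11434/api/generate',
    json={
        'model': f'{model}:{tag}',  # Ollama identifies models as name:tag
        'prompt': 'Summarize: ...',
        'stream': False,            # return one JSON object, not a stream
    },
)
print(resp.json()['response'])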
