This plugin was developed to provide a better way to use local LLM models with Obsidian.
- Download/clone the plugin into your vault's plugins folder and build it:

```bash
cd .obsidian/plugins
git clone https://github.com/Sparky4567/obsidian_ai_plugin.git
cd obsidian_ai_plugin
npm install
npm run build
```

- Open the Obsidian app
- Enable community plugin support
- Enable the LLM plugin
- Choose a model within the settings tab
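If the build succeeds, the compiled plugin files should be in the folder. A quick sanity check, assuming the standard Obsidian plugin layout (a compiled main.js next to manifest.json):

```bash
# Confirm the build produced the files Obsidian loads
ls main.js manifest.json
```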
- Ensure that you have Ollama installed: https://ollama.com/download (read its documentation as needed)
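To confirm that the Ollama CLI works and to fetch a model ahead of time, you can run the following (tinyllama is just an example model):

```bash
# Check that the Ollama CLI is installed
ollama --version
# Download an example model so it is ready for the plugin
ollama pull tinyllama
```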
- Ensure that an Ollama model is running in the background, for example:

```bash
ollama run tinyllama
```
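Before switching to Obsidian, you can verify that the Ollama server answers requests. A minimal check, assuming Ollama's default local endpoint on port 11434:

```bash
# Send a one-off test prompt to the local Ollama API
curl http://localhost:11434/api/generate \
  -d '{"model": "tinyllama", "prompt": "Say hello", "stream": false}'
```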
- If you downloaded this plugin from the GitHub repo, copy it into your vault's .obsidian/plugins folder and don't forget to run the following inside the plugin directory to install all needed dependencies:

```bash
npm install
```
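For example, assuming a hypothetical vault location (adjust the path to your own vault):

```bash
# The vault path below is hypothetical; replace it with your actual vault
cp -r obsidian_ai_plugin ~/Documents/MyVault/.obsidian/plugins/
cd ~/Documents/MyVault/.obsidian/plugins/obsidian_ai_plugin
npm install
```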
- Ensure that the plugin is activated
- Choose the right endpoint and model in the plugin's settings (see the note after these steps)
- Write something into the editor field (simple text)
- Select the text with your mouse
- Press CTRL+P after selection to open the command palette
- Type in ASK LLM and choose the command you want (there aren't many at the moment)
- Press Enter to confirm
- Wait a while for the result

Your selected text will be replaced with the text from the LLM (the default model is tinyllama).
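If nothing comes back, double-check the endpoint in the plugin settings: with a default local Ollama install it is typically http://localhost:11434 (an assumption based on Ollama's standard configuration), which you can test with the curl command shown earlier.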
If you have any questions related to the plugin or want to extend its functionality, write an email to admin@artefaktas.eu and I will try to respond as soon as I can.
- For local usage, a laptop with at least 8GB of RAM and a decent processor is recommended.