How to set up with LM Studio

- Go to https://lmstudio.ai/ and download the application based on your operating system.
- Search for a model to download.
- After your model has finished downloading, go to 'Local Server.'
- Go to 'Local Inference Server'. Adjust 'Server Port' if needed, then start the server. Make sure CORS is on for streaming.
- Go to Obsidian > BMO Chatbot > REST API Connection > REST API URL and insert the server URL (e.g. `http://localhost:1234/v1`). NOTE: The REST API URL uses the `/chat/completions` endpoint. If you want to confirm the server is reachable first, see the sketch after this list.
- Select your model and start chatting.
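
Before pointing Obsidian at the server, you can check that LM Studio is responding at the expected URL. The following is a minimal sketch, assuming the default port 1234, a model already loaded in LM Studio, and the Python `requests` library; the `"local-model"` name is a placeholder, not something BMO Chatbot or LM Studio requires.

```python
# Minimal check that the LM Studio local server answers at the OpenAI-compatible
# /v1/chat/completions endpoint. Assumes the server is started on port 1234 and a
# model is loaded; "local-model" is a placeholder name.
import requests

BASE_URL = "http://localhost:1234/v1"  # same value entered as the REST API URL in Obsidian

response = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": "local-model",  # placeholder; LM Studio serves whichever model is loaded
        "messages": [
            {"role": "user", "content": "Say hello in one sentence."}
        ],
        "temperature": 0.7,
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

If this prints a reply, the same base URL should work in BMO Chatbot's REST API Connection settings.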