
[lmstudio javascript library logo]

Use local LLMs in JS/TS/Node

LM Studio Client SDK

lmstudio-js is LM Studio's official JavaScript/TypeScript client SDK. It lets you work with local LLMs from JavaScript, TypeScript, and Node: respond to chats, manage loading and unloading of models, configure load parameters, and more.

Using Python? See lmstudio-python.

Installation

npm install @lmstudio/sdk --save

Quick Example

import { LMStudioClient } from "@lmstudio/sdk";
const client = new LMStudioClient();

const model = await client.llm.model("llama-3.2-1b-instruct");
const result = await model.respond("What is the meaning of life?");

console.info(result.content);
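
You can also stream a response as it is generated instead of waiting for the full result. The sketch below assumes the prediction returned by respond() is async-iterable and yields fragments with a content field, as described in the lmstudio-js docs; treat the details as illustrative.

import { LMStudioClient } from "@lmstudio/sdk";

const client = new LMStudioClient();
const model = await client.llm.model("llama-3.2-1b-instruct");

// Stream the response fragment by fragment as it is generated.
const prediction = model.respond("What is the meaning of life?");
for await (const fragment of prediction) {
  process.stdout.write(fragment.content);
}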

For more examples and documentation, visit lmstudio-js docs.

Why use lmstudio-js over the OpenAI SDK?

OpenAI's SDK is designed for use with OpenAI's proprietary models. As such, it is missing many features that are essential for running LLMs locally, such as:

  • Managing loading and unloading models from memory
  • Configuring load parameters (context length, gpu offload settings, etc.)
  • Speculative decoding
  • Getting information (such as context length, model size, etc.) about a model
  • ... and more

In addition, while the OpenAI SDK is automatically generated, lmstudio-js is designed from the ground up to be clean and easy to use for TypeScript/JavaScript developers.
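
To illustrate, here is a minimal sketch of what those local-only workflows look like with lmstudio-js. The exact option and method names (for example the load-time contextLength setting and unload()) are assumptions that may differ between SDK versions, so check the docs for the current API.

import { LMStudioClient } from "@lmstudio/sdk";

const client = new LMStudioClient();

// Load a model explicitly with load-time parameters such as context length.
// (Option names here are assumptions; consult the lmstudio-js docs.)
const model = await client.llm.load("llama-3.2-1b-instruct", {
  config: { contextLength: 8192 },
});

const result = await model.respond("Explain speculative decoding in one sentence.");
console.info(result.content);

// Unload the model to free memory when you are done with it.
await model.unload();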

Contributing

You can build the project locally by following these steps:

git clone https://github.com/lmstudio-ai/lmstudio-js.git --recursive
cd lmstudio-js
npm install
npm run build

See CONTRIBUTING.md for more information.

Community

Discuss all things lmstudio-js in the #dev-chat channel of LM Studio's Community Discord server.
