feat: update plugin options and add readme
ahonn committed Mar 20, 2023
1 parent 4bbdce5 commit 7ac2163
Showing 8 changed files with 163 additions and 52 deletions.
8 changes: 1 addition & 7 deletions logo.svg
5 changes: 3 additions & 2 deletions package.json
@@ -1,5 +1,5 @@
{
"name": "logseq-plugin-ai-prompt",
"name": "logseq-plugin-ai-assistant",
"version": "0.0.0",
"main": "dist/index.html",
"scripts": {
@@ -39,7 +39,8 @@
"vite-plugin-logseq": "1.1.2"
},
"logseq": {
"id": "logseq-plugin-ai-prompt",
"id": "logseq-plugin-ai-assistant",
"title": "AI Assistant",
"icon": "./logo.svg"
}
}
86 changes: 86 additions & 0 deletions readme.md
@@ -0,0 +1,86 @@
# Logseq AI Assistant

A powerful tool that enhances your Logseq experience by allowing you to interact with AI models like OpenAI's `gpt-3.5-turbo`.

With this plugin, you can effortlessly generate or transform text using custom prompts,
enabling you to achieve more efficient and creative workflows within Logseq.

![](https://user-images.githubusercontent.com/9718515/226260897-d5e39c09-4714-4d23-b004-28a2391512c4.gif)

> Inspired by [Notion AI](https://www.notion.so/product/ai) and [Raycast AI](https://www.raycast.com/ai)

## Features
- Seamless integration with Logseq
- Customizable prompt support
- Easy-to-use built-in prompts

## Install

### ~~Option 1: Directly install via Marketplace (Coming Soon)~~

### Option 2: Manually load

- Turn on Logseq developer mode
- [Download the prebuilt package here](https://github.com/ahonn/logseq-plugin-ai-assistant/releases)
- Unzip the downloaded file and load the plugin from the Logseq plugins page

## Configuration
Before using the plugin, you need to configure it according to your preferences.

- **API Key**: Enter your OpenAI API key in this field. If you don't have an API key yet, visit the [OpenAI API keys page](https://platform.openai.com/account/api-keys) to obtain one.
- **Model**: Choose the OpenAI model you want to use, such as "gpt-3.5-turbo". Different models may offer varying levels of performance and text generation capabilities.
- **Custom Prompts**: Enable this option if you want to use custom prompts for generating or transforming text. You can add, edit, or remove prompts in the prompts array.
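
These options map onto the plugin's settings schema. The following is a rough sketch of the shape they take, mirroring the `ISettings` interface and `PromptOutputType` enum in `src/settings.ts`; the `IPromptOptions` fields are inferred from how prompts are used in `src/main.tsx`:

```ts
// Sketch of the plugin's settings shape, based on src/settings.ts.
// The IPromptOptions fields are inferred from how prompts are consumed in src/main.tsx.
export enum PromptOutputType {
  property = 'property', // write the result as a block property
  replace = 'replace',   // replace the selected block's content
  insert = 'insert',     // insert the result after the selected block
}

export interface IPromptOptions {
  name: string;             // slash command name, e.g. 'Summarize'
  prompt: string;           // prompt template; {{text}} is filled with the block content
  output: PromptOutputType; // where the model's answer goes
}

export interface ISettings {
  apiKey: string; // your OpenAI API key (default: '')
  model: string;  // OpenAI model, e.g. 'gpt-3.5-turbo' (the default)
  customPrompts: {
    enable: boolean;           // custom prompts are off by default
    prompts: IPromptOptions[]; // your own prompt definitions
  };
}
```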

## Built-in Prompts

The Logseq AI Assistant plugin comes with several built-in prompts to enhance your text editing experience:

- **Ask AI**: Ask a question, and the AI will provide a helpful answer. The answer will be inserted after the selected text.
- **Summarize**: Provide a concise summary of the selected text. The summary will be added as a property.
- **Make Shorter**: Shorten the selected text while maintaining its key points. The shorter text will replace the original.
- **Make Longer**: Expand the selected text, providing more details and depth. The expanded text will replace the original.
- **Change Tone to Friendly**: Rewrite the selected text with a friendly tone. The friendly version will replace the original.
- **Change Tone to Confident**: Rewrite the selected text with a confident tone. The confident version will replace the original.
- **Change Tone to Casual**: Rewrite the selected text with a casual tone. The casual version will replace the original.
- **Change Tone to Professional**: Rewrite the selected text with a more professional tone. The professional version will replace the original.
- **Explain This**: Provide a clear explanation for the selected text or code snippet. The explanation will be inserted after the selected text.
- **Generate Ideas**: Generate creative ideas related to the selected topic. The ideas will be inserted after the selected text.

See all built-in prompts [here](https://github.com/ahonn/logseq-plugin-ai-assistant/blob/master/src/preset.ts).
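
Under the hood, each built-in prompt is a small object pairing a slash-command name with a prompt template and an output mode. For example, the `Summarize` preset defined in `src/preset.ts` looks roughly like this:

```ts
import { PromptOutputType } from './settings';

// The 'Summarize' preset from src/preset.ts: {{text}} is replaced with the
// content of the current block before the prompt is sent to the model, and
// `output` controls where the answer is written back (here, as a block property).
export const summarize = {
  name: 'Summarize',
  prompt: `
Please provide a concise summary of the following text:
"""
{{text}}
"""
`,
  output: PromptOutputType.property,
};
```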

## How to Use a Custom Prompt

- Open the plugin settings and locate the "customPrompts" field.

- Add your prompt to the "prompts" array and make sure custom prompts are enabled, so that the settings look like this:

```json
{
  "apiKey": "<your-api-key>",
  "model": "gpt-3.5-turbo",
  "customPrompts": {
    "enable": true, // <- Make sure to enable this.
    "prompts": [
      {
        "name": "Markdown Table",
        "prompt": "Please generate a {{text}} Markdown table",
        "output": "replace" // "property", "replace" or "insert"
      }
    ]
  },
  "disabled": false
}
```

- In the Logseq editor, place the cursor in the block where you want to generate the table, then run the prompt from the slash command menu, as shown below:
![](https://user-images.githubusercontent.com/9718515/226259576-a1193b51-8a57-4cad-9270-f5bc30a5ba29.gif)

## Contribution
Issues and PRs are welcome!

## Buy me a coffee

If this plugin helps you and you'd like to show your support, you can [buy me a coffee](https://www.buymeacoffee.com/yuexunjiang).

## Licence
MIT
4 changes: 2 additions & 2 deletions release.config.js
@@ -20,13 +20,13 @@ module.exports = {
"@semantic-release/exec",
{
prepareCmd:
"zip -qq -r logseq-plugin-ai-prompt-${nextRelease.version}.zip dist readme.md logo.svg LICENSE package.json",
"zip -qq -r logseq-plugin-ai-assistant-${nextRelease.version}.zip dist readme.md logo.svg LICENSE package.json",
},
],
[
"@semantic-release/github",
{
assets: "logseq-plugin-ai-prompt-*.zip",
assets: "logseq-plugin-ai-assistant-*.zip",
},
],
],
22 changes: 14 additions & 8 deletions src/main.tsx
@@ -1,5 +1,5 @@
import '@logseq/libs';
import openai from './openai';
import { Configuration, OpenAIApi } from 'openai';
import * as presetPrompts from './preset';
import settings, {
IPromptOptions,
@@ -9,14 +9,20 @@ import settings, {
import { getBlockContent } from './utils';

function main() {
const { customPrompts } = logseq.settings as unknown as ISettings;
const { apiKey, model, customPrompts } = logseq.settings as unknown as ISettings;
const prompts = [...Object.values(presetPrompts)];

if (customPrompts.enable) {
prompts.push(...customPrompts.prompts);
}

prompts.map(({ name, prompt, output }: IPromptOptions) => {
const configuration = new Configuration({
apiKey,
});

const openai = new OpenAIApi(configuration);

logseq.Editor.registerSlashCommand(
name,
async ({ uuid }: { uuid: string }) => {
@@ -29,7 +35,7 @@

const content = await getBlockContent(block);
const completion = await openai.createChatCompletion({
model: 'gpt-3.5-turbo',
model,
messages: [
{
role: 'user',
@@ -43,20 +49,20 @@
if (!message) {
return;
}
const content = message.content.trim();

switch (output) {
case PromptOutputType.property:
await logseq.Editor.updateBlock(
uuid,
block?.content +
`\n ${name.toLowerCase()}:: ${message.content}`,
block?.content + `\n ${name.toLowerCase()}:: ${content}`,
);
break;
case PromptOutputType.appendChild:
await logseq.Editor.insertBlock(uuid, message.content);
case PromptOutputType.insert:
await logseq.Editor.insertBlock(uuid, content);
break;
case PromptOutputType.replace:
await logseq.Editor.updateBlock(uuid, message.content);
await logseq.Editor.updateBlock(uuid, content);
break;
}
}
9 changes: 0 additions & 9 deletions src/openai.ts

This file was deleted.

61 changes: 39 additions & 22 deletions src/preset.ts
@@ -1,9 +1,21 @@
import { PromptOutputType } from "./settings";
import { PromptOutputType } from './settings';

export const ask = {
name: 'Ask AI',
prompt: `
I have a question:
"""
{{text}}
"""
Please provide a helpful answer.
`,
output: PromptOutputType.insert,
};

export const summarize = {
name: 'Summarize',
prompt: `
Summarize the following text:
Please provide a concise summary of the following text:
"""
{{text}}
"""
@@ -14,82 +26,87 @@ export const summarize = {
export const makeShorter = {
name: 'Make Shorter',
prompt: `
Shorten the following text:
Please shorten the following text while maintaining its key points:
"""
{{text}}
"""
`,
output: PromptOutputType.replace,
}
};

export const makeLonger = {
name: 'Make Longer',
prompt: `
Expand the following text:
Please expand the following text, providing more details and depth:
"""
{{text}}
"""
`,
output: PromptOutputType.replace,
}
};

export const changeTone2Friendly = {
name: 'Change Tone to Friendly',
prompt: `
Change tone to friendly of the following text:
Please rewrite the following text with a friendly tone:
"""
{{text}}
"""
`,
output: PromptOutputType.replace,
}
};

export const changeTone2Confident = {
name: 'Change Tone to Confident',
prompt: `
Change tone to confident of the following text:
Please rewrite the following text with a confident tone:
"""
{{text}}
"""
`,
output: PromptOutputType.replace,
}
};

export const changeTone2Casual = {
name: 'Change Tone to Casual',
prompt: `
Change tone to casual of the following text:
Please rewrite the following text with a casual tone:
"""
{{text}}
"""
`,
output: PromptOutputType.replace,
}
};

export const changeTone2Professional = {
name: 'Change Tone to Professional',
prompt: `
Change tone to Professional of the following text:
Please rewrite the following text with a more professional tone:
"""
{{text}}
"""
`,
output: PromptOutputType.replace,
}
};

export const explainThis = {
name: 'Explain This',
prompt: `
Explain This following text or code:
Please provide a clear explanation for the following text or code snippet:
"""
{{text}}
"""
`,
output: PromptOutputType.appendChild,
}
output: PromptOutputType.insert,
};

export const ask = {
name: 'Ask',
prompt: `{{text}}`,
output: PromptOutputType.appendChild,
}
export const generateIdeas = {
name: 'Generate Ideas',
prompt: `
Please generate creative ideas related to the following topic:
"""
{{text}}
"""
`,
output: PromptOutputType.insert,
};
20 changes: 18 additions & 2 deletions src/settings.ts
@@ -3,7 +3,7 @@ import { SettingSchemaDesc } from '@logseq/libs/dist/LSPlugin.user';
export enum PromptOutputType {
property = 'property',
replace = 'replace',
appendChild = 'appendChild',
insert = 'insert',
}

export interface IPromptOptions {
@@ -13,18 +13,34 @@
}

export interface ISettings {
apiKey: string;
model: string;
customPrompts: {
enable: boolean;
prompts: IPromptOptions[];
};
}

const settings: SettingSchemaDesc[] = [
{
key: 'apiKey',
type: 'string',
title: 'API Key',
description: 'Enter your OpenAI API key.',
default: '',
},
{
key: 'model',
type: 'string',
title: 'Model',
description: 'Choose the OpenAI model (e.g., "gpt-3.5-turbo").',
default: 'gpt-3.5-turbo',
},
{
key: 'customPrompts',
type: 'object',
title: 'Custom Prompts',
description: 'Custom Prompts',
description: 'Enable and manage custom prompts.',
default: {
enable: false,
prompts: [],
