
support X AI Grok LLM
eliranwong committed Nov 28, 2024
1 parent 769715a commit f94c634
Showing 11 changed files with 107 additions and 9 deletions.
11 changes: 10 additions & 1 deletion README.md
@@ -12,7 +12,7 @@ This single project has two major interfaces:

Qt-based Multi-Window Desktop Application:

<b>Tested in:</b> <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-Windows">Windows 10</a>, <a href="https://github.com/eliranwong/wsl2/blob/master/bible_apps/desktop.md">Windows WSL2</a>, <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-macOS">macOS [Sierra+]</a> and <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-Linux">Linux</a> (Arch, Debian, Ubuntu & Mint), <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-Chrome-OS">Chrome OS</a> (Debian 10), <a href="https://github.com/eliranwong/UniqueBible/wiki/Android-iOS-Version">Android / iOS</a>
<b>Platforms:</b> <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-Windows">Windows 10</a>, <a href="https://github.com/eliranwong/wsl2/blob/master/bible_apps/desktop.md">Windows WSL2</a>, <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-macOS">macOS [Sierra+]</a> and <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-Linux">Linux</a> (Arch, Debian, Ubuntu & Mint), <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-Chrome-OS">Chrome OS</a> (Debian 10), <a href="https://github.com/eliranwong/UniqueBible/wiki/Android-iOS-Version">Android / iOS</a>

Unique Bible App can <a href="https://github.com/eliranwong/UniqueBible/wiki/UBA-Run-Modes">run in different modes</a>, both online and offline, for example:

@@ -24,6 +24,15 @@ Unique Bible App can <a href="https://github.com/eliranwong/UniqueBible/wiki/UBA
> <a href="https://github.com/eliranwong/UniqueBible/wiki/UBA-Run-Modes">other modes ...</a>
# AI Features

AI features have been integrated into the UniqueBible App. Five backends are supported:
1. OpenAI / ChatGPT
2. Google AI / Gemini
3. X AI / Grok
4. Groq Cloud API
5. Mistral AI API

# Development Team

Eliran Wong (https://github.com/eliranwong)
2 changes: 1 addition & 1 deletion setup.py
@@ -47,7 +47,7 @@
# https://packaging.python.org/en/latest/guides/distributing-packages-using-setuptools/
setup(
name=package,
version="0.2.3",
version="0.2.4",
python_requires=">=3.8, <3.13",
description=f"UniqueBible App is a cross-platform & offline bible application, integrated with high-quality resources and unique features. Developers: Eliran Wong and Oliver Tseng",
long_description=long_description,
11 changes: 10 additions & 1 deletion uniquebible/README.md
@@ -12,7 +12,7 @@ This single project has two major interfaces:

Qt-based Multi-Window Desktop Application:

<b>Tested in:</b> <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-Windows">Windows 10</a>, <a href="https://github.com/eliranwong/wsl2/blob/master/bible_apps/desktop.md">Windows WSL2</a>, <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-macOS">macOS [Sierra+]</a> and <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-Linux">Linux</a> (Arch, Debian, Ubuntu & Mint), <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-Chrome-OS">Chrome OS</a> (Debian 10), <a href="https://github.com/eliranwong/UniqueBible/wiki/Android-iOS-Version">Android / iOS</a>
<b>Platforms:</b> <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-Windows">Windows 10</a>, <a href="https://github.com/eliranwong/wsl2/blob/master/bible_apps/desktop.md">Windows WSL2</a>, <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-macOS">macOS [Sierra+]</a> and <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-Linux">Linux</a> (Arch, Debian, Ubuntu & Mint), <a href="https://github.com/eliranwong/UniqueBible/wiki/Install-on-Chrome-OS">Chrome OS</a> (Debian 10), <a href="https://github.com/eliranwong/UniqueBible/wiki/Android-iOS-Version">Android / iOS</a>

Unique Bible App can <a href="https://github.com/eliranwong/UniqueBible/wiki/UBA-Run-Modes">run in different modes</a>, both online and offline, for example:

@@ -24,6 +24,15 @@ Unique Bible App can <a href="https://github.com/eliranwong/UniqueBible/wiki/UBA
> <a href="https://github.com/eliranwong/UniqueBible/wiki/UBA-Run-Modes">other modes ...</a>
# AI Features

AI features have been integrated into the UniqueBible App. Five backends are supported:
1. OpenAI / ChatGPT
2. Google AI / Gemini
3. X AI / Grok
4. Groq Cloud API
5. Mistral AI API

# Development Team

Eliran Wong (https://github.com/eliranwong)
17 changes: 16 additions & 1 deletion uniquebible/__init__.py
@@ -131,7 +131,7 @@ def isServerAlive(ip, port):
import unicodedata, traceback, markdown
from uniquebible.util.BibleVerseParser import BibleVerseParser

config.llm_backends = ["openai", "google", "groq", "mistral"]
config.llm_backends = ["openai", "google", "grok", "groq", "mistral"]

def is_CJK(self, text):
for char in text:
@@ -146,6 +146,8 @@ def isLLMReady(backend=""):
return True
elif backend == "mistral" and config.mistralApi_key:
return True
elif backend == "grok" and config.grokApi_key:
return True
elif backend == "groq" and config.groqApi_key:
return True
elif backend == "google" and config.googleaiApi_key:
@@ -217,6 +219,19 @@ def getChatResponse(backend, chatMessages) -> Optional[str]:
max_tokens=config.openaiApi_chat_model_max_tokens,
stream=False,
)
elif backend == "grok":
grokClient = OpenAI(
api_key=config.grokApi_key,
base_url="https://api.x.ai/v1",
)
completion = grokClient.chat.completions.create(
model=config.grokApi_chat_model,
messages=chatMessages,
n=1,
temperature=config.grokApi_llmTemperature,
max_tokens=config.grokApi_chat_model_max_tokens,
stream=False,
)
elif backend == "google":
# https://ai.google.dev/gemini-api/docs/openai
googleaiClient = OpenAI(
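The `grok` branch added to `getChatResponse` above reuses the OpenAI SDK, pointed at x.ai's OpenAI-compatible endpoint. The sketch below builds the same request by hand to make the wire format visible; the base URL, model name `grok-beta`, and parameters are taken from this commit, while `build_grok_request` itself is a hypothetical helper, not part of UniqueBible:

```python
import json

XAI_BASE_URL = "https://api.x.ai/v1"  # OpenAI-compatible endpoint used by this commit

def build_grok_request(api_key, messages, model="grok-beta",
                       temperature=0.3, max_tokens=127999, stream=False):
    """Assemble the HTTP pieces of a Grok chat completion.

    Mirrors the arguments getChatResponse() passes to the OpenAI client; the
    same payload could be POSTed with urllib, or passed as keyword arguments to
    OpenAI(api_key=..., base_url=XAI_BASE_URL).chat.completions.create().
    """
    url = f"{XAI_BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": messages,
        "n": 1,
        "temperature": temperature,
        "max_tokens": max_tokens,
        "stream": stream,
    }
    return url, headers, json.dumps(payload)

url, headers, body = build_grok_request(
    "XAI_API_KEY_HERE",  # placeholder; config.grokApi_key in UniqueBible
    [{"role": "user", "content": "Hello"}],
)
```

Because the endpoint is OpenAI-compatible, everything except the base URL and key is identical to the existing `openai` branch, which is why the commit can support Grok without a new client library.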
13 changes: 13 additions & 0 deletions uniquebible/gui/Worker.py
@@ -197,6 +197,19 @@ def getMistralApi_key():
max_tokens=config.googleaiApi_chat_model_max_tokens,
stream=True,
)
elif config.llm_backend == "grok":
grokClient = OpenAI(
api_key=config.grokApi_key,
base_url="https://api.x.ai/v1",
)
return grokClient.chat.completions.create(
model=config.grokApi_chat_model,
messages=thisMessage,
n=1,
temperature=config.grokApi_llmTemperature,
max_tokens=config.grokApi_chat_model_max_tokens,
stream=True,
)
elif config.llm_backend == "mistral":
return Mistral(api_key=getMistralApi_key()).chat.stream(
model=config.mistralApi_chat_model,
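Unlike `getChatResponse`, the `Worker.py` branch above passes `stream=True` and hands the raw stream back to the GUI, which then concatenates per-chunk deltas. The sketch below shows that consuming loop against a stubbed stream (the chunk shape follows the OpenAI SDK's streaming format; `fake_grok_stream` is our stand-in, not a real client):

```python
from types import SimpleNamespace

def fake_grok_stream(text):
    """Stand-in for grokClient.chat.completions.create(..., stream=True):
    yields chunk objects whose choices[0].delta.content holds one fragment."""
    for fragment in text.split():
        delta = SimpleNamespace(content=fragment + " ")
        yield SimpleNamespace(choices=[SimpleNamespace(delta=delta)])

def collect_stream(stream):
    # The same loop shape works on real OpenAI-compatible stream objects.
    pieces = []
    for chunk in stream:
        content = chunk.choices[0].delta.content
        if content:  # a final chunk's delta.content may be None
            pieces.append(content)
    return "".join(pieces)

reply = collect_stream(fake_grok_stream("In the beginning"))
```

Streaming lets the chat window render tokens as they arrive instead of blocking until the full completion is ready.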
4 changes: 4 additions & 0 deletions uniquebible/latest_changes.txt
@@ -1,5 +1,9 @@
PIP package:

0.2.4

* added support for the X AI Grok model

0.2.1-0.2.3

* added support for xonsh auto-completions
37 changes: 32 additions & 5 deletions uniquebible/plugins/menu/Bible Chat.py
@@ -64,6 +64,8 @@ def __init__(self, parent=None):
self.apiKeyEdit = QLineEdit(config.googleaiApi_key)
elif config.llm_backend == "mistral":
self.apiKeyEdit = QLineEdit(str(config.mistralApi_key))
elif config.llm_backend == "grok":
self.apiKeyEdit = QLineEdit(str(config.grokApi_key))
elif config.llm_backend == "groq":
self.apiKeyEdit = QLineEdit(str(config.groqApi_key))
self.apiKeyEdit.setEchoMode(QLineEdit.Password)
@@ -90,6 +92,12 @@ def __init__(self, parent=None):
if key == config.mistralApi_chat_model:
initialIndex = index
index += 1
elif config.llm_backend == "grok":
for key in ("grok-beta",):
self.apiModelBox.addItem(key)
if key == config.grokApi_chat_model:
initialIndex = index
index += 1
elif config.llm_backend == "groq":
for key in ("gemma2-9b-it", "gemma-7b-it", "llama-3.1-70b-versatile", "llama-3.1-8b-instant", "llama-3.2-1b-preview", "llama-3.2-3b-preview", "llama-3.2-11b-vision-preview", "llama-3.2-90b-vision-preview", "llama3-70b-8192", "llama3-8b-8192", "mixtral-8x7b-32768"):
self.apiModelBox.addItem(key)
@@ -121,6 +129,8 @@ def __init__(self, parent=None):
self.maxTokenEdit = QLineEdit(str(config.googleaiApi_chat_model_max_tokens))
elif config.llm_backend == "mistral":
self.maxTokenEdit = QLineEdit(str(config.mistralApi_chat_model_max_tokens))
elif config.llm_backend == "grok":
self.maxTokenEdit = QLineEdit(str(config.grokApi_chat_model_max_tokens))
elif config.llm_backend == "groq":
self.maxTokenEdit = QLineEdit(str(config.groqApi_chat_model_max_tokens))
self.maxTokenEdit.setToolTip("The maximum number of tokens to generate in the completion.\nThe token count of your prompt plus max_tokens cannot exceed the model's context length. Most models have a context length of 2048 tokens (except for the newest models, which support 4096).")
@@ -476,13 +486,15 @@ def setupUI(self):
self.backends.setCurrentIndex(0)
elif config.llm_backend == "google":
self.backends.setCurrentIndex(1)
elif config.llm_backend == "groq":
elif config.llm_backend == "grok":
self.backends.setCurrentIndex(2)
elif config.llm_backend == "mistral":
elif config.llm_backend == "groq":
self.backends.setCurrentIndex(3)
elif config.llm_backend == "mistral":
self.backends.setCurrentIndex(4)
else:
config.llm_backend = "groq"
self.backends.setCurrentIndex(2)
self.backends.setCurrentIndex(3)
self.fontSize = QComboBox()
self.fontSize.addItems([str(i) for i in range(1, 51)])
self.fontSize.setCurrentIndex((config.chatGPTFontSize - 1))
@@ -492,6 +504,8 @@ def setupUI(self):
self.temperature.setCurrentIndex(int(config.openaiApi_llmTemperature * 10))
elif config.llm_backend == "google":
self.temperature.setCurrentIndex(int(config.googleaiApi_llmTemperature * 10))
elif config.llm_backend == "grok":
self.temperature.setCurrentIndex(int(config.grokApi_llmTemperature * 10))
elif config.llm_backend == "groq":
self.temperature.setCurrentIndex(int(config.groqApi_llmTemperature * 10))
elif config.llm_backend == "mistral":
@@ -700,6 +714,8 @@ def showApiDialog(self):
config.openaiApi_key = dialog.api_key()
elif config.llm_backend == "google":
config.googleaiApi_key = dialog.api_key()
elif config.llm_backend == "grok":
config.grokApi_key = dialog.api_key()
elif config.llm_backend == "mistral":
config.mistralApi_key = dialog.api_key()
try:
@@ -727,6 +743,10 @@ def showApiDialog(self):
config.googleaiApi_chat_model_max_tokens = int(dialog.max_token())
if config.googleaiApi_chat_model_max_tokens < 20:
config.googleaiApi_chat_model_max_tokens = 20
elif config.llm_backend == "grok":
config.grokApi_chat_model_max_tokens = int(dialog.max_token())
if config.grokApi_chat_model_max_tokens < 20:
config.grokApi_chat_model_max_tokens = 20
elif config.llm_backend == "mistral":
config.mistralApi_chat_model_max_tokens = int(dialog.max_token())
if config.mistralApi_chat_model_max_tokens < 20:
@@ -753,6 +773,8 @@ def showApiDialog(self):
config.openaiApi_chat_model = dialog.apiModel()
elif config.llm_backend == "google":
config.googleaiApi_chat_model = dialog.apiModel()
elif config.llm_backend == "grok":
config.grokApi_chat_model = dialog.apiModel()
elif config.llm_backend == "mistral":
config.mistralApi_chat_model = dialog.apiModel()
elif config.llm_backend == "groq":
@@ -768,7 +790,7 @@ def showApiDialog(self):
self.parent.reloadMenubar()
config.mainWindow.runBibleChatPlugins()
#config.chatGPTApiPredefinedContext = dialog.predefinedContext()
config.chatGPTApiContextInAllInputs = dialog.contextInAllInputs()
#config.chatGPTApiContextInAllInputs = dialog.contextInAllInputs()
config.chatGPTApiContext = dialog.context()
#config.chatGPTApiAudioLanguage = dialog.language()
self.newData()
@@ -782,13 +804,17 @@ def updateBackend(self, index):
elif index == 1:
config.llm_backend = "google"
elif index == 2:
config.llm_backend = "groq"
config.llm_backend = "grok"
elif index == 3:
config.llm_backend = "groq"
elif index == 4:
config.llm_backend = "mistral"

def updateTemperature(self, index):
if config.llm_backend == "mistral":
config.mistralApi_llmTemperature = float(index / 10)
elif config.llm_backend == "grok":
config.grokApi_llmTemperature = float(index / 10)
elif config.llm_backend == "groq":
config.groqApi_llmTemperature = float(index / 10)
elif config.llm_backend == "openai":
@@ -973,6 +999,7 @@ def newData(self):
1) Register and get an API key in one of the following websites:
OpenAI - https://platform.openai.com/account/api-keys
Google - https://ai.google.dev/
Grok - https://docs.x.ai/docs
Groq - https://console.groq.com/keys
Mistral - https://console.mistral.ai/api-keys/
2) Select a backend below
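The `setupUI`/`updateBackend` changes above keep the combo-box index and backend name in sync through parallel if/elif chains, which is exactly where the stray `==` comparison slipped in and where indices had to be renumbered to insert `grok`. A table-driven alternative avoids both hazards (a sketch; `BACKENDS` mirrors `config.llm_backends` from this commit, and the helper names are ours):

```python
# Mirrors config.llm_backends after this commit; list position doubles
# as the combo-box index, so inserting a backend renumbers nothing by hand.
BACKENDS = ["openai", "google", "grok", "groq", "mistral"]
DEFAULT_BACKEND = "groq"

def backend_to_index(name):
    """Combo-box index for a backend name, falling back to the default."""
    return BACKENDS.index(name if name in BACKENDS else DEFAULT_BACKEND)

def index_to_backend(index):
    """Backend name for a combo-box index."""
    return BACKENDS[index]
```

With this shape, `self.backends.setCurrentIndex(backend_to_index(config.llm_backend))` and `config.llm_backend = index_to_backend(index)` replace the two elif ladders, and adding a sixth backend is a one-line list edit.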
4 changes: 4 additions & 0 deletions uniquebible/startup/nonGui.py
@@ -227,6 +227,7 @@ def run_terminal_mode():
# api-client mode

def run_api_client_mode():
cwd = os.getcwd()

def getApiOutput(command: str):
private = f"private={config.web_api_private}&" if config.web_api_private else ""
@@ -373,9 +374,11 @@ def changeSettings():
#import traceback
#print(traceback.format_exc())
print(f"Failed to connect '{config.web_api_endpoint}' at the moment!")
os.chdir(cwd)

# stream mode
def run_stream_mode():
cwd = os.getcwd()
# standard input
stdin_text = sys.stdin.read() if not sys.stdin.isatty() else ""

@@ -393,6 +396,7 @@ def run_stream_mode():
# run terminal mode if no command is given
config.runMode = "terminal"
run_terminal_mode()
os.chdir(cwd)

# ssh-server
# read setup guide at https://github.com/eliranwong/UniqueBible/wiki/Run-SSH-Server
17 changes: 17 additions & 0 deletions uniquebible/util/ConfigUtil.py
@@ -205,6 +205,10 @@ def updateModules(module, isInstalled):
# config.mistralApi_llmTemperature
# config.mistralApi_chat_model
# config.mistralApi_chat_model_max_tokens
# config.grokApi_key
# config.grokApi_llmTemperature
# config.grokApi_chat_model
# config.grokApi_chat_model_max_tokens
# config.openaiApi_key
# config.openaiApi_llmTemperature
# config.openaiApi_chat_model
@@ -282,6 +286,19 @@ def updateModules(module, isInstalled):
setConfig("groqApi_llmTemperature", """
# Groq Chat Temperature""",
0.3) # 0.2-0.8 is good to use
setConfig("grokApi_key", """
# Grok X AI API Keys""",
"")
setConfig("grokApi_chat_model", """
# Grok X AI Chat Model""",
"grok-beta")
setConfig("grokApi_chat_model_max_tokens", """
# Grok X AI Chat Maximum Output Tokens""",
127999) # maximum 127999, greater than this value causes errors
setConfig("grokApi_llmTemperature", """
# Grok X AI Chat Temperature""",
0.3)
# mistral
setConfig("mistralApi_key", """
# Mistral AI API Keys""",
"")
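The `ConfigUtil.py` hunk above registers each new `grokApi_*` option through `setConfig`, which only applies a default when no saved value exists. A minimal sketch of that pattern (a hypothetical stand-in, not UniqueBible's actual implementation; the option names and defaults are the ones this commit adds):

```python
class Config:
    """Minimal stand-in for the uniquebible config module."""
    pass

config = Config()

def set_config(name, default):
    """Apply a default only when the attribute is not already set, so values
    restored from a saved config file always win over shipped defaults."""
    if not hasattr(config, name):
        setattr(config, name, default)

# Grok defaults as registered by this commit
set_config("grokApi_key", "")
set_config("grokApi_chat_model", "grok-beta")
set_config("grokApi_chat_model_max_tokens", 127999)  # per the commit, larger values error
set_config("grokApi_llmTemperature", 0.3)
```

Because defaults never overwrite existing attributes, upgrading to a release that adds these four options leaves a user's saved keys and model choices untouched.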
File renamed without changes.
File renamed without changes.
