This repository contains the code for a simple Telegram bot that integrates with the Ollama API to process user input and return intelligent responses. The bot is built with the `node-telegram-bot-api` library and uses `axios` for HTTP requests.
- Accepts user input via Telegram chat.
- Sends user input to the Ollama API for processing.
- Returns responses generated by the Ollama API to the Telegram chat.
- Handles basic commands such as `/start` and `/stop`.
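As a minimal sketch of how the command handling above might look with `node-telegram-bot-api` (the helper name `routeCommand` and the exact `/stop` reply are illustrative assumptions, not code from this repository):

```javascript
// Hypothetical command router: maps /start and /stop to canned replies.
// Anything else returns null so it can be forwarded to the Ollama API.
function routeCommand(text) {
  if (text === '/start') return 'Welcome, Send Your First Question or Prompt!';
  if (text === '/stop') return 'Bot stopped. Send /start whenever you want to continue.';
  return null;
}

// Wiring it into the bot would look roughly like this (needs a real token):
// const TelegramBot = require('node-telegram-bot-api');
// const bot = new TelegramBot(process.env.TELEGRAM_TOKEN, { polling: true });
// bot.on('message', (msg) => {
//   const reply = routeCommand(msg.text);
//   if (reply) bot.sendMessage(msg.chat.id, reply);
// });
```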
- Node.js: Ensure you have Node.js installed on your system.
- Telegram Bot Token: Obtain a bot token by creating a bot using BotFather.
- Ollama API: Ensure you have access to the Ollama API and know the endpoint URL.
- Clone the repository:

  ```bash
  git clone https://github.com/AsbDaryaee/telegram-ollama-bot.git
  cd telegram-ollama-bot
  ```
- Install dependencies:

  ```bash
  npm install
  # or
  pnpm i
  ```
- Create a `.env` file in the root directory and add the following environment variables:

  ```env
  TELEGRAM_TOKEN=your-telegram-bot-token
  OLLAMA_API_URL=http://localhost:11434/api/generate # Replace with your Ollama API endpoint
  PROXY=http://your-proxy-url # Optional, if you need a proxy
  ```
- Run the bot:

  ```bash
  node server.js
  ```

  The bot will start polling Telegram for messages.
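Before the bot starts polling, the environment variables from the setup steps can be validated so a missing value fails fast. A sketch, assuming a helper named `loadConfig` (not part of the repository) and that something like `require('dotenv').config()` has already populated `process.env`:

```javascript
// Hypothetical config loader: checks the variables described in the .env
// step and throws a clear error when a required one is missing.
function loadConfig(env) {
  const missing = ['TELEGRAM_TOKEN', 'OLLAMA_API_URL'].filter((k) => !env[k]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(', ')}`);
  }
  return {
    token: env.TELEGRAM_TOKEN,
    ollamaUrl: env.OLLAMA_API_URL,
    proxy: env.PROXY || null, // PROXY is optional
  };
}
```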
- Start the bot by sending the `/start` command in the Telegram chat.
- Send your questions or prompts to the bot.
- The bot processes your input using the Ollama API and replies with the generated response.
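The request the bot sends to Ollama's `/api/generate` endpoint could be built like this; the helper name `buildGeneratePayload` and the default model `llama3` are assumptions for illustration, not taken from the repository:

```javascript
// Hypothetical payload builder for Ollama's /api/generate endpoint.
function buildGeneratePayload(prompt, model = 'llama3') {
  return {
    model,         // which local model Ollama should run
    prompt,        // the user's message text
    stream: false, // request one complete response instead of a token stream
  };
}

// Sending it with axios would look roughly like this (needs a running Ollama server):
// const axios = require('axios');
// const res = await axios.post(process.env.OLLAMA_API_URL, buildGeneratePayload(msg.text));
// await bot.sendMessage(msg.chat.id, res.data.response);
```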
```
.
├── index.js      # Main script for the bot
├── package.json  # Project dependencies and metadata
├── .env          # Environment variables (not included in the repository)
```
```
User: /start
Bot:  Welcome, Send Your First Question or Prompt!
User: What is the capital of France?
Bot:  The capital of France is Paris.
```
- If the bot encounters an issue communicating with the Ollama API, it sends the following message to the user: "Sorry, I couldn't process your message. Please try again later."
- If a non-text message is sent, the bot responds with: "I can only process text messages."
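The fallback behaviour above can be expressed as a small reply selector; the function name `fallbackReply` is illustrative, not from the repository:

```javascript
// Hypothetical reply selector mirroring the error handling described above:
// non-text messages get one canned reply, API failures another.
function fallbackReply(msg, apiError) {
  if (typeof msg.text !== 'string') {
    return 'I can only process text messages.';
  }
  if (apiError) {
    return "Sorry, I couldn't process your message. Please try again later.";
  }
  return null; // normal path: the answer comes from the Ollama API
}
```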
This project is licensed under the MIT License. See the LICENSE file for details.
Contributions are welcome! Feel free to open issues or submit pull requests.