Help Me With This is an AI-powered Telegram bot that turns messages into answers. You can forward messages or reply to the bot, and it will respond to you based on the initial instructions (see `OPENAI_PROMPT`). The current implementation relies on ChatGPT as its language model backend.
- Summarize provided content. You can either share messages from other channels or paste text directly to the bot, but keep the maximum token limit in mind
- Explain a foreign word, including what it means and the contexts in which it can be applied
- Compare two or more foreign words and describe the contexts in which they are used
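To make the flow above concrete, here is a minimal sketch of how a message could travel from Telegram to the model and back. It assumes the official `openai` SDK and a Telegraf-style handler; the actual implementation in this repository may be organized differently.

```typescript
import { Telegraf } from "telegraf";
import { message } from "telegraf/filters";
import OpenAI from "openai";

// Minimal sketch, not the repository's actual code.
const bot = new Telegraf(process.env.TELEGRAM_BOT_TOKEN!);
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

bot.on(message("text"), async (ctx) => {
  // OPENAI_PROMPT carries the initial instructions; here it is passed as a
  // system message, which is equivalent to prepending it to the user text.
  const completion = await openai.chat.completions.create({
    model: process.env.OPENAI_MODEL ?? "gpt-4.1",
    messages: [
      { role: "system", content: process.env.OPENAI_PROMPT ?? "" },
      { role: "user", content: ctx.message.text },
    ],
  });

  await ctx.reply(completion.choices[0]?.message?.content ?? "(no response)");
});

bot.launch();
```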
- Any messages from users
- Any responses from ChatGPT API
- The user's Telegram ID for user identification
- LLM usage statistics:
  - Used model name (string). For example, `gpt-5`, `gpt-4.1`
  - Number of tokens in the prompt (number)
  - Number of tokens in the generated completion (number)
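As a rough illustration of the shapes behind this list (field names here are hypothetical; the actual entities are defined in the repository's source), the stored data might look like:

```typescript
// Hypothetical field names for illustration; consult the repo's entities for the real schema.
interface StoredExchange {
  telegramUserId: number;    // the user's Telegram ID, used for identification
  userMessage: string;       // message received from the user
  chatGptResponse: string;   // response returned by the ChatGPT API
}

interface LlmUsageStat {
  model: string;             // used model name, e.g. "gpt-5" or "gpt-4.1"
  promptTokens: number;      // number of tokens in the prompt
  completionTokens: number;  // number of tokens in the generated completion
}
```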
- Node.js >= 20
- pnpm >= 9
- PostgreSQL >= 17
- OpenAI API key
- Telegram Bot token (@BotFather). For convenience, it is recommended to create two bots (keys): one for local development and one for production
- Make (not mandatory but highly recommended)
Fork and clone the repository
```
git clone git@github.com:nlevchuk/help-me-with-this-ai-bot.git
```

Install dependencies

```
make install
```

Configure environment variables. Copy the `.env.template` file to `.env` at the project root and supply the following variables
| Variable | Description |
|---|---|
| `OPENAI_PROMPT`* | Initial instructions for ChatGPT. These instructions define the bot's purpose and are prepended to every message before being sent to ChatGPT |
| `TELEGRAM_BOT_TOKEN`, `TELEGRAM_BOT_USERNAME` | Bot token and username obtained from @BotFather |
| `OPENAI_API_KEY` | API key used to call the OpenAI completions API |
| `OPENAI_MODEL`, `OPENAI_TEMPERATURE`, `OPENAI_MAX_TOKENS`, `OPENAI_TOP_P`, `OPENAI_FREQ_PENALTY`, `OPENAI_PRES_PENALTY` | Model configuration for OpenAI prompts |
| `TRANSLATION_LANGUAGES` | Comma-separated list of translation languages for the translation plugin |
| `TRANSLATION_PROMPT` | Base prompt for the translation plugin |
| `POSTGRES_HOST`, `POSTGRES_PORT`, `POSTGRES_USER`, `POSTGRES_PASSWORD`, `POSTGRES_DB` | Connection details for the PostgreSQL database |
| `POSTGRES_CA_CERT_PATH` | Path to the CA certificate |
| `HONEYBADGER_API_KEY` | Error reporting |
* Examples of initial instructions:
- "Summarize the following text into two or three sentences"
- "Check that the following sentences are grammatically correct and sound natural in Spanish"
- "Get the main idea behind the text"
The bot authenticates users with tokens to prevent unexpected usage. Authentication can be omitted depending on the purpose of the bot, but it is recommended when you plan to use a paid language model backend such as ChatGPT.
To add a new user, generate an invitation token link and share it with them. Run the command locally:
```
make generate-invitation-link
```

You will see a link similar to `https://t.me/<YOUR_BOT_NAME>?start=<INVITATION_TOKEN>`
Then connect to the database using your preferred tool and add a new record to the `InvitationTokens` table. Use the `INVITATION_TOKEN` part from the generated link and place it in the `token` column (a sketch of such an insert follows the list below). Additionally, you can specify extra parameters for the token:
- `allowedMultipleInvites=TRUE|FALSE` (default: `FALSE`): Allow multiple users to connect to your bot using the token URL. When `FALSE`, the link can only be used once
- `defaultApiCallsLimit=number`: If set, defines the maximum number of messages a user can send through your bot per day
- `defaultMaxMessageLength=number`: If set, defines the maximum length of a message sent to your bot. The bot shows an appropriate message to the user if the limit is exceeded
- `comment=string` (max length: 256): A comment to remind yourself what the token is for
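For example, the record could be added with a query along these lines (a sketch only: the column casing and the example values are assumptions; verify them against the project's migrations):

```typescript
import { Client } from "pg";

// Sketch of adding an invitation token; connection details come from the POSTGRES_* variables.
const client = new Client({
  host: process.env.POSTGRES_HOST,
  port: Number(process.env.POSTGRES_PORT),
  user: process.env.POSTGRES_USER,
  password: process.env.POSTGRES_PASSWORD,
  database: process.env.POSTGRES_DB,
});

async function addInvitationToken(token: string): Promise<void> {
  await client.connect();
  await client.query(
    `INSERT INTO "InvitationTokens"
       (token, "allowedMultipleInvites", "defaultApiCallsLimit", "defaultMaxMessageLength", comment)
     VALUES ($1, $2, $3, $4, $5)`,
    [token, false, 50, 2000, "invite for a friend"] // placeholder values
  );
  await client.end();
}

// Use the <INVITATION_TOKEN> part from the generated link.
addInvitationToken("<INVITATION_TOKEN>").catch(console.error);
```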
Compile the TypeScript sources:
```
make build
```

Run database migrations once the data source is configured:

```
make db-migrate
```

Rollbacks are available through `make db-rollback`.
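The migration step assumes a configured data source. The wording suggests TypeORM, but this README does not confirm it, so treat the following as a hedged sketch of how the `POSTGRES_*` variables could feed such a configuration:

```typescript
import { readFileSync } from "node:fs";
import { DataSource } from "typeorm";

// Hedged sketch only; the repository's real data source configuration may differ.
export const AppDataSource = new DataSource({
  type: "postgres",
  host: process.env.POSTGRES_HOST,
  port: Number(process.env.POSTGRES_PORT),
  username: process.env.POSTGRES_USER,
  password: process.env.POSTGRES_PASSWORD,
  database: process.env.POSTGRES_DB,
  // POSTGRES_CA_CERT_PATH points at the CA certificate used for TLS connections.
  ssl: process.env.POSTGRES_CA_CERT_PATH
    ? { ca: readFileSync(process.env.POSTGRES_CA_CERT_PATH, "utf8") }
    : undefined,
  migrations: ["dist/migrations/*.js"], // assumed path to compiled migrations
});
```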
Start the bot:
```
make start
```

Running the bot in the cloud is straightforward. The default workflow in the project uses GitHub Actions to build the image, push it to the container registry, and roll the container out to the server.
The project uses Ansible for the initial setup. You can either use the playbooks listed in the repo or follow the Lightsail instance setup; both approaches are supported.
Next, the server should be prepared to run the bot. Use the playbook in the current repository for this.
Prepare the necessary files:
- Copy `group_vars/all/vault.yml.template` to `group_vars/all/vault.yml` and update the `cr_token` variable. To use GitHub Container Registry, create a Personal Access Token with the `read:packages` permission at https://github.com/settings/tokens/new. Store the token (it starts with `ghp_***`) in `cr_token`. For security, encrypt the vault as described in its README file. Remember the password or store it in the `vault_passwd.txt` file (listed in `.gitignore`). But shhhh, nobody should know about it :)
- Copy `inventory.ini.template` to `inventory.ini` and replace `<IP>` and `<USERNAME>` with your server's IP address and your container registry username
- Populate the `files/` directory with:
  - `certs/ca.pem` issued by your database provider
  - `.env` containing production environment variables
  - `.tmux.conf` (can be empty if tmux is unused)
Run the playbook:
- From the repository root, execute `make run-ansible`. The playbook uses podman by default. To use docker instead, run `ansible-playbook -i ansible/inventory.ini --vault-password-file ansible/vault_passwd.txt ansible/playbook-docker.yml`
Review the output:
- Follow the final instructions printed by Ansible
Once the previous steps are complete, configure GitHub Actions by adding the following secrets at https://github.com/nlevchuk/help-me-with-this-ai-bot/settings/secrets/actions:
- SERVER_HOST: IP address of your server
- SERVER_USERNAME: User that runs the app (default: ubuntu)
- SERVER_KEY: Contents of the private SSH key, as described in the final Ansible output
- SERVER_KNOWN_HOSTS: Contents of the `known_hosts` file mentioned in the final Ansible output
- Describe the message translator feature and how it works (quoted messages, replied messages)
- Describe the message removal feature and how it works
- Modify `scripts/run_container.sh` to ensure it terminates if any command fails (`set -e`)
- Add support for more LLMs
- Allow translating replies and quotes into languages other than the default on the fly; use `/translate <Language>` (repo)
- When the Translator plugin is enabled, show system messages in the user's language