A Telegram bot for bill payments powered by the Interswitch Bills Payment API, FastAPI, and a local LLM (Llama). This bot provides an intelligent, conversational interface for users to pay bills directly through Telegram.
⚠️ Note: This project was vibe-coded and may contain bugs. Contributions are welcome!
- 💬 Conversational bill payment interface via Telegram
- 🤖 Local LLM integration (Llama 3.2) for natural language processing
- 💳 Interswitch Bills Payment API integration
- 🔄 Redis-based session management
- 🚀 FastAPI backend with webhook support
- 🐳 Dockerized deployment
- Backend Framework: FastAPI
- Bot Framework: python-telegram-bot
- LLM: Llama 3.2 (1B model via Ollama)
- Payment Gateway: Interswitch Bills Payment API
- Session Storage: Redis
- Package Management: Poetry
- Containerization: Docker & Docker Compose
- Development Tunneling: ngrok
```
telegram-payment-bot/
├── app/
│   ├── core/
│   │   ├── config.py                  # Application configuration
│   │   ├── deps.py                    # Dependency injection
│   │   ├── exceptions.py              # Custom exceptions
│   │   ├── logging.py                 # Logging setup
│   │   └── states.py                  # Bot conversation states
│   ├── interface/
│   │   ├── telegram/
│   │   │   ├── bot.py                 # Telegram bot initialization
│   │   │   ├── handlers.py            # General message handlers
│   │   │   └── handlers_interswitch.py  # Payment-specific handlers
│   │   └── web/
│   │       ├── route.py               # Web routes
│   │       └── telegram.py            # Telegram webhook endpoint
│   ├── schemas/
│   │   ├── bill_payment_intent.py
│   │   ├── intent.py
│   │   ├── interswitch.py
│   │   ├── session.py
│   │   └── transaction.py
│   ├── services/
│   │   ├── bill_payment_service.py
│   │   ├── interswitch_service.py
│   │   ├── llm_service.py
│   │   ├── payment_service.py
│   │   └── session_service.py
│   └── main.py                        # Application entry point
├── Dockerfile
├── docker-compose.yml
├── pyproject.toml
├── poetry.lock
└── .env.example
```
- Docker and Docker Compose
- Ollama with Llama 3.2 model installed locally
- Telegram Bot Token (from @BotFather)
- Interswitch API credentials (from Interswitch API Marketplace)
- ngrok account (for local development webhook)
```bash
git clone <your-repo-url>
cd telegram-payment-bot
```

```bash
# Install Ollama (macOS)
curl -fsSL https://ollama.com/install.sh | sh

# Pull Llama 3.2 1B model
ollama pull llama3.2:1b

# Verify installation
ollama list
```

- Visit the Interswitch API Marketplace
- Register/Login to your account
- Create a new application
- Copy your `Client ID` and `Client Secret`, and note the `Base URL`
- Message @BotFather on Telegram
- Send `/newbot` and follow the instructions
- Copy the bot token provided
```bash
# Install ngrok
brew install ngrok  # macOS
# or download from https://ngrok.com/download

# Authenticate (get auth token from ngrok.com)
ngrok config add-authtoken <your-auth-token>

# Start ngrok tunnel
ngrok http 8000
```

Copy the HTTPS forwarding URL (e.g., `https://your-subdomain.ngrok-free.app`).
```bash
cp .env.example .env
```

Edit the `.env` file with your credentials:
```env
# App Config
ENV=development
LOG_LEVEL=INFO

# Telegram
TELEGRAM_BOT_TOKEN=your_bot_token_here
TELEGRAM_SECRET_TOKEN=your_secret_token_here  # Generate a random secure string

# Redis
REDIS_URL=redis://redis:6379/0

# Webhook
WEBHOOK_URL=https://your-ngrok-url.ngrok-free.app/api/webhook

# Interswitch
INTERSWITCH_CLIENT_ID=your_client_id_here
INTERSWITCH_SECRET=your_client_secret_here
INTERSWITCH_BASE_URL=https://sandbox.interswitchng.com/

# LLM Config
LLM_API_URL=http://host.docker.internal:11434/api/generate
LLM_MODEL=llama3.2:1b
```

**Important:**
- Replace `TELEGRAM_BOT_TOKEN` with your actual bot token
- Replace `WEBHOOK_URL` with your ngrok HTTPS URL + `/api/webhook`
- Replace the Interswitch credentials with your actual credentials
- Generate a secure random string for `TELEGRAM_SECRET_TOKEN`:

```bash
python -c "import secrets; print(secrets.token_hex(32))"
```
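Telegram echoes this secret token back on every webhook delivery in the `X-Telegram-Bot-Api-Secret-Token` header, and the endpoint should reject any request where it doesn't match. A minimal sketch of that check (`verify_webhook_secret` is an illustrative helper, not necessarily the function used in this repo):

```python
import hmac
from typing import Optional


def verify_webhook_secret(header_value: Optional[str], expected: str) -> bool:
    """Compare the X-Telegram-Bot-Api-Secret-Token header against the
    configured TELEGRAM_SECRET_TOKEN using a constant-time comparison."""
    if header_value is None:
        return False
    return hmac.compare_digest(header_value, expected)
```

In a FastAPI webhook handler, a failed check would typically return HTTP 403 before the update body is processed at all.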
```bash
# Build and start services
docker-compose up --build

# Or run in detached mode
docker-compose up -d --build

# View logs
docker-compose logs -f
```

The bot should now be running and accessible via Telegram!
- Open Telegram and search for your bot
- Start a conversation with `/start`
- Follow the conversational prompts to pay bills
- The bot will guide you through the payment process using natural language
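Under the hood, the natural-language step comes down to POSTing the user's message to Ollama's `/api/generate` endpoint and parsing the model's reply. A rough sketch of what that request could look like (the prompt wording and helper names are illustrative, not the repo's actual `llm_service`):

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # LLM_API_URL in .env


def build_payload(user_message: str, model: str = "llama3.2:1b") -> dict:
    """Build a non-streaming Ollama generate request that asks the model
    to return the payment intent as JSON."""
    prompt = (
        "Extract the bill-payment intent from the message below as JSON "
        'with keys "biller", "amount", "customer_id". Message: ' + user_message
    )
    return {"model": model, "prompt": prompt, "stream": False}


def extract_intent(user_message: str) -> dict:
    """POST to Ollama and parse the model's JSON reply (network call)."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(user_message)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # With stream=False, Ollama returns the model text in "response"
    return json.loads(body["response"])
```

A small 1B model will not always emit valid JSON, so the real service needs a retry or fallback path around that final `json.loads`.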
If you prefer to run without Docker:
```bash
# Install dependencies
poetry install

# Activate virtual environment
poetry shell

# Run the application
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
```

Note: You'll need to run Redis separately:
```bash
# Install Redis
brew install redis  # macOS

# Start Redis
redis-server
```

To add a new dependency:

```bash
poetry add <package-name>
```

- Handlers: Telegram message and callback handlers
- Services: Business logic for payments, LLM, and sessions
- Schemas: Pydantic models for data validation
- Core: Configuration, dependencies, and utilities
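Sessions tie these layers together: each Telegram chat gets a JSON blob in Redis keyed by chat ID, with a TTL so abandoned conversations expire on their own. A sketch of how such a service might look (class name and key format are illustrative; the actual `session_service.py` may differ):

```python
import json


class SessionService:
    """Store per-chat conversation state as JSON in Redis. Accepts any
    client exposing get/setex, which makes it easy to test with a stub."""

    def __init__(self, redis_client, ttl_seconds: int = 900):
        self.redis = redis_client
        self.ttl = ttl_seconds

    def _key(self, chat_id: int) -> str:
        return f"session:{chat_id}"

    def save(self, chat_id: int, state: dict) -> None:
        # SETEX writes the value and the expiry in one call
        self.redis.setex(self._key(chat_id), self.ttl, json.dumps(state))

    def load(self, chat_id: int) -> dict:
        raw = self.redis.get(self._key(chat_id))
        return json.loads(raw) if raw else {}
```

Injecting the client (rather than constructing it inside the class) is what lets `deps.py`-style dependency injection swap in a fake for tests.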
| Variable | Description | Required |
|---|---|---|
| `ENV` | Environment (development/production) | Yes |
| `LOG_LEVEL` | Logging level (DEBUG/INFO/WARNING/ERROR) | Yes |
| `TELEGRAM_BOT_TOKEN` | Telegram bot token from BotFather | Yes |
| `TELEGRAM_SECRET_TOKEN` | Secret token for webhook validation | Yes |
| `REDIS_URL` | Redis connection URL | Yes |
| `WEBHOOK_URL` | Public HTTPS URL for webhook | Yes |
| `INTERSWITCH_CLIENT_ID` | Interswitch API client ID | Yes |
| `INTERSWITCH_SECRET` | Interswitch API secret | Yes |
| `INTERSWITCH_BASE_URL` | Interswitch API base URL | Yes |
| `LLM_API_URL` | Ollama API endpoint | Yes |
| `LLM_MODEL` | Ollama model name | Yes |
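Since every variable above is required, it helps to fail fast at startup when one is missing. A minimal sketch using only the standard library (the real `app/core/config.py` may well use pydantic-settings instead, so treat this as illustrative):

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class Settings:
    telegram_bot_token: str
    redis_url: str
    llm_model: str


def load_settings(env=os.environ) -> Settings:
    """Load required variables, raising immediately if any is missing."""
    def require(name: str) -> str:
        value = env.get(name)
        if not value:
            raise RuntimeError(f"Missing required environment variable: {name}")
        return value

    return Settings(
        telegram_bot_token=require("TELEGRAM_BOT_TOKEN"),
        redis_url=require("REDIS_URL"),
        llm_model=require("LLM_MODEL"),
    )
```

Crashing at import time with a named variable is far easier to debug than a webhook that silently returns 500s because a token was blank.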
```bash
# Check webhook status
curl https://api.telegram.org/bot<YOUR_BOT_TOKEN>/getWebhookInfo

# Delete webhook
curl https://api.telegram.org/bot<YOUR_BOT_TOKEN>/deleteWebhook
```

```bash
# Test Redis connection
docker exec -it bot_redis redis-cli ping
# Should return: PONG
```

```bash
# Verify Ollama is running
curl http://localhost:11434/api/tags

# Test model
ollama run llama3.2:1b "Hello"
```

```bash
# Rebuild containers
docker-compose down
docker-compose up --build

# View logs
docker-compose logs -f bot

# Clean up everything
docker-compose down -v
docker system prune -a
```

For production deployment:
- Replace ngrok with a proper domain and SSL certificate
- Set `ENV=production` in `.env`
- Use production Interswitch credentials
- Configure proper Redis persistence
- Set up monitoring and logging
- Implement rate limiting
- Add error tracking (e.g., Sentry)
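For the rate-limiting item, a simple per-chat token bucket is often enough as a starting point. A sketch (illustrative, not part of the repo; a production version would likely keep the counters in Redis so they survive restarts):

```python
import time
from collections import defaultdict


class TokenBucket:
    """Allow up to `capacity` messages per chat, refilling at `rate`
    tokens per second. The clock is injectable for testing."""

    def __init__(self, capacity: float = 5, rate: float = 0.5, clock=time.monotonic):
        self.capacity = capacity
        self.rate = rate
        self.clock = clock
        self.tokens = defaultdict(lambda: capacity)  # start every chat full
        self.updated = {}

    def allow(self, chat_id: int) -> bool:
        now = self.clock()
        last = self.updated.get(chat_id, now)
        # Refill proportionally to the time elapsed since the last check
        self.tokens[chat_id] = min(
            self.capacity, self.tokens[chat_id] + (now - last) * self.rate
        )
        self.updated[chat_id] = now
        if self.tokens[chat_id] >= 1:
            self.tokens[chat_id] -= 1
            return True
        return False
```

The webhook handler would call `allow(chat_id)` before doing any LLM or payment work and reply with a polite "slow down" message when it returns `False`.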
Contributions are welcome! This project was vibe-coded and could definitely use improvements.
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
- Additional payment methods
- Better error handling
- Comprehensive testing
- UI/UX improvements
- Documentation
- Security enhancements
- Performance optimizations
For issues and questions:
- Open an issue on GitHub
Disclaimer: This is a development project and should be thoroughly tested before use in production. The developers are not responsible for any financial transactions or data loss.