Paya (An AI Powered Telegram Bills Payment Bot)

A Telegram bot for bill payments powered by the Interswitch Bills Payment API, FastAPI, and a local LLM (Llama). This bot provides an intelligent, conversational interface for users to pay bills directly through Telegram.

⚠️ Note: This project was vibe-coded and may contain bugs. Contributions are welcome!

Screenshots

(Bot interface screenshots are available in the repository.)

Features

  • 💬 Conversational bill payment interface via Telegram
  • 🤖 Local LLM integration (Llama 3.2) for natural language processing
  • 💳 Interswitch Bills Payment API integration
  • 🔄 Redis-based session management
  • 🚀 FastAPI backend with webhook support
  • 🐳 Dockerized deployment

Tech Stack

  • Backend Framework: FastAPI
  • Bot Framework: python-telegram-bot
  • LLM: Llama 3.2 (1B model via Ollama)
  • Payment Gateway: Interswitch Bills Payment API
  • Session Storage: Redis
  • Package Management: Poetry
  • Containerization: Docker & Docker Compose
  • Development Tunneling: ngrok

Project Structure

telegram-payment-bot/
├── app/
│   ├── core/
│   │   ├── config.py          # Application configuration
│   │   ├── deps.py            # Dependency injection
│   │   ├── exceptions.py      # Custom exceptions
│   │   ├── logging.py         # Logging setup
│   │   └── states.py          # Bot conversation states
│   ├── interface/
│   │   ├── telegram/
│   │   │   ├── bot.py         # Telegram bot initialization
│   │   │   ├── handlers.py    # General message handlers
│   │   │   └── handlers_interswitch.py  # Payment-specific handlers
│   │   └── web/
│   │       ├── route.py       # Web routes
│   │       └── telegram.py    # Telegram webhook endpoint
│   ├── schemas/
│   │   ├── bill_payment_intent.py
│   │   ├── intent.py
│   │   ├── interswitch.py
│   │   ├── session.py
│   │   └── transaction.py
│   ├── services/
│   │   ├── bill_payment_service.py
│   │   ├── interswitch_service.py
│   │   ├── llm_service.py
│   │   ├── payment_service.py
│   │   └── session_service.py
│   └── main.py                # Application entry point
├── Dockerfile
├── docker-compose.yml
├── pyproject.toml
├── poetry.lock
└── .env.example

Prerequisites

  • Docker and Docker Compose
  • Ollama with Llama 3.2 model installed locally
  • Telegram Bot Token (from @BotFather)
  • Interswitch API credentials (from Interswitch API Marketplace)
  • ngrok account (for local development webhook)

Setup Instructions

1. Clone the Repository

git clone <your-repo-url>
cd telegram-payment-bot

2. Install Ollama and Llama Model

# Install Ollama (Linux)
curl -fsSL https://ollama.com/install.sh | sh
# On macOS: brew install ollama, or download from https://ollama.com/download

# Pull Llama 3.2 1B model
ollama pull llama3.2:1b

# Verify installation
ollama list

3. Get Interswitch API Credentials

  1. Visit the Interswitch API Marketplace
  2. Register/Login to your account
  3. Create a new application
  4. Copy your Client ID, Client Secret, and note the Base URL

4. Create Telegram Bot

  1. Message @BotFather on Telegram
  2. Send /newbot and follow the instructions
  3. Copy the bot token provided

5. Setup ngrok

# Install ngrok
brew install ngrok  # macOS
# or download from https://ngrok.com/download

# Authenticate (get auth token from ngrok.com)
ngrok config add-authtoken <your-auth-token>

# Start ngrok tunnel
ngrok http 8000

Copy the HTTPS forwarding URL (e.g., https://your-subdomain.ngrok-free.app)

6. Configure Environment Variables

cp .env.example .env

Edit .env file with your credentials:

# App Config
ENV=development
LOG_LEVEL=INFO

# Telegram
TELEGRAM_BOT_TOKEN=your_bot_token_here
TELEGRAM_SECRET_TOKEN=your_secret_token_here  # Generate a random secure string

# Redis
REDIS_URL=redis://redis:6379/0

# Webhook
WEBHOOK_URL=https://your-ngrok-url.ngrok-free.app/api/webhook

# Interswitch
INTERSWITCH_CLIENT_ID=your_client_id_here
INTERSWITCH_SECRET=your_client_secret_here
INTERSWITCH_BASE_URL=https://sandbox.interswitchng.com/

# LLM Config
LLM_API_URL=http://host.docker.internal:11434/api/generate
LLM_MODEL=llama3.2:1b

Important:

  • Replace TELEGRAM_BOT_TOKEN with your actual bot token
  • Replace WEBHOOK_URL with your ngrok HTTPS URL + /api/webhook
  • Replace Interswitch credentials with your actual credentials
  • Generate a secure random string for TELEGRAM_SECRET_TOKEN:
    python -c "import secrets; print(secrets.token_hex(32))"
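These variables are read at startup by app/core/config.py. As a rough, stdlib-only sketch of what such a settings loader can look like (the field names and defaults here are illustrative, not the project's actual code):

```python
# Hypothetical sketch of a settings loader; the real config.py may differ.
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class Settings:
    telegram_bot_token: str
    webhook_url: str
    redis_url: str = "redis://redis:6379/0"
    llm_model: str = "llama3.2:1b"


def load_settings() -> Settings:
    # Required variables raise KeyError early if missing; optional ones fall
    # back to the same defaults as .env.example.
    return Settings(
        telegram_bot_token=os.environ["TELEGRAM_BOT_TOKEN"],
        webhook_url=os.environ["WEBHOOK_URL"],
        redis_url=os.environ.get("REDIS_URL", "redis://redis:6379/0"),
        llm_model=os.environ.get("LLM_MODEL", "llama3.2:1b"),
    )
```

Failing fast on a missing required variable is usually preferable to discovering it mid-conversation with a user.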

7. Build and Run with Docker

# Build and start services
docker-compose up --build

# Or run in detached mode
docker-compose up -d --build

# View logs
docker-compose logs -f

The bot should now be running and accessible via Telegram!

Usage

  1. Open Telegram and search for your bot
  2. Start a conversation with /start
  3. Follow the conversational prompts to pay bills
  4. The bot will guide you through the payment process using natural language
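Under the hood, the bot asks the local LLM to turn a free-form message into a structured payment intent. A hedged sketch of the parsing side, assuming the model is prompted to reply with JSON (the BillPaymentIntent shape and field names are illustrative, not the project's actual schema):

```python
# Illustrative intent parsing; the real bill_payment_intent.py schema may differ.
import json
from dataclasses import dataclass
from typing import Optional


@dataclass
class BillPaymentIntent:
    biller: str
    amount: float
    customer_id: str


def parse_intent(llm_reply: str) -> Optional[BillPaymentIntent]:
    """Parse the model's JSON reply into an intent; return None if the
    reply is not valid JSON or is missing a required field."""
    try:
        data = json.loads(llm_reply)
        return BillPaymentIntent(
            biller=data["biller"],
            amount=float(data["amount"]),
            customer_id=str(data["customer_id"]),
        )
    except (json.JSONDecodeError, KeyError, TypeError, ValueError):
        return None
```

Treating an unparseable reply as "no intent" lets the bot fall back to asking a clarifying question instead of crashing on small-model output.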

Development

Running Without Docker

If you prefer to run without Docker:

# Install dependencies
poetry install

# Activate virtual environment
poetry shell

# Run the application
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000

Note: You'll need to run Redis separately:

# Install Redis
brew install redis  # macOS

# Start Redis
redis-server

Adding Dependencies

poetry add <package-name>

Code Structure

  • Handlers: Telegram message and callback handlers
  • Services: Business logic for payments, LLM, and sessions
  • Schemas: Pydantic models for data validation
  • Core: Configuration, dependencies, and utilities
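For illustration, a session service along these lines can serialize per-chat conversation state to JSON under a key derived from the chat id. The sketch below uses a plain dict where the real session_service.py would call Redis (this is a stand-in, not the project's code):

```python
# Sketch of session management; a dict stands in for Redis here. The real
# service would use redis-py's set/get, typically with a TTL on each key.
import json
from typing import Any, Dict


class SessionStore:
    def __init__(self) -> None:
        self._backend: Dict[str, str] = {}

    def save(self, chat_id: int, state: Dict[str, Any]) -> None:
        # Namespacing keys as "session:<chat_id>" keeps them easy to scan.
        self._backend[f"session:{chat_id}"] = json.dumps(state)

    def load(self, chat_id: int) -> Dict[str, Any]:
        raw = self._backend.get(f"session:{chat_id}")
        return json.loads(raw) if raw else {}
```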

Environment Variables

| Variable | Description | Required |
|----------|-------------|----------|
| ENV | Environment (development/production) | Yes |
| LOG_LEVEL | Logging level (DEBUG/INFO/WARNING/ERROR) | Yes |
| TELEGRAM_BOT_TOKEN | Telegram bot token from BotFather | Yes |
| TELEGRAM_SECRET_TOKEN | Secret token for webhook validation | Yes |
| REDIS_URL | Redis connection URL | Yes |
| WEBHOOK_URL | Public HTTPS URL for webhook | Yes |
| INTERSWITCH_CLIENT_ID | Interswitch API client ID | Yes |
| INTERSWITCH_SECRET | Interswitch API secret | Yes |
| INTERSWITCH_BASE_URL | Interswitch API base URL | Yes |
| LLM_API_URL | Ollama API endpoint | Yes |
| LLM_MODEL | Ollama model name | Yes |
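Telegram echoes TELEGRAM_SECRET_TOKEN back in the X-Telegram-Bot-Api-Secret-Token header on every webhook request, which is how the endpoint can reject forged calls. A minimal check, sketched with the stdlib (not the project's actual handler):

```python
# Illustrative webhook validation; the real endpoint in app/interface/web
# may structure this differently.
import hmac
from typing import Optional


def is_valid_webhook(header_token: Optional[str], secret_token: str) -> bool:
    """Return True only if the header matches the configured secret.
    compare_digest avoids leaking the match position via timing."""
    if header_token is None:
        return False
    return hmac.compare_digest(header_token, secret_token)
```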

Troubleshooting

Webhook Issues

# Check webhook status (quote the URL so the shell does not treat < > as redirection)
curl "https://api.telegram.org/bot<YOUR_BOT_TOKEN>/getWebhookInfo"

# Delete webhook
curl "https://api.telegram.org/bot<YOUR_BOT_TOKEN>/deleteWebhook"

Redis Connection Issues

# Test Redis connection
docker exec -it bot_redis redis-cli ping
# Should return: PONG

Ollama Connection Issues

# Verify Ollama is running
curl http://localhost:11434/api/tags

# Test model
ollama run llama3.2:1b "Hello"
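If the model responds locally but the bot still gets nothing back, it can help to confirm the request body the service posts to LLM_API_URL. A sketch of a payload builder for Ollama's /api/generate endpoint (the helper name is illustrative):

```python
# Hypothetical helper building an Ollama /api/generate request body.
import json


def build_generate_request(prompt: str, model: str = "llama3.2:1b") -> bytes:
    """Body for a POST to /api/generate. stream=False asks Ollama for a
    single JSON response instead of a stream of chunks."""
    return json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,
    }).encode("utf-8")
```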

Docker Issues

# Rebuild containers
docker-compose down
docker-compose up --build

# View logs
docker-compose logs -f bot

# Clean up everything
docker-compose down -v
docker system prune -a

Production Deployment

For production deployment:

  1. Replace ngrok with a proper domain and SSL certificate
  2. Set ENV=production in .env
  3. Use production Interswitch credentials
  4. Configure proper Redis persistence
  5. Set up monitoring and logging
  6. Implement rate limiting
  7. Add error tracking (e.g., Sentry)

Contributing

Contributions are welcome! This project was vibe-coded and could definitely use improvements.

How to Contribute

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Areas for Improvement

  • Additional payment methods
  • Better error handling
  • Comprehensive testing
  • UI/UX improvements
  • Documentation
  • Security enhancements
  • Performance optimizations

Support

For issues and questions:

  • Open an issue on GitHub

Disclaimer: This is a development project and should be thoroughly tested before use in production. The developers are not responsible for any financial transactions or data loss.
