
🧾 M-PESA Statement Analyzer


🚧 Active development in progress. Vercel deployment coming soon.

A secure, production-grade financial analysis platform built with Go. Transform your M-PESA statements into actionable insights with AI-powered categorization and interactive dashboards.

• Report Bug • Request Feature




🎯 About The Project

M-PESA Statement Analyzer is a comprehensive financial management tool that helps users understand their spending patterns by parsing M-PESA PDF statements, automatically categorizing transactions, and presenting beautiful visualizations.

The Problem

  • Manual analysis of M-PESA statements is time-consuming
  • Difficult to track spending across categories
  • No built-in tools for financial trend analysis
  • Hard to identify spending patterns and anomalies

The Solution

This application provides:

  • Automated PDF parsing - Extract all transactions from your statements instantly
  • AI categorization - Intelligently classify transactions (Food, Transport, Bills, etc.)
  • Visual analytics - Interactive charts and graphs to understand your finances
  • Secure storage - Your financial data is encrypted and protected
  • Multi-user support - Each user has their own private workspace

✨ Features

๐Ÿ” Security & Authentication

  • JWT-based authentication system
  • Bcrypt password hashing
  • Secure file validation and sanitization
  • CORS and security headers protection
  • Input validation on all endpoints
  • Encrypted data storage (coming in Phase 4)

📊 Financial Analysis

  • PDF statement parsing and text extraction
  • AI-powered transaction categorization using OpenAI
  • Summary reports (total inflows, outflows, balance)
  • Category-wise spending breakdown
  • Time-series trend analysis (coming in Phase 3)
  • Custom date range filtering (coming in Phase 3)

๐Ÿ—๏ธ Architecture & Performance

  • Clean architecture with repository pattern
  • PostgreSQL with optimized indexes
  • Database migration system
  • Redis caching (coming in Phase 4)
  • Async job processing (coming in Phase 3)
  • Docker containerization
  • Horizontal scalability ready

📱 User Experience

  • RESTful API design
  • Interactive web dashboard
  • Real-time job status tracking (coming in Phase 3)
  • Export to CSV/Excel (coming in Phase 5)
  • Mobile-responsive interface

๐Ÿ› ๏ธ Tech Stack

Backend

  • Go 1.21+ - Main programming language
  • PostgreSQL 15 - Primary database
  • Redis 7 - Caching layer (Phase 4)
  • JWT - Authentication tokens
  • Bcrypt - Password encryption

Frontend

  • HTML5/CSS3 - Dashboard UI
  • Chart.js - Data visualization
  • Vanilla JavaScript - Client-side logic

DevOps & Tools

  • Docker & Docker Compose - Containerization
  • golang-migrate - Database migrations
  • Air - Hot reload in development
  • GitHub Actions - CI/CD (Phase 5)

External Tools & APIs

  • OpenAI GPT - Transaction categorization
  • pdftotext (Poppler) - PDF text extraction

🚀 Getting Started

Prerequisites

Make sure you have the following installed:

  • Go 1.21 or later
  • Docker & Docker Compose
  • pdftotext (from poppler-utils)
  • An OpenAI API key

Installation

  1. Clone the repository

    git clone https://github.com/Wendyshiro/M-PESA-Statements-Analyzer.git
    cd M-PESA-Statements-Analyzer
  2. Install Go dependencies

    go mod download
  3. Set up environment variables

    cp .env.example .env

    Edit .env with your configuration:

    # Server
    PORT=8080
    ENV=development
    
    # Database
    DATABASE_URL=postgresql://mpesa_user:mpesa_pass@localhost:5432/mpesa_analyzer?sslmode=disable
    
    # Security (generate secure keys!)
    JWT_SECRET=your-secret-key-minimum-32-characters
    ENCRYPTION_KEY=exactly-32-characters-required!!
    
    # OpenAI
    OPENAI_API_KEY=sk-your-openai-api-key
  4. Start infrastructure with Docker

    docker-compose up -d

    This starts:

    • PostgreSQL on port 5432
    • Redis on port 6379
    • pgAdmin on port 5050 (optional)
  5. Run database migrations

    # Install migrate tool
    go install -tags 'postgres' github.com/golang-migrate/migrate/v4/cmd/migrate@latest
    
    # Apply migrations
    make migrate-up
    # OR
    migrate -path migrations -database "${DATABASE_URL}" up
  6. Start the application

    # Using Make
    make run
    
    # OR directly with Go
    go run cmd/api/main.go
  7. Access the application

    Open http://localhost:8080 in your browser to load the dashboard.

Quick Test

# Register a new user
curl -X POST http://localhost:8080/register \
  -H "Content-Type: application/json" \
  -d '{"email":"test@example.com","password":"password123"}'

# You'll get a token in the response - copy it!

# Upload a statement (replace YOUR_TOKEN)
curl -X POST http://localhost:8080/upload \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -F "file=@path/to/statement.pdf"

๐Ÿ“ Project Structure

M-PESA-Statements-Analyzer/
│
├── cmd/
│   └── api/
│       └── main.go              # Application entry point
│
├── handlers/                    # HTTP request handlers
│   ├── auth.go                  # Registration & login
│   ├── upload.go                # File upload handling
│   └── health.go                # Health checks
│
├── models/                      # Data models
│   ├── user.go                  # User entity
│   ├── job.go                   # Processing job entity
│   └── transaction.go           # Transaction entity
│
├── repository/                  # Database layer (data access)
│   ├── user.go                  # User DB operations
│   ├── job.go                   # Job DB operations
│   └── transaction.go           # Transaction DB operations
│
├── middleware/                  # HTTP middleware
│   ├── auth.go                  # JWT authentication
│   ├── security.go              # Security headers
│   └── validation.go            # Input validation
│
├── auth/                        # Authentication logic
│   └── jwt.go                   # Token management
│
├── database/                    # Database connection
│   └── database.go              # Pool & health checks
│
├── config/                      # Configuration
│   └── config.go                # Env variable loader
│
├── utils/                       # Utility functions
│   ├── pdf_parser.go            # PDF extraction
│   └── categorizer.go           # AI categorization
│
├── migrations/                  # Database migrations
│   ├── 000001_create_users_table.up.sql
│   ├── 000001_create_users_table.down.sql
│   ├── 000002_create_jobs_table.up.sql
│   └── ...
│
├── dashboard/                   # Frontend
│   └── index.html
│
├── .env.example                 # Environment template
├── .gitignore
├── docker-compose.yml           # Docker services
├── Dockerfile                   # Application container
├── Makefile                     # Development commands
├── go.mod
├── go.sum
└── README.md

Architecture Diagram

┌────────────────────────────────────────────┐
│                Client Layer                │
│        (Browser / Mobile / Postman)        │
└─────────────────────┬──────────────────────┘
                      │ HTTP/HTTPS
                      ↓
┌────────────────────────────────────────────┐
│             API Gateway Layer              │
│       (Middleware: Auth, CORS, etc.)       │
└─────────────────────┬──────────────────────┘
                      │
                      ↓
┌────────────────────────────────────────────┐
│               Handlers Layer               │
│      (auth.go, upload.go, health.go)       │
└─────────────────────┬──────────────────────┘
                      │
                      ↓
┌────────────────────────────────────────────┐
│               Service Layer                │
│  (Business logic: PDF parsing, AI, etc.)   │
└─────────────────────┬──────────────────────┘
                      │
                      ↓
┌────────────────────────────────────────────┐
│              Repository Layer              │
│           (Database operations)            │
└─────────────────────┬──────────────────────┘
                      │
                      ↓
┌────────────────────────────────────────────┐
│               Database Layer               │
│          PostgreSQL + Redis cache          │
└────────────────────────────────────────────┘

📚 API Documentation

Base URL

http://localhost:8080

Authentication

Most endpoints require authentication. Include the JWT token in the Authorization header:

Authorization: Bearer <your-token>
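Extracting the token from that header is a small but easy-to-get-wrong step. A minimal sketch of the parsing the auth middleware has to do (the real middleware in `middleware/auth.go` would then verify the JWT as well):

```go
package main

import (
	"fmt"
	"strings"
)

// bearerToken extracts the JWT from an Authorization header value,
// returning false when the header is missing, malformed, or empty.
func bearerToken(header string) (string, bool) {
	const prefix = "Bearer "
	if !strings.HasPrefix(header, prefix) {
		return "", false
	}
	token := strings.TrimSpace(strings.TrimPrefix(header, prefix))
	return token, token != ""
}

func main() {
	token, ok := bearerToken("Bearer eyJhbGciOiJIUzI1NiJ9.e30.sig")
	fmt.Println(ok, token)
}
```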

Endpoints

🟢 Public Endpoints

POST /register - Register a new user

Request:

curl -X POST http://localhost:8080/register \
  -H "Content-Type: application/json" \
  -d '{
    "email": "user@example.com",
    "password": "securepassword123"
  }'

Response (201 Created):

{
  "token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...",
  "user": {
    "id": "550e8400-e29b-41d4-a716-446655440000",
    "email": "user@example.com",
    "created_at": "2026-02-16T10:30:00Z",
    "updated_at": "2026-02-16T10:30:00Z"
  }
}

Validation Rules:

  • Email must be valid format
  • Password must be at least 8 characters
  • Email must be unique

POST /login - Login existing user

Request:

curl -X POST http://localhost:8080/login \
  -H "Content-Type: application/json" \
  -d '{
    "email": "user@example.com",
    "password": "securepassword123"
  }'

Response (200 OK):

{
  "token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...",
  "user": {
    "id": "550e8400-e29b-41d4-a716-446655440000",
    "email": "user@example.com",
    "created_at": "2026-02-16T10:30:00Z",
    "updated_at": "2026-02-16T10:30:00Z"
  }
}

Error Responses:

  • 401: Invalid credentials
  • 400: Missing email or password

GET /health - Health check

Request:

curl http://localhost:8080/health

Response (200 OK):

{
  "status": "healthy",
  "services": {
    "database": "healthy"
  }
}

🔒 Protected Endpoints

POST /upload - Upload M-PESA statement

Request:

curl -X POST http://localhost:8080/upload \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -F "file=@statement.pdf"

Response (202 Accepted):

{
  "job_id": "7c9e6679-7425-40de-944b-e07fc1f90ae7",
  "message": "File uploaded successfully and queued for processing",
  "filename": "mpesa_statement.pdf"
}

File Requirements:

  • Must be a PDF file
  • Maximum size: 10MB
  • Must be a valid M-PESA statement

Error Responses:

  • 401: Missing or invalid token
  • 400: Invalid file type or size
  • 413: File too large

Error Response Format

All errors follow this format:

{
  "error": "Human-readable error message",
  "code": "ERROR_CODE"
}

Common Error Codes:

  • UNAUTHORIZED - Missing or invalid authentication
  • INVALID_INPUT - Validation failed
  • USER_EXISTS - Email already registered
  • INVALID_FILE - File validation failed
  • INTERNAL_ERROR - Server error

๐Ÿ—บ๏ธ Development Roadmap

โœ… Phase 1: Basic Security (Completed)

Status: โœ… Complete (Week 1-3)

  • File validation and sanitization
  • JWT-based authentication
  • Security headers (CORS, CSP, XSS protection)
  • Password hashing with bcrypt
  • Input validation middleware
  • Error handling

🔄 Phase 2: Database Integration (In Progress)

Status: 🔄 In Progress (Week 4)

  • PostgreSQL setup with Docker
  • Database migration system
  • User repository implementation
  • Job repository implementation
  • Transaction repository implementation
  • Database indexes and optimization
  • Connection pooling

📋 Phase 3: Async Processing (Upcoming)

Status: 📅 Planned (Week 5-6)

  • Redis integration
  • Message queue setup (RabbitMQ or Redis Queue)
  • Background worker for PDF processing
  • Job status tracking and updates
  • Retry logic for failed jobs
  • Job result storage
  • Webhook notifications

⚡ Phase 4: Caching & Performance (Upcoming)

Status: 📅 Planned (Week 7-8)

  • Redis caching layer
  • Category result caching
  • Database query optimization
  • API response caching
  • Rate limiting implementation
  • Query result pagination
  • Database connection pooling tuning

🧪 Phase 5: Testing & CI/CD (Upcoming)

Status: 📅 Planned (Week 9-10)

  • Unit tests for all layers
  • Integration tests
  • End-to-end tests
  • GitHub Actions CI/CD pipeline
  • Code coverage reporting (target: 80%+)
  • Security scanning (gosec)
  • Dependency vulnerability scanning

🚀 Phase 6: Production Ready (Upcoming)

Status: 📅 Planned (Week 11-12)

  • Prometheus metrics
  • Structured logging
  • Distributed tracing (OpenTelemetry)
  • Kubernetes deployment manifests
  • Horizontal Pod Autoscaling
  • Database backup automation
  • Disaster recovery procedures
  • Documentation site

🔮 Future Enhancements

  • Multi-language support (i18n)
  • PDF statement templates for other providers
  • Budget planning and alerts
  • Spending predictions with ML
  • Mobile app (Flutter)
  • Batch processing for multiple statements
  • Export to CSV, Excel, PDF
  • Email reports
  • Social features (anonymous spending comparisons)

🧪 Testing

Run Tests

# Run all tests
make test

# Run with coverage
go test -cover ./...

# Run specific package
go test ./handlers/...

# Verbose output
go test -v ./...

# Generate coverage report
go test -coverprofile=coverage.out ./...
go tool cover -html=coverage.out

Manual Testing with Postman

  1. Download Postman
  2. Import the collection from postman_collection.json (if available)
  3. Set up environment variables:
    • baseUrl: http://localhost:8080
    • token: (auto-filled after login)

Testing Workflow:

  1. Register → Get token
  2. Login → Verify token
  3. Upload → Check file validation
  4. Health → Verify services

See Postman Testing Guide for detailed instructions.


๐Ÿ—„๏ธ Database

Schema Overview

The application uses PostgreSQL with the following main tables:

users - User accounts

  • id (UUID) - Primary key
  • email (VARCHAR) - Unique email
  • password_hash (VARCHAR) - Bcrypt hash
  • Timestamps

jobs - Processing jobs

  • id (UUID) - Primary key
  • user_id (UUID) - Foreign key to users
  • file_path (VARCHAR) - File location
  • status (ENUM) - queued, processing, completed, failed
  • Timestamps

transactions - Parsed M-PESA transactions

  • id (UUID) - Primary key
  • job_id (UUID) - Foreign key to jobs
  • receipt_no (VARCHAR) - M-PESA receipt
  • amount (DECIMAL) - Transaction amount
  • category (VARCHAR) - AI-assigned category
  • Timestamps

Migrations

# Create new migration
migrate create -ext sql -dir migrations -seq migration_name

# Apply migrations
make migrate-up

# Rollback
make migrate-down

# Check version
make migrate-version

๐Ÿณ Docker

Development

# Start all services
docker-compose up -d

# View logs
docker-compose logs -f

# Stop services
docker-compose down

# Rebuild
docker-compose up -d --build

Production

# Build production image
docker build -t mpesa-analyzer:v1.0.0 .

# Run container
docker run -d \
  -p 8080:8080 \
  --env-file .env.production \
  mpesa-analyzer:v1.0.0

# With docker-compose
docker-compose -f docker-compose.prod.yml up -d

๐Ÿค Contributing

Contributions are what make the open-source community amazing! Any contributions you make are greatly appreciated.

How to Contribute

  1. Fork the Project

    # Click the "Fork" button on GitHub
  2. Clone your Fork

    git clone https://github.com/YOUR_USERNAME/M-PESA-Statements-Analyzer.git
    cd M-PESA-Statements-Analyzer
  3. Create a Feature Branch

    git checkout -b feature/AmazingFeature
  4. Make your Changes

    • Write clean, documented code
    • Follow Go best practices
    • Add tests for new features
    • Update documentation
  5. Commit your Changes

    git add .
    git commit -m "Add some AmazingFeature"
  6. Push to your Fork

    git push origin feature/AmazingFeature
  7. Open a Pull Request

    • Go to the original repository
    • Click "New Pull Request"
    • Select your feature branch
    • Describe your changes

Contribution Guidelines

  • Code Style: Follow Go conventions (run gofmt)
  • Testing: Add tests for new features
  • Documentation: Update README and code comments
  • Commits: Use clear, descriptive commit messages
  • Issues: Check existing issues before creating new ones

Good First Issues

Look for issues tagged with good first issue or help wanted:

  • Add input validation for edge cases
  • Improve error messages
  • Write documentation
  • Add unit tests
  • Fix typos

📄 License

Distributed under the MIT License. See LICENSE for more information.

MIT License

Copyright (c) 2026 Wendyshiro

📞 Contact

Wendyshiro


๐Ÿ™ Acknowledgments

This project was built with the help of amazing open-source tools and resources:

Inspiration

  • M-PESA for revolutionizing mobile money in Kenya and Africa
  • The Go community for excellent documentation and support
  • Open-source contributors worldwide



โญ Star this repository if you found it helpful!

Made with โค๏ธ by Wendyshiro
