
🚀 Lighthouse: Distributed Log Management System

A professional-grade, asynchronous log collection and processing system built with a modern microservices architecture.

📝 About the Project

Lighthouse is a distributed system designed to handle high-volume log data without blocking main application threads. It uses a Producer-Consumer pattern where log entries are ingested via an API and queued in a high-speed message broker (Redis) before being processed and persisted by an independent worker.

🏗 System Architecture

The project follows a decoupled architecture to ensure high availability:

  1. Producer (FastAPI): Receives logs and instantly pushes them to the queue.
  2. Message Broker (Redis): Acts as a resilient buffer, storing logs in a FIFO (First-In, First-Out) structure.
  3. Consumer (Worker): A standalone background process that retrieves logs from the queue and saves them to permanent storage.
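The flow above can be sketched with Python's standard library, using `queue.Queue` as a stand-in for the Redis list (a simplified illustration of the pattern, not the project's actual code):

```python
import json
import queue

log_queue: "queue.Queue[str]" = queue.Queue()  # stand-in for the Redis list


def produce(entry: dict) -> None:
    """Producer: serialize the log entry and push it onto the queue."""
    log_queue.put(json.dumps(entry))


def consume() -> dict:
    """Consumer: block until an entry is available, then deserialize it."""
    raw = log_queue.get()
    return json.loads(raw)


produce({"level": "INFO", "message": "service started"})
produce({"level": "ERROR", "message": "disk full"})

first = consume()   # FIFO: the INFO entry comes out first
second = consume()
print(first["level"], second["level"])  # INFO ERROR
```

Because the producer only serializes and enqueues, the API can return immediately; the slow persistence work happens on the consumer's side of the queue.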

🛠 Tech Stack

  • Python 3.11
  • FastAPI (High-performance web framework)
  • Redis (In-memory data structure store / Message Broker)
  • Docker & Docker Compose (Containerization and orchestration)
  • Pydantic (Data validation and settings management)

🧠 What I Learned During This Project

This project was a significant milestone in my development as a Computer Engineer. Key takeaways include:

  • Asynchronous Processing: Implementing BackgroundTasks to handle operations outside the request-response cycle.
  • Distributed Systems: Learning how to decouple services using a message broker like Redis.
  • Containerization: Managing multi-container environments using Docker Compose.
  • Logic & Flow: Handling data serialization/deserialization with JSON and Python dictionaries.
  • Debugging: Solving real-world connection and import issues in a containerized environment.
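For instance, every log entry crosses the broker as a JSON string and is rebuilt as a Python dictionary on the consumer side; the round trip looks like this (field names are illustrative, not taken from the repository):

```python
import json

entry = {"service": "auth", "level": "WARNING", "message": "slow response"}

# Producer side: dict -> JSON string before pushing to the broker
payload = json.dumps(entry)

# Consumer side: JSON string -> dict after popping from the broker
restored = json.loads(payload)

print(restored == entry)  # True
```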

🚀 Installation & Usage

Running with Docker

To spin up the entire infrastructure:

docker-compose up --build
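A compose file for this three-service topology might look like the following sketch (the service names, image tag, and `uvicorn` entry point are assumptions, not the repository's actual file):

```yaml
services:
  api:
    build: .
    command: uvicorn app.main:app --host 0.0.0.0 --port 8000
    ports:
      - "8000:8000"
    depends_on:
      - redis
  redis:
    image: redis:7-alpine
  worker:
    build: .
    command: python app/worker.py
    depends_on:
      - redis
```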

1. API Request (Swagger UI)

The entry point where logs are received and validated.
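A submitted log entry might look like the following JSON body (the field names are assumptions for illustration; the actual schema is defined by the project's Pydantic model):

```json
{
  "service": "auth",
  "level": "ERROR",
  "message": "disk full"
}
```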


2. Message Queue (Redis Storage)

Logs successfully queued in Redis, verified via redis-cli.


3. Worker Processing (Consumer)

Standalone worker service pulling logs from Redis and processing them.

Once the stack is up, the interactive API docs are available at http://localhost:8000/docs

Running the Worker Locally

python app/worker.py
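A worker of this shape typically blocks on the Redis list, decodes each entry, and appends it to permanent storage. A minimal sketch, assuming a queue named `logs` and a JSON-lines output file (the queue name, hostname, and file path are assumptions, not the repository's actual values):

```python
import json


def handle_raw_entry(raw: str) -> dict:
    """Decode one JSON log entry popped from the queue."""
    entry = json.loads(raw)
    entry.setdefault("level", "INFO")  # hypothetical default, not from the repo
    return entry


def run_worker(queue_name: str = "logs") -> None:
    """Blocking consumer loop; requires a running Redis instance."""
    import redis  # imported here so handle_raw_entry stays usable without redis-py

    client = redis.Redis(host="redis", port=6379, decode_responses=True)
    while True:
        _, raw = client.brpop(queue_name)  # blocks until an entry arrives (FIFO)
        entry = handle_raw_entry(raw)
        with open("logs.jsonl", "a", encoding="utf-8") as fh:
            fh.write(json.dumps(entry) + "\n")


# run_worker() would be invoked from the command line, e.g. `python app/worker.py`
decoded = handle_raw_entry('{"message": "queued"}')
print(decoded)
```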

Developed by Emine Ugurlu, Computer Engineer
