
πŸš€ The Containerized Intelligence Pipeline

πŸ“‹ Scenario

As a DevOps engineer in a modern tech company, you're tasked with deploying a scalable microservices architecture that demonstrates containerization, service orchestration, and database persistence. This project showcases a complete intelligence pipeline with Node.js Express, Go Gin, and PostgreSQL working together in a containerized environment.

🎯 Core Learning Objectives

  • Containerization fundamentals with Docker
  • Microservices architecture design
  • Service orchestration with Docker Compose
  • Database persistence and volume management
  • Health checks and service dependencies
  • Environment-based configuration management
  • Cross-service communication patterns

πŸ› οΈ Tech Stack & Rationale

| Technology | Purpose | Rationale |
| --- | --- | --- |
| Docker | Containerization | Industry standard for application containerization and deployment |
| Docker Compose | Orchestration | Simplifies multi-container application management |
| Node.js | API Gateway | Fast, event-driven runtime well suited to API orchestration |
| Go | Computation Service | High-performance language ideal for CPU-intensive calculations |
| PostgreSQL | Database | Robust, ACID-compliant database with excellent Docker support |
| Gin | Go Web Framework | Lightweight, fast HTTP framework for Go services |
| Express | Node.js Framework | Minimal, flexible web application framework |
| GitHub Actions | CI/CD Pipeline | Tight GitHub integration; the simplest way to add a CI/CD pipeline here |

πŸ“‹ Implementation Steps

Step 1: Environment Configuration Setup

Set up environment variables for secure configuration management

# Copy environment template
cp env.example .env

# Configure your environment variables
# - Database credentials
# - Service ports
# - Security settings
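As a sketch, a minimal `.env` might look like the following (variable names follow the environment-variable table later in this README; the password is a placeholder you should replace):

```shell
# .env (example values only; keep this file out of version control)
NODE_PORT=3000
GO_PORT=8086
POSTGRES_USER=postgres
POSTGRES_PASSWORD=change-me   # placeholder; use a strong secret
POSTGRES_DB=logs
POSTGRES_PORT=5432
```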

Step 2: Docker Compose File

Configure docker-compose.yml to define all the services, and configure PostgreSQL with a persistent volume and the initialization scripts from the database directory

-- Automatic table creation on container startup
CREATE TABLE IF NOT EXISTS process_logs (
    id SERIAL PRIMARY KEY,
    time TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT NOW(),
    processing_time INTERVAL NOT NULL
);
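The compose file itself is not reproduced on this page; as a sketch, a minimal docker-compose.yml consistent with the services and ports described in this README might look like the following (build paths and healthcheck commands are assumptions; adjust to the actual repository layout):

```yaml
version: "3.8"
services:
  postgres-db:
    image: postgres:16
    env_file: .env
    ports:
      - "${POSTGRES_PORT}:5432"
    volumes:
      - /mnt/xvdb/postgres-data:/var/lib/postgresql/data
      - ./database:/docker-entrypoint-initdb.d   # runs init.sql on first start
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER}"]
      interval: 30s
      timeout: 10s
      retries: 3
    restart: unless-stopped

  go-server:
    build: ./go-service          # path is an assumption
    ports:
      - "${GO_PORT}:8086"
    healthcheck:
      test: ["CMD-SHELL", "wget -qO- http://localhost:8086/health || exit 1"]
      interval: 30s
      timeout: 10s
      retries: 3
    restart: unless-stopped

  nodejs-server:
    build: ./node-service        # path is an assumption
    ports:
      - "${NODE_PORT}:3000"
    depends_on:
      postgres-db:
        condition: service_healthy
      go-server:
        condition: service_healthy
    restart: unless-stopped
```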

Step 3: Spin Up an EC2 Instance (a large instance type is recommended for heavy computation)

Spin up an EC2 instance on AWS and install the following as the root user:

  1. Docker
  2. Git
  3. docker-compose
# Update system
yum update -y        # For Amazon Linux
# OR
apt update && apt upgrade -y  # For Ubuntu

# Install Docker
yum install docker -y
# OR
apt install docker.io -y

# Start Docker
systemctl start docker
systemctl enable docker

# Install docker-compose
curl -L "https://github.com/docker/compose/releases/download/v2.24.0/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
chmod +x /usr/local/bin/docker-compose

# Install git
yum install git -y

# Confirm installation
docker --version
docker-compose --version

Step 4: Attach, Format, and Mount an EBS Volume for the Database on EC2

Format the attached EBS volume and mount it on a directory for database storage. In the example below, the volume xvdb is mounted at /mnt/xvdb.

# Commands

# List attached block devices (the new volume appears here even before it is mounted)
lsblk

# Format the attached volume (WARNING: this erases any existing data on it)
sudo mkfs.ext4 /dev/xvdb

# Mount on a directory
sudo mkdir -p /mnt/xvdb
sudo mount /dev/xvdb /mnt/xvdb
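To keep the volume mounted across reboots, you can also add an /etc/fstab entry (a sketch assuming the ext4 filesystem created above; looking up the UUID with `blkid` is more robust than using the device name):

```
# /etc/fstab entry for the database volume
/dev/xvdb  /mnt/xvdb  ext4  defaults,nofail  0  2
```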

Step 5: Create a deploy User and Set Up Permissions

Create a deploy user and set up its permissions; it will be responsible for deployments (this is good practice, though you can also deploy as the root user).

# Add a user
sudo adduser deploy

# Add the user to the docker and wheel groups
# (wheel grants sudo on Amazon Linux; on Ubuntu use the sudo group instead)
sudo usermod -aG docker deploy
sudo usermod -aG wheel deploy

# Log in as deploy (a fresh login shell picks up the new group memberships)
su - deploy

Set up an SSH key:

# Generate an SSH key pair for the deploy user
ssh-keygen
cd ~/.ssh

Append the public key to ~/.ssh/authorized_keys, and remove any restriction options (such as command= or no-pty prefixes) from the key entry so CI can open a normal SSH session.

Give the PostgreSQL service user write access to the mounted directory (for example /mnt/xvdb/postgres-data) so the container can read and write its data there; you can find the container's user ID once it has been created by docker-compose.
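A sketch of that permission change, assuming the official postgres image runs as UID/GID 999 (verify with `docker-compose exec postgres-db id postgres` once the container exists). The demo below uses a scratch directory so it can run anywhere; on the instance, point DATA_DIR at /mnt/xvdb/postgres-data and run the chown with sudo:

```shell
# DATA_DIR is a stand-in; on EC2 use /mnt/xvdb/postgres-data
DATA_DIR="${DATA_DIR:-/tmp/postgres-data-demo}"
mkdir -p "$DATA_DIR"

# On the EC2 host (as root or via sudo):
#   chown -R 999:999 "$DATA_DIR"   # 999 = postgres UID/GID in the official image (assumption)

# Restrict access to the owner, as PostgreSQL expects for its data directory
chmod 700 "$DATA_DIR"
ls -ld "$DATA_DIR"
```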

Step 6: Clone git repository in deploy user and setup Env

# Log in as the deploy user
su - deploy

# Go to ~ directory and clone repository
cd ~ 
git clone https://github.com/HarshSharma0801/The-Containerized-Intelligence-Pipeline.git

# cd into the repository and set up the env file

cd The-Containerized-Intelligence-Pipeline
cp env.example .env
vim .env

Step 7: Start The Containers and verify

# start docker compose 
docker-compose up -d --build 

# Verify using 
docker ps

Step 8: Set Up the CI/CD Pipeline

  1. Create .github/workflows/deploy.yml

  2. Define GitHub Secrets:

  • EC2_SSH_KEY: the deploy user's private SSH key, found in the deploy user's .ssh folder (needed to SSH into EC2 as deploy without a password)
  • EC2_HOST: the EC2 host, found in the EC2 dashboard
  • EC2_USER: deploy
  • ENV_FILE: the environment file contents; see env.example

  3. Push to the main branch and watch the run in the Actions tab.
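As a sketch, a minimal deploy.yml consistent with those secrets might look like the following (the SSH action, its version, and the remote path are assumptions; adjust to your setup):

```yaml
name: Deploy
on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Deploy over SSH
        uses: appleboy/ssh-action@v1.0.3   # third-party action; pin a version you trust
        with:
          host: ${{ secrets.EC2_HOST }}
          username: ${{ secrets.EC2_USER }}
          key: ${{ secrets.EC2_SSH_KEY }}
          script: |
            cd ~/The-Containerized-Intelligence-Pipeline
            git pull origin main
            echo "${{ secrets.ENV_FILE }}" > .env
            docker-compose up -d --build
```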

πŸ—οΈ Architecture Overview

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚   Client/User   β”‚    β”‚  Node.js API    β”‚    β”‚  Go Compute     β”‚
β”‚                 │────│  Gateway        │────│  Service        β”‚
β”‚  Port: ANY      β”‚    β”‚  Port: 3000     β”‚    β”‚  Port: 8086     β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
                              β”‚
                              β–Ό
                       β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
                       β”‚  PostgreSQL     β”‚
                       β”‚  Database       β”‚
                       β”‚  Port: 5432     β”‚
                       β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

πŸ“Š API Endpoints

Health Checks

# Node.js service health
GET http://localhost:3000/health

# Go service health
GET http://localhost:8086/health

Main Pipeline Endpoint

# Trigger computation pipeline
GET http://localhost:3000/calculate

Response:

{
  "result": {
    "time": 0.05234,
    "operation": "prime_calculation",
    "processedAt": "2024-01-15T10:30:00Z"
  },
  "processingTime": 150,
  "timestamp": "2024-01-15T10:30:00.000Z"
}

πŸ—ƒοΈ Database Schema

CREATE TABLE process_logs (
    id SERIAL PRIMARY KEY,
    time TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT NOW(),
    processing_time INTERVAL NOT NULL
);
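For example, to inspect recent pipeline runs you might query the table like this (a sketch against the schema above):

```sql
-- Ten most recent pipeline runs with their processing durations
SELECT id, time, processing_time
FROM process_logs
ORDER BY time DESC
LIMIT 10;
```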

πŸ”§ Service Configuration

Environment Variables

| Variable | Default | Description |
| --- | --- | --- |
| NODE_PORT | 3000 | Node.js API Gateway port |
| GO_PORT | 8086 | Go computation service port |
| POSTGRES_USER | postgres | Database username |
| POSTGRES_PASSWORD | password | Database password |
| POSTGRES_DB | logs | Database name |
| POSTGRES_PORT | 5432 | Database port |

Docker Services

πŸ’Ύ Data Persistence

PostgreSQL data is persisted using Docker volumes:

  • Volume Mount: /mnt/xvdb/postgres-data:/var/lib/postgresql/data
  • Init Scripts: ./database/init.sql automatically executed on first run

πŸ”„ Service Dependencies

nodejs-server:
  depends_on:
    postgres-db:
      condition: service_healthy
    go-server:
      condition: service_healthy

πŸ₯ Health Monitoring

All services include health checks:

  • Interval: 30 seconds
  • Timeout: 10 seconds
  • Retries: 3 attempts
  • Restart Policy: unless-stopped
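In compose form, those settings would look roughly like this per service (the exact healthcheck command is an assumption; use whatever HTTP client exists in the image):

```yaml
nodejs-server:
  restart: unless-stopped
  healthcheck:
    test: ["CMD-SHELL", "wget -qO- http://localhost:3000/health || exit 1"]
    interval: 30s
    timeout: 10s
    retries: 3
```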

πŸ›‘ Managing the Deployment

Start Services

docker-compose up --build

Stop Services

docker-compose down

Remove Everything (including volumes)

docker-compose down -v

View Logs

# All services
docker-compose logs

# Specific service
docker-compose logs nodejs-server
docker-compose logs go-server
docker-compose logs postgres-db

Scale Services

# Scale the Go service for higher computation load
# (the go-server service must not pin a single host port, or the replicas will conflict)
docker-compose up --scale go-server=3

πŸ” Troubleshooting

Common Issues

  1. Port conflicts: Ensure ports 3000, 8086, and 5432 are available
  2. Environment variables: Verify .env file exists and is properly configured
  3. Docker permissions: Ensure Docker daemon is running and accessible
  4. Volume permissions: Check filesystem permissions for PostgreSQL data directory

Debug Commands

# Check container status
docker-compose ps

# Inspect container logs
docker-compose logs -f [service-name]

# Execute commands in running containers
docker-compose exec nodejs-server sh
docker-compose exec go-server sh
docker-compose exec postgres-db psql -U postgres -d logs

This containerized intelligence pipeline demonstrates modern DevOps practices with microservices, containerization, and automated orchestration, making it a solid starting point for learning deployment fundamentals and scaling strategies.

About

πŸš€ A complete DevOps pipeline demonstrating production deployment on AWS with containerized microservices. Features Docker Compose orchestration of Node.js, Go, and PostgreSQL services deployed on EC2 with EBS volume management. Includes automated CI/CD via GitHub Actions, SSH-based deployment automation, and production-ready infrastructure setup.
