🏆 Submission for the IIT Madras x Appian AI Application Challenge 2026.
APOLLO is a cutting-edge application designed to revolutionize operational management by moving from reactive problem-solving to proactive prediction and optimization. Leveraging a Hybrid Neuro-Symbolic Architecture, APOLLO provides a "time machine" for business processes, allowing managers to foresee SLA breaches, simulate "what-if" scenarios, and optimize resource allocation before issues even arise.
APOLLO integrates advanced AI and simulation techniques to deliver unparalleled operational intelligence:
- Digital Twin Engine (SimPy): A living, breathing replica of your Appian workflows, accurately modeling queues, resources, and process logic.
- Hybrid Forecasting Engine (Prophet + TFT): An intelligent oracle that predicts future case volumes with high accuracy, combining the power of statistical and deep learning models.
- Monte Carlo Risk Engine: Runs thousands of parallel simulations to quantify risk, providing probabilistic forecasts of SLA breaches and bottlenecks.
- What-If Sandbox & Optimizer (PuLP): An interactive control panel allowing managers to test resource allocation changes and receive data-driven recommendations for optimal strategies.
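To give a feel for the Monte Carlo approach, here is a minimal, stdlib-only sketch of estimating an SLA breach probability by sampling many simulated cases. The lognormal parameters and the 8-hour SLA are illustrative assumptions, not APOLLO's actual calibration:

```python
import random

def simulate_case(rng: random.Random) -> float:
    """One simulated case: queue wait plus processing time, in hours.
    The lognormal parameters here are made up for the sketch."""
    wait = rng.lognormvariate(mu=0.5, sigma=0.6)
    processing = rng.lognormvariate(mu=1.0, sigma=0.4)
    return wait + processing

def sla_breach_probability(sla_hours: float, runs: int = 10_000, seed: int = 42) -> float:
    """Estimate P(total case time > SLA) by counting breaches across
    many independent simulated cases (a basic Monte Carlo estimate)."""
    rng = random.Random(seed)
    breaches = sum(simulate_case(rng) > sla_hours for _ in range(runs))
    return breaches / runs

print(f"Estimated SLA breach probability: {sla_breach_probability(sla_hours=8.0):.1%}")
```

The real engine runs this idea against the digital twin's queue and resource model rather than a closed-form distribution, but the shape — sample many futures, count bad outcomes — is the same.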
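The optimizer's core question — "how many resources do I need to stay healthy?" — can be sketched without PuLP as a brute-force search. The utilization target and rates below are hypothetical; the real optimizer solves a richer linear program:

```python
def min_servers_for_utilization(arrival_rate: float, service_rate: float,
                                max_utilization: float = 0.8) -> int:
    """Smallest number of identical servers keeping utilization
    rho = arrival_rate / (servers * service_rate) at or under the target.
    A toy stand-in for the PuLP-based optimizer."""
    servers = 1
    while arrival_rate / (servers * service_rate) > max_utilization:
        servers += 1
    return servers

# e.g. 50 cases/hour arriving, each server clears 6 cases/hour
print(min_servers_for_utilization(arrival_rate=50.0, service_rate=6.0))  # → 11
```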
Follow these steps to get APOLLO up and running on your local machine.
- Docker Desktop: Ensure Docker is installed and running on your system.
- Node.js & npm: Required for frontend development (though Docker handles most of this).
- Python 3.11+: For backend development (though Docker handles most of this).
```
git clone https://github.com/your-username/appian-apollo.git
cd appian-apollo
```

Our backend requires historical data to train its forecasting model.
```
# Navigate to the data directory
cd data

# Run the Python script to generate sample_operation_log.csv
python synthetic_log_generator.py
```
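If you just want to see the general shape of this training data, a minimal stand-in generator might look like the following. The column names and distributions are assumptions for illustration; the real `synthetic_log_generator.py` is the source of truth:

```python
import csv
import random
from datetime import datetime, timedelta

def write_sample_log(path: str, days: int = 30, seed: int = 7) -> int:
    """Write a toy operation log: one row per case, with an arrival
    timestamp and a handling duration. Schema is illustrative only."""
    rng = random.Random(seed)
    start = datetime(2025, 1, 1)
    rows = 0
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["case_id", "arrival_time", "duration_minutes"])
        for day in range(days):
            for _ in range(rng.randint(20, 60)):  # daily case volume
                arrival = start + timedelta(days=day, minutes=rng.randint(0, 24 * 60 - 1))
                writer.writerow([f"C{rows:05d}", arrival.isoformat(), rng.randint(5, 180)])
                rows += 1
    return rows

print(write_sample_log("sample_operation_log.csv"), "rows written")
```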
```
# Go back to the project root
cd ..
```

The GenAI Analyst Layer requires an OpenAI API key.
- Create a file named `.env` in the root directory of the project (`appian-apollo/.env`).
- Add your OpenAI API key to this file:

```
OPENAI_API_KEY="your_openai_api_key_here"
```
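Docker Compose passes `.env` values through to containers as environment variables, so the backend can read the key from its environment. A minimal sketch (assuming a plain `os.environ` lookup rather than APOLLO's actual config code, and a hypothetical helper name):

```python
import os

def get_openai_api_key() -> str:
    """Read the OpenAI key from the environment, failing fast with a
    clear message if the .env value never made it into the container."""
    key = os.environ.get("OPENAI_API_KEY", "")
    if not key:
        raise RuntimeError("OPENAI_API_KEY is not set; add it to your .env file")
    return key
```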
This command will build the Docker images for both the backend and frontend, and then start both services.
```
docker-compose up --build
```

- The first build might take a few minutes.
- Once running, you can access the application:
  - Frontend UI: http://localhost:3000
  - Backend API Docs (Swagger UI): http://localhost:8000/docs
- Backend: Python, FastAPI, SimPy, Prophet, NumPy, Pandas, PuLP, OpenAI API
- Frontend: React, TypeScript, Recharts, React Router
- Containerization: Docker, Docker Compose
(Optional section for future contributions)
This project is licensed under the MIT License.