IFDS is a Streamlit-based image forgery analysis application. It combines classical OpenCV feature-matching methods such as SIFT, SURF, AKAZE, and ORB with optional deep learning models. For each uploaded image, the system produces algorithm-level evidence, a weighted final verdict, a comparative analysis table, and downloadable PDF/HTML reports.
- Classical analysis with SIFT, SURF, AKAZE, and ORB detectors.
- Optional AI analysis with Xception and EfficientNet CNN model wrappers.
- Explainability support through Grad-CAM heatmaps for Xception results.
- Weighted final verdict labels: Authentic, Tampered, or Review needed.
- PDF/HTML reports with image metadata, algorithm results, comparison data, and final verdict details.
- Modern Streamlit dashboard with status cards, analysis controls, result cards, and responsive layout.
- Supported input formats: GIF, JPG/JPEG, PNG, BMP, and TIFF.
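As a rough illustration of how the weighted final verdict could combine per-algorithm scores into one of the three labels, here is a minimal sketch. The weights, thresholds, and function name are assumptions for illustration only; the project's actual logic lives in `src/verdict.py`.

```python
# Hypothetical per-algorithm tamper scores in [0, 1] and example weights;
# the real weighting scheme is defined in src/verdict.py.
WEIGHTS = {"sift": 0.2, "surf": 0.2, "akaze": 0.15, "orb": 0.15, "xception": 0.3}

def weighted_verdict(scores, weights=WEIGHTS,
                     tampered_at=0.6, authentic_at=0.4):
    """Combine available algorithm scores into a (label, score) pair.

    Only algorithms that actually ran contribute; their weights are
    renormalized so the combined score stays in [0, 1].
    """
    used = {name: w for name, w in weights.items() if name in scores}
    total = sum(used.values())
    combined = sum(scores[name] * w for name, w in used.items()) / total
    if combined >= tampered_at:
        return "Tampered", combined
    if combined <= authentic_at:
        return "Authentic", combined
    return "Review needed", combined

label, score = weighted_verdict({"sift": 0.1, "orb": 0.2, "xception": 0.15})
```

Renormalizing over the algorithms that ran is what lets a verdict still be produced when the optional AI models are unavailable.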
| Tool / Metric | Result |
|---|---|
| Pytest | 58 tests passed |
| Pytest coverage | 95.01% |
| SonarQube Quality Gate | Passed / OK |
| SonarQube coverage | 94.8% |
| Bugs | 0 |
| Vulnerabilities | 0 |
| Security Hotspots | 0 |
| Code Smells | 14 |
| Duplications | 3.0% |
| Lines of Code | 2536 |
| Reliability Rating | A |
| Security Rating | A |
| Maintainability Rating | A |
├── app.py # Streamlit application entrypoint
├── config/ # Application settings and model paths
├── data/
│ ├── raw/ # Local raw datasets
│ ├── processed/ # Processed local outputs
│ └── models/ # Model weights
├── docs/ # Documentation and delivery evidence
├── notebooks/ # Training notebooks
├── src/
│ ├── ai_models/ # Xception, EfficientNet, and Grad-CAM components
│ ├── classical/ # SIFT, SURF, AKAZE, and ORB detectors
│ ├── preprocessing/ # Image loading and preprocessing
│ ├── reporting/ # PDF/HTML report generation
│ └── verdict.py # Final verdict service
├── tests/ # Pytest test suite
└── ui/ # Streamlit UI components
Python 3.10 or newer is recommended.
```
python -m venv .venv
.venv\Scripts\activate
pip install -r requirements.txt
```

For macOS/Linux activation:

```
source .venv/bin/activate
```

Install development and test dependencies when needed:

```
pip install -r requirements-dev.txt
```

Start the application:

```
streamlit run app.py
```

Windows virtual environment alternative:

```
.venv\Scripts\python.exe -m streamlit run app.py
```

After the app opens, select the analysis methods from the sidebar, upload a supported image, and press Start Analysis.
```
python -m pytest tests -q --cov=src --cov-report=xml --cov-report=term-missing
```

Latest verification result:

```
58 passed
TOTAL: 1202 statements, 60 missing, 95.01% coverage
```
The project includes the required SonarQube configuration in `sonar-project.properties`.
Basic workflow:
```
python -m pytest tests -q --cov=src --cov-report=xml --cov-report=term-missing
sonar-scanner.bat -Dsonar.host.url=http://localhost:9000 -Dsonar.token=<local-token>
```

Latest successful local analysis:

- Quality Gate: OK / Passed
- SonarQube coverage: 94.8%
- Bugs: 0
- Vulnerabilities: 0
- Security Hotspots: 0
- Duplications: 3.0%
AI models are optional. When model weights are missing, the application continues with classical analysis and reporting.
Expected model paths:
```
data/models/xception_finetuned.h5
data/models/efficientnet_finetuned.h5
```

If model files are missing after cloning the repository, use Git LFS:

```
git lfs pull
```

Training metrics are documented in model_metrics.md.
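The graceful-degradation behavior described above (continuing with classical analysis when weights are absent) can be sketched as follows. The function name and the hard-coded path dictionary are illustrative assumptions, not the project's actual API:

```python
from pathlib import Path

# Illustrative weight locations; the real application resolves these
# through the config/ module.
MODEL_PATHS = {
    "xception": Path("data/models/xception_finetuned.h5"),
    "efficientnet": Path("data/models/efficientnet_finetuned.h5"),
}

def available_ai_models(paths=MODEL_PATHS):
    """Return only the models whose weight files exist on disk.

    When the result is empty, the caller proceeds with classical
    (SIFT/SURF/AKAZE/ORB) analysis and reporting alone.
    """
    return {name: path for name, path in paths.items() if path.is_file()}

# Example: with no weights downloaded, classical-only mode is used.
models = available_ai_models()
use_classical_only = not models
```

Checking for the files up front, before importing any deep learning framework, keeps startup fast and avoids hard failures on machines without the `.h5` weights.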
Main project delivery artifacts prepared for the course:
- User Manual document
- FSM Effort Estimation document
- Doxygen PDF
- Graphviz architecture SVG
- Graphviz architecture PDF
- Doxygen/Graphviz call graph
- Scrum board screenshot
The notebooks/ directory contains notebooks prepared for model training with Kaggle/CASIA datasets. Local datasets can be kept under data/raw/; that directory is intentionally not committed.
- Do not commit `.env` files, Streamlit secrets, virtual environments, datasets, or large model weights.
- `data/raw/`, `data/processed/`, and `data/models/` are reserved for local work.
- Large `.h5` model files should be shared through Git LFS instead of normal Git history.
This project is released under the MIT License. See the LICENSE file for details.





