# MOEX Price Prediction

MOEX Price Prediction is an end-to-end forecasting solution for Moscow Exchange (MOEX) stocks (e.g., SBER, GAZP, ROSN) using historical data, technical indicators, and deep learning with PyTorch.
- Project Overview
- Key Features
- Architecture & Structure
- Technologies
- Hydra Configuration
- Installation & Setup
- Usage
- API Reference
- Model Training & Visualization
- License
## Project Overview

This project fetches MOEX historical end-of-day data (for any ticker available via the MOEX API; see `app/data.py`), computes technical indicators (RSI, SMA, MACD, Bollinger Bands, ATR), trains PyTorch models (LSTM-Attention, TCN, Transformer) via Hydra-powered experiments, and serves predictions through a FastAPI REST API. See Hydra Configuration for details.
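As an illustration of the indicator step, here is a minimal sketch of how SMA and Bollinger Bands could be computed with pandas. The actual implementation lives in `app/preprocessing.py`; the function name and window default below are assumptions for illustration:

```python
import pandas as pd

def add_basic_indicators(df: pd.DataFrame, window: int = 20) -> pd.DataFrame:
    """Add SMA and Bollinger Band columns; assumes a 'close' column exists."""
    out = df.copy()
    out["sma"] = out["close"].rolling(window).mean()  # simple moving average
    std = out["close"].rolling(window).std()          # rolling standard deviation
    out["bb_upper"] = out["sma"] + 2 * std            # upper Bollinger Band
    out["bb_lower"] = out["sma"] - 2 * std            # lower Bollinger Band
    return out
```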
## Key Features

- Modular Architecture: Clear separation between data ingestion, preprocessing, model training, and serving
- Hydra Experiments: Easily switch models (`lstm`, `tcn`, `tft`) and tickers (`SBER`, `GAZP`, `ROSN`, etc.) with command-line overrides
- Versioned Artifacts: Models and scalers saved under `saved_models/v{version}`; metadata tracks versions and architecture
- Auto-Retraining: Optional performance monitoring triggers retraining via MLflow and Optuna
- Live API: FastAPI endpoint for on-demand predictions
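To make the versioned-artifact layout concrete, a hypothetical helper that resolves artifact paths for a ticker and version could look like this (the helper name is ours; the filename pattern follows the `saved_models/` directory tree):

```python
from pathlib import Path

def artifact_paths(ticker: str, version: str, root: str = "saved_models") -> dict:
    # Mirrors the saved_models/v{version}/{TICKER}_* layout
    base = Path(root) / version
    return {
        "model": base / f"{ticker}_model.pth",
        "scaler_X": base / f"{ticker}_scaler_X.pkl",
        "scaler_y": base / f"{ticker}_scaler_y.pkl",
    }
```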
## Architecture & Structure

```
MOEX_PREDICT/
├── app/
│   ├── data.py               # MOEX & CBR data loader + DataLoader factory
│   ├── preprocessing.py      # Indicator calculations
│   ├── models/               # Model definitions and factory
│   │   ├── attention_lstm.py
│   │   ├── tcn.py
│   │   ├── tft.py
│   │   └── factory.py
│   ├── model_manager.py      # Loading versioned models & scalers
│   ├── transfer_learning.py  # Retraining logic & metadata
│   ├── predict.py            # Prediction wrapper
│   ├── monitoring.py         # Performance validation
│   └── main.py               # FastAPI application
├── conf/                     # Hydra configuration
│   ├── config.yaml
│   ├── data/
│   │   └── default.yaml      # Data loader settings
│   ├── model/
│   │   ├── lstm.yaml         # LSTM params
│   │   ├── tcn.yaml          # TCN params
│   │   └── tft.yaml          # Transformer params
│   ├── optimization/
│   │   └── default.yaml      # HPO settings
│   └── train/
│       └── default.yaml      # Training settings & versioning
├── saved_models/             # Versioned model artifacts
│   ├── v1/
│   │   ├── SBER_model.pth
│   │   ├── SBER_scaler_X.pkl
│   │   ├── SBER_scaler_y.pkl
│   │   └── ...
│   └── v2/
│       └── ...
├── train.py                  # Hydra entrypoint for experiments
├── Makefile                  # install, run, clean commands
└── README.md                 # This file
```
## Technologies

- Python: 3.11
- Config: Hydra (1.3.2), OmegaConf
- Data: Pandas, NumPy, scikit-learn
- Deep Learning: PyTorch 2.6
- Experiment Tracking: MLflow 2.21
- Web API: FastAPI, Uvicorn
- Scheduling: APScheduler
## Hydra Configuration

All experiments are driven by `conf/config.yaml`. Override sections via the CLI:
```bash
# Train TCN on GAZP, version v2
python3 train.py \
  model=tcn \
  data.ticker=GAZP \
  train.horizon=5 \
  train.version=v2
```

```bash
# Train model with HPO
python3 train.py \
  model=lstm \
  data.ticker=SBER \
  data.start_date=2013-01-01 \
  train.horizon=5 \
  train.epochs=20 \
  train.version=v1 \
  optimization.enable=true \
  optimization.n_trials=20 \
  optimization.epochs_per_trial=20
```

See `conf/` for defaults.
## Installation & Setup

```bash
# 1. Clone the repository
git clone https://github.com/NasdormML/Moex_predict.git
cd Moex_predict

# 2. Create a virtual environment and install dependencies
make install

# 3. Run MLflow and FastAPI
make run
```

- MLflow UI: http://127.0.0.1:5001
- API Docs: http://127.0.0.1:8000/docs
| Command | Description |
|---|---|
| `make install` | Create venv and install dependencies |
| `make run` | Run MLflow + FastAPI (development, with reload) |
| `make run-prod` | Run MLflow + FastAPI (production, with workers) |
| `make api` | Run only FastAPI |
| `make mlflow` | Run only the MLflow server |
| `make lint` | Run the Flake8 linter |
| `make clean` | Remove `__pycache__` and `*.pyc` files |
## Usage

Train a model:

```bash
python train.py model=lstm data.ticker=SBER
```

## API Reference

### POST /predict/{ticker}/{target_date}

```bash
curl -X POST "http://127.0.0.1:8000/predict/SBER/2026-02-10" \
  -H "Content-Type: application/json"
```

Generates a price forecast for a ticker up to the target date.
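The same request can be issued from Python using only the standard library (the helper functions below are our sketch, not part of the project):

```python
import json
import urllib.request

BASE_URL = "http://127.0.0.1:8000"

def predict_url(ticker: str, target_date: str, base_url: str = BASE_URL) -> str:
    # Builds the POST /predict/{ticker}/{target_date} URL
    return f"{base_url}/predict/{ticker}/{target_date}"

def fetch_forecast(ticker: str, target_date: str) -> dict:
    # Empty-body POST, mirroring the curl example above
    req = urllib.request.Request(
        predict_url(ticker, target_date),
        method="POST",
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```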
Parameters:
| Name | Type | Description |
|---|---|---|
| `ticker` | string | Stock ticker (e.g., SBER, GAZP) |
| `target_date` | date | Forecast horizon limit (YYYY-MM-DD) |
Response:
```json
{
  "ticker": "SBER",
  "known_up_to": "2026-02-03",
  "requested_target_date": "2026-02-10",
  "forecast_dates": [
    "2026-02-04",
    "2026-02-05",
    "2026-02-06",
    "2026-02-07",
    "2026-02-10"
  ],
  "predictions": [
    307.40,
    306.08,
    305.13,
    305.05,
    305.28
  ]
}
```

### GET /health

Health check endpoint.
Response:
```json
{
  "status": "healthy",
  "models_loaded": 1
}
```

## Model Training & Visualization

| Model | MSE | RMSE | MAE | MAPE |
|---|---|---|---|---|
| LSTM-Attn | 19.87 | 4.46 | 3.11 | 1.20% |
| Transformer | 11.37 | 3.37 | 2.42 | 0.97% |
| Model | MSE | RMSE | MAE | MAPE |
|---|---|---|---|---|
| TCN | 18.31 | 4.28 | 3.04 | 2.08% |
| Model | MSE | RMSE | MAE | MAPE |
|---|---|---|---|---|
| Transformer | 61.50 | 7.84 | 5.65 | 1.10% |
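For reference, the metrics reported in the tables above can be computed as follows (a plain-Python sketch; scikit-learn provides equivalent functions):

```python
import math

def regression_metrics(y_true, y_pred):
    """MSE, RMSE, MAE and MAPE (in percent) between two equal-length sequences."""
    n = len(y_true)
    errors = [t - p for t, p in zip(y_true, y_pred)]
    mse = sum(e * e for e in errors) / n
    mae = sum(abs(e) for e in errors) / n
    mape = 100.0 * sum(abs(e) / abs(t) for e, t in zip(errors, y_true)) / n
    return {"MSE": mse, "RMSE": math.sqrt(mse), "MAE": mae, "MAPE": mape}
```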
## License

This project is licensed under the MIT License.


