A TouchDesigner component (.tox) that implements the MiDaS depth estimation model using ONNX Runtime.
This project provides a TouchDesigner implementation of the MiDaS depth estimation model, enabling real-time depth estimation directly within TouchDesigner. Inference runs on ONNX Runtime.
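As a rough illustration of what an ONNX Runtime inference pass involves around the model call, here is a minimal pre/post-processing sketch in plain NumPy. The function names, the normalization constants, and the commented `onnxruntime` lines are assumptions for illustration, not this component's actual internals:

```python
import numpy as np

def preprocess(frame):
    # Normalize an (H, W, 3) uint8 frame and reorder to NCHW float32,
    # the layout ONNX vision models generally expect. The [-1, 1]
    # scaling here is an assumption; MiDaS variants differ.
    img = frame.astype(np.float32) / 255.0
    img = (img - 0.5) / 0.5
    return img.transpose(2, 0, 1)[np.newaxis]   # HWC -> NCHW

def postprocess(depth):
    # Normalize the raw depth output to [0, 1] for display.
    d = depth.squeeze().astype(np.float32)
    d_min, d_max = d.min(), d.max()
    return (d - d_min) / (d_max - d_min + 1e-8)

# The inference step itself would look roughly like (illustrative only):
# import onnxruntime as ort
# sess = ort.InferenceSession("dpt_swin2_tiny_256.onnx")
# depth = sess.run(None, {sess.get_inputs()[0].name: preprocess(frame)})[0]
```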
```
├── td_scripts/              # Python scripts for TouchDesigner integration
├── dep/                     # Dependencies
│   └── python/              # Python dependencies
├── toxes/                   # TouchDesigner components
└── midas-touchdesigner.toe  # Main TouchDesigner project file
```
- TouchDesigner 2022.32660 or later
- Python 3.9.5+
- ONNX Runtime Library
- Download the required model file:
  - Download one of the supported models: DPT Hybrid (`dpt_hybrid.onnx`), DPT Swin2 Tiny (`dpt_swin2_tiny_256.onnx`), midas-small (`midas-small.onnx`), etc.
  - Place it in the project directory
- Open the TouchDesigner project:
  - Open `midas-touchdesigner.toe`
  - Click on the midas Base, go to the `Setup` parameter page, and pulse `Install Dependencies`
  - Navigate to the `dep` folder
    - Windows Users: double click `dep_install_windows.cmd`
    - Mac Users: open Terminal and change directory (`cd`) to the `dep` folder, then:
      - For Intel Macs: run `chmod +x dep_install_mac_intel.sh` followed by `./dep_install_mac_intel.sh`
      - For Apple Silicon Macs (M1/M2): run `chmod +x dep_install_mac_arm.sh` followed by `./dep_install_mac_arm.sh`
  - Back in TouchDesigner, go to the midas Base, switch to the `Runtime` parameter page, and pulse `Run Paths` first, then `Load MiDaS Model`
  - The model should now be loaded and ready to use.
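A pulse like `Run Paths` plausibly amounts to putting the bundled `dep/python` folder on Python's import path before `onnxruntime` is loaded. The sketch below shows that pattern; the function name is hypothetical and the component's actual script may differ:

```python
import os
import sys

def add_dep_path(project_folder):
    # Hypothetical sketch: make the dependencies bundled in dep/python
    # importable by prepending that folder to sys.path.
    dep = os.path.join(project_folder, "dep", "python")
    if dep not in sys.path:
        sys.path.insert(0, dep)
    return dep
```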
If you need to convert a PyTorch (.pt) model to ONNX format, you can use the provided conversion script:
- Place your PyTorch model file in the project directory
- Modify the `PT_MODEL_PATH` in `model_converter/pt_to_onnx.py` to point to your model file
- Run the conversion script: `python model_converter/pt_to_onnx.py`
The script will generate an ONNX model file with the same name as your PT file but with the .onnx extension.
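The naming behavior described above can be sketched with a small helper; the helper name is hypothetical, and the commented export call shows only the general `torch.onnx.export` pattern, not the actual script's internals:

```python
from pathlib import Path

def onnx_path_for(pt_model_path):
    # Same basename as the .pt file, with the extension swapped to .onnx,
    # matching the output naming described above.
    return Path(pt_model_path).with_suffix(".onnx")

# A conversion script of this kind typically wraps torch.onnx.export,
# roughly (illustrative only; the dummy input shape depends on the
# chosen MiDaS variant):
# torch.onnx.export(model, dummy_input, str(onnx_path_for(PT_MODEL_PATH)))
```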
- Import the .tox component into your TouchDesigner project
- Connect your input video/image source to the input TOP
- The depth map will be output through the output TOP
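The steps above can be sketched in Python. Converting the single-channel depth map into a four-channel array for a TOP might look like the following; the function is hypothetical, and the commented `onCook`/`copyNumpyArray` lines show a typical Script TOP callback shape rather than this component's actual code:

```python
import numpy as np

def depth_to_rgba(depth):
    # Expand a normalized single-channel depth map (H, W) into an RGBA
    # float32 array, with depth replicated across R, G, B and alpha set to 1.
    d = depth.astype(np.float32)
    return np.dstack([d, d, d, np.ones_like(d)])

# Inside a Script TOP callback this might be used as (illustrative only):
# def onCook(scriptOp):
#     scriptOp.copyNumpyArray(depth_to_rgba(depth_map))
```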
This implementation is based on the MiDaS model: