MTLearn (Morphological Tree Learning) is a C++/Python research library for
learnable connected operators based on morphological trees. The Python package
is published as mtlearn.
The library explores a simple idea: connected morphology can become a structural prior for deep neural networks. Instead of processing images only through local pixel-wise operations, connected operators reason over components, regions, shape, contrast, and hierarchy. This makes them naturally interpretable and well-suited for tasks where structure matters.
Classical connected filters are powerful, but they usually depend on hard keep/discard decisions and manually selected attribute thresholds. This limits their integration into end-to-end trainable neural architectures.
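To make the contrast concrete, here is a minimal numpy sketch (toy attribute values and an illustrative scale parameter, not MTLearn's API) of a hard keep/discard decision versus its sigmoid relaxation:

```python
import numpy as np

# Toy node attributes (e.g. areas of max-tree components); values are made up.
area = np.array([3.0, 12.0, 47.0, 150.0])
threshold = 50.0

# Classical connected filter: hard keep/discard decision per node.
hard_keep = (area >= threshold).astype(float)  # 0/1, no useful gradient

# Soft relaxation: sigmoid gate over the attribute; differentiable in
# both the attribute and the threshold, so it can be trained end to end.
scale = 20.0  # controls how sharp the gate is (illustrative value)
soft_keep = 1.0 / (1.0 + np.exp(-(area - threshold) / scale))

print(hard_keep)  # [0. 0. 0. 1.]
print(soft_keep)  # smooth, monotone values in (0, 1)
```

The hard decision is a step function, so gradients with respect to the threshold vanish almost everywhere; the sigmoid gate replaces the step with a smooth transition, which is the basic idea CFP builds on.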
MTLearn provides a stable implementation platform for this research direction. It currently includes Connected Filter Preprocessing (CFP), and is intended to grow toward trainable connected-operator layers, differentiable or learnable attribute criteria, self-dual tree representations, intermediate network insertions, and scalable implementations.
- Connected Filter Preprocessing (CFP): the current main model, available as mtlearn.layers.ConnectedFilterPreprocessingLayer. CFP replaces hard connected-filter decisions with a differentiable sigmoid gate over normalized tree-node attributes.
- Stable morphology interface: mtlearn.morphology builds max-trees, min-trees, and trees of shapes through a backend-independent API.
- Trainable connected morphology: designed as an implementation platform for connected morphology as a learnable structural prior in deep neural networks.
- Research-ready validation: includes C++ tests, Python tests, gradient checks, reference implementations, notebook validations, and public dataset download helpers.
The Python package is available from PyPI as mtlearn:
```shell
pip install mtlearn
```

See docs/installation.md for installation instructions and docs/development.md for source builds, validation, and releases.
Build a morphology tree and compute attributes:
```python
import numpy as np
from mtlearn import morphology

image = np.array([[1, 2], [3, 4]], dtype=np.uint8)
tree = morphology.create_max_tree(image)
_, attributes = morphology.compute_attributes(
    tree,
    [morphology.AttributeType.AREA, morphology.AttributeType.COMPACTNESS],
)
print(attributes.shape)
```

Create a CFP layer and run a forward pass:
```python
import torch
from mtlearn import morphology
from mtlearn.layers import ConnectedFilterPreprocessingLayer

cfp_layer = ConnectedFilterPreprocessingLayer(
    in_channels=1,
    attributes_spec=[(
        morphology.AttributeType.AREA,
        morphology.AttributeType.CIRCULARITY,
    )],
    tree_type="max-tree",
    device="cpu",
)

x = torch.tensor([[[[1, 2], [3, 4]]]], dtype=torch.float32)
y = cfp_layer(x)
assert y.shape == x.shape
```

Executable examples are available in notebooks/.
Install notebook dependencies with:
```shell
pip install "mtlearn[notebooks]"
```

The main public experiment example is:
notebooks/experiments/Example_screws_filtering.ipynb
Representative ICPR 2026 experiment notebooks are available in notebooks/ICPR2026.
ConnectedFilterPreprocessingLayer is the recommended implementation for new
CFP experiments.
Tensor operations, trainable parameters, and cached attributes can live on CUDA
when device="cuda". Morphology-tree construction is still performed by the
C++ backend on CPU.
The main implementation uses an implicit Jacobian formulation. The dense region-pixel matrix is not materialized during normal training; tree-ordering metadata is used to perform the equivalent reconstruction and backward accumulation more compactly. This reduces memory pressure compared with explicit region-pixel Jacobian construction.
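The memory trade-off can be sketched with a toy numpy example (hypothetical pixel-to-node assignment; this is not MTLearn's internal data layout). The explicit path materializes a dense node-to-pixel matrix, while the implicit path gets the same forward result and gradient accumulation from indexing and scatter-add:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels, n_nodes = 6, 3

# Each pixel belongs to one tree node (toy assignment, not a real tree).
node_of_pixel = np.array([0, 0, 1, 1, 2, 2])
node_values = rng.normal(size=n_nodes)  # gated per-node contributions

# Explicit formulation: materialize the dense node->pixel matrix J
# (n_pixels x n_nodes, prohibitive for real images).
J = np.zeros((n_pixels, n_nodes))
J[np.arange(n_pixels), node_of_pixel] = 1.0
y_explicit = J @ node_values                   # forward reconstruction
grad_nodes_explicit = J.T @ np.ones(n_pixels)  # backward (upstream grad = 1)

# Implicit formulation: same results via indexing and scatter-add, no dense J.
y_implicit = node_values[node_of_pixel]
grad_nodes_implicit = np.zeros(n_nodes)
np.add.at(grad_nodes_implicit, node_of_pixel, np.ones(n_pixels))

assert np.allclose(y_explicit, y_implicit)
assert np.allclose(grad_nodes_explicit, grad_nodes_implicit)
```

The implicit path stores only index metadata of size proportional to the number of pixels, instead of a matrix of size pixels times nodes, which is the memory saving the paragraph above describes.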
Reference implementations based on explicit Jacobians and CPU tree traversals remain available for gradient checks, comparisons, and debugging.
MTLearn uses a C++ morphology backend internally through mtlearn::morphology.
User code should interact with morphology through the public Python facade
mtlearn.morphology, rather than depending on backend-specific APIs.
The backend is
MorphologicalAttributeFilters
/ mmcfilters, but the top-level Python package mmcfilters is not required
as a runtime dependency of mtlearn.
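The facade arrangement can be sketched generically (illustrative code only, not MTLearn's actual modules): the public function keeps a stable signature while the backend object behind it can change without affecting user code.

```python
class _CppBackendStub:
    """Stands in for the real C++ bindings in this toy example."""

    def build_max_tree(self, image):
        return {"backend": "cpp", "shape": (len(image), len(image[0]))}


# The facade module holds the backend as a private detail.
_backend = _CppBackendStub()


def create_max_tree(image):
    """Public, backend-independent entry point."""
    return _backend.build_max_tree(image)


tree = create_max_tree([[1, 2], [3, 4]])
print(tree["shape"])  # (2, 2)
```

User code depends only on create_max_tree, so swapping or upgrading the backend never requires changes on the caller's side.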
MTLearn is a research-oriented library. CFP is the first validated member of a broader planned family of trainable connected-operator layers. The current implementation supports max-tree and min-tree CFP workflows, multi-attribute groups, dataset-level attribute normalization, cached preprocessing, and PyTorch forward/backward for CFP parameters on CPU or CUDA tensors.
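Dataset-level attribute normalization, as mentioned above, typically means computing statistics over the tree nodes of all training images and reusing them everywhere. A hedged numpy sketch with toy data (hypothetical values and names; MTLearn's actual normalization utilities may differ):

```python
import numpy as np

# Per-node attribute matrices (rows = nodes, columns = attributes such as
# area and circularity) from several images; toy values.
attrs_per_image = [
    np.array([[10.0, 0.2], [40.0, 0.8]]),
    np.array([[25.0, 0.5]]),
]

# Dataset-level statistics: pool all nodes from all images per attribute.
stacked = np.concatenate(attrs_per_image, axis=0)
mean = stacked.mean(axis=0)
std = stacked.std(axis=0) + 1e-8  # avoid division by zero

# Every image is normalized with the same statistics, so attribute scales
# are comparable across the whole dataset.
normalized = [(a - mean) / std for a in attrs_per_image]
```

Using shared statistics, rather than per-image ones, keeps the sigmoid-gate parameters meaningful across images with very different component sizes.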
If you use the CFP layer in your work, please cite:
Wonder A. L. Alves, Lucas de P. O. Santos, Ronaldo F. Hashimoto, Nicolas Passat, Anderson H. R. Souza, Dennis J. Silva, Yukiko Kenmochi. A trainable connected filter preprocessing layer based on component trees. International Conference on Pattern Recognition (ICPR), 2026, Lyon, France. ⟨hal-05575141⟩