Allow custom image normalization in CropModel

As part of #1313 we are expanding how users can add CropModels; there is no reason the weights have to follow ImageNet normalization.
Problem
`bounding_box_transform()` in `src/deepforest/datasets/cropmodel.py` hardcodes ImageNet normalization, making it impossible for users to supply models pretrained with different normalization stats.
```python
# Line 32 - no way to override this
data_transforms.append(resnet_normalize)
```
A user passing a custom model via `CropModel(model=my_model)` that expects different preprocessing will get silently wrong results.
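To see how silently wrong the results can be, compare what the same pixel looks like after ImageNet normalization versus the stats a custom model might actually expect (the 0.5/0.5 stats below are hypothetical placeholders):

```python
import numpy as np

pixel = np.array([0.6, 0.6, 0.6])  # a pixel already scaled to [0, 1]

# ImageNet stats (what bounding_box_transform hardcodes today)
imagenet_mean = np.array([0.485, 0.456, 0.406])
imagenet_std = np.array([0.229, 0.224, 0.225])

# stats the custom model was actually trained with (hypothetical)
custom_mean = np.array([0.5, 0.5, 0.5])
custom_std = np.array([0.5, 0.5, 0.5])

print((pixel - imagenet_mean) / imagenet_std)  # what the model receives
print((pixel - custom_mean) / custom_std)      # what it expects
```

The model receives values around 0.50–0.86 per channel when it expects 0.2 everywhere; predictions degrade with no error raised.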
Proposed Fix
Add an optional `normalize` parameter to `bounding_box_transform` and thread it through from the CropModel config:
```python
def bounding_box_transform(augmentations=None, resize=None, normalize=None):
    if resize is None:
        resize = [224, 224]
    if normalize is None:
        normalize = resnet_normalize  # default: ImageNet stats

    data_transforms = [transforms.ToTensor()]
    if normalize is not False:  # normalize=False disables normalization
        data_transforms.append(normalize)
    data_transforms.append(transforms.Resize(resize))
    if augmentations:
        data_transforms.append(transforms.RandomHorizontalFlip(0.5))
    return transforms.Compose(data_transforms)
```
Users could then pass custom normalization via config:
```yaml
cropmodel:
  normalize:
    mean: [0.5, 0.5, 0.5]
    std: [0.5, 0.5, 0.5]
```
Or `normalize: false` to disable it entirely.