
fix: Update dependency fft-conv-pytorch to torch-fftconv #14

Closed
vmiller987 wants to merge 1 commit into MIC-DKFZ:master from vmiller987:fix-fftconv

Conversation

Contributor

@vmiller987 vmiller987 commented Nov 13, 2025

Hello all!

With nnUNet, we currently receive the following warning using PyTorch 2.9:

/home/vmiller/Work/caml/.venv/lib/python3.13/site-packages/fft_conv_pytorch/fft_conv.py:139: UserWarning: Using a non-tuple sequence for multidimensional indexing is deprecated and will be changed in pytorch 2.9; use x[tuple(seq)] instead of x[seq]. In pytorch 2.9 this will be interpreted as tensor index, x[torch.tensor(seq)], which will result either in an error or a different result (Triggered internally at /pytorch/torch/csrc/autograd/python_variable_indexing.cpp:345.)
  output = output[crop_slices].contiguous()

The source of this warning is fft-conv-pytorch, which is a dependency of batchgeneratorsv2. Unfortunately, it does not look like fft-conv-pytorch is still being maintained. This PR has existed since July: fkodom/fft-conv-pytorch#26

The offending code:
fft-conv-pytorch/fft_conv_pytorch/fft_conv.py lines 134-139

    # Remove extra padded values
    crop_slices = [slice(None), slice(None)] + [
        slice(0, (signal_size[i] - kernel.size(i) + 1), stride_[i - 2])
        for i in range(2, signal.ndim)
    ]
    output = output[crop_slices].contiguous()

With the help of a coworker, we found an active fork, torch-fftconv, with PyPI releases in which the issue has been fixed.

The fix:
fft-conv-pytorch/torch_fftconv/functional.py lines 242-245

    # Remove extra padded values
    index = (slice(None),) * 2 + tuple(slice(p, o + p)
                                       for p, o in zip(padding, output_size))
    output = output[index].contiguous()
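The difference between the two snippets is list-based versus tuple-based multidimensional indexing. The same rule can be illustrated with a minimal NumPy sketch (the shapes, `padding`, and `output_size` values here are hypothetical, chosen only to mimic the cropping step):

```python
import numpy as np

# Simulate the FFT-conv cropping step on a 4-D "output" array
# (batch, channels, H, W) that still carries extra padded values.
output = np.arange(2 * 3 * 8 * 8, dtype=np.float64).reshape(2, 3, 8, 8)
padding = (1, 1)       # hypothetical per-spatial-dimension padding
output_size = (5, 5)   # hypothetical desired spatial size

# Building the index as a *tuple* of slices (as torch-fftconv does) is
# plain multidimensional slicing. Passing a *list* of slices instead
# (as the old fft-conv-pytorch code does) is the deprecated form that
# PyTorch 2.9 reinterprets as tensor-style fancy indexing.
index = (slice(None),) * 2 + tuple(
    slice(p, o + p) for p, o in zip(padding, output_size)
)
cropped = output[index]
print(cropped.shape)  # (2, 3, 5, 5)
```

The tuple form selects `output[:, :, 1:6, 1:6]` here, dropping one padded element on each side of both spatial dimensions.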

I have updated and tested the two locations where this library is used in batchgeneratorsv2 and updated the pyproject.toml to replace the out-of-date dependency.
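For reference, the dependency swap in pyproject.toml would look roughly like this (the version pins shown are illustrative assumptions, not the PR's exact constraints):

```toml
[project]
dependencies = [
    # "fft-conv-pytorch",  # removed: unmaintained, triggers the PyTorch 2.9 warning
    "torch-fftconv",       # maintained fork containing the tuple-indexing fix
]
```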

@FabianIsensee
Member

Thanks for the PR. I have also encountered this issue and have therefore copied the relevant function from the fft-conv-pytorch repo into batchgenerators itself. This allows me to keep it running going forward without relying on others to maintain it.
Best,
Fabian

