
How to lower CUDA memory usage? #14

@raanubis

Description


Hi,

Thanks for the nice work.

Preprocessing has already finished for the vqvae model (multispeaker), and my dataset contains 498,005 files.
When I run "wavernn.py -m vqvae", I get an error saying that my CUDA memory ran out:
RuntimeError: CUDA out of memory. Tried to allocate 1.70 GiB (GPU 0; 8.00 GiB total capacity; 5.08 GiB already allocated; 929.97 MiB free; 27.78 MiB cached)

Which option do I need to set so that it runs on my card? I only have a graphics card with 8 GB.

thanks
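(Not from the original thread, just a hedged suggestion.) The usual fix for this kind of OOM is to reduce the training batch size, which in this repo would be set somewhere in the training script or its hyperparameters; the exact option name depends on the code, so check `wavernn.py` or its config. If a smaller batch hurts convergence, gradient accumulation lets you keep the *effective* batch size while holding only a micro-batch in GPU memory at a time: compute and accumulate gradients over several micro-batches, then take one optimizer step. The sketch below demonstrates the underlying identity in pure Python (no GPU needed) with a hand-computed 1-D least-squares gradient; all names here are illustrative, not from the repo.

```python
# Why gradient accumulation saves memory without changing the update:
# the mean-loss gradient over a full batch equals the size-weighted sum
# of gradients over its micro-batches. In PyTorch you would call
# (loss / num_micro_batches).backward() per micro-batch and only then
# optimizer.step(), so activations for just one micro-batch live on the GPU.

def grad_mse(w, xs, ys):
    # d/dw mean((w*x - y)^2) = mean(2*x*(w*x - y))
    n = len(xs)
    return sum(2.0 * x * (w * x - y) for x, y in zip(xs, ys)) / n

def accumulated_grad(w, xs, ys, micro=2):
    # Split the batch into micro-batches of size `micro`; weight each
    # micro-batch gradient by its share of the full batch, then sum.
    n = len(xs)
    total = 0.0
    for i in range(0, n, micro):
        mx, my = xs[i:i + micro], ys[i:i + micro]
        total += grad_mse(w, mx, my) * (len(mx) / n)
    return total

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.1, 5.9, 8.2]
w = 0.5
full = grad_mse(w, xs, ys)           # gradient over the whole batch
acc = accumulated_grad(w, xs, ys)    # same gradient via micro-batches
print(abs(full - acc) < 1e-9)        # → True: the updates are identical
```

So halving (or quartering) the batch size and accumulating gradients over 2 (or 4) micro-batches should fit an 8 GB card while training the same model. Other knobs worth trying, if the repo exposes them, are a shorter training sequence length and `torch.utils.checkpoint` for activation recomputation.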
