I ran the ProteomicsLFQ workflow on some test data on a machine with 64 GB RAM. If I don't change the "--max_memory" setting (default: "128.GB"), the pipeline fails during the "proteomicslfq" step with the following message:
```
Error executing process > 'proteomicslfq (1)'

Caused by:
  Process requirement exceeds available memory -- req: 64 GB; avail: 62.8 GB
```
However, if I set "--max_memory 48.GB", the error goes away and the step finishes. So this isn't a big problem, but I don't understand why "proteomicslfq" requests such a large amount of memory (which it apparently doesn't need), yet not the full maximum specified by the parameter. In addition, if "--max_memory" needs to be adapted to the available RAM, this should be documented and the parameter not hidden by default.
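For context on why I found this confusing: as far as I understand, nf-core pipelines cap per-process resources with a "check_max" helper in nextflow.config, so each process requests its own configured amount and "--max_memory" only acts as an upper bound. A minimal sketch of that pattern (the 64.GB base request and the process selector are my assumptions, not taken from the actual proteomicslfq config):

```groovy
// Minimal sketch of the nf-core "check_max" pattern in nextflow.config.
// The 64.GB base request and the process name are illustrative assumptions,
// not values taken from the actual proteomicslfq configuration.
params.max_memory = '128.GB'

// Clamp a requested memory value to params.max_memory.
def check_max(obj, type) {
    if (type == 'memory') {
        def max = params.max_memory as nextflow.util.MemoryUnit
        return obj.compareTo(max) == 1 ? max : obj
    }
    return obj
}

process {
    withName: 'proteomicslfq' {
        // The process asks for its own configured request (scaled on retry),
        // and --max_memory only caps it from above. With the default 128.GB
        // cap a 64 GB request passes through unchanged and then exceeds the
        // ~63 GB actually available; with --max_memory 48.GB it gets clamped.
        memory = { check_max( 64.GB * task.attempt, 'memory' ) }
    }
}
```

If that is indeed how it works here, it would explain why lowering "--max_memory" below the machine's RAM makes the step succeed, and why the documentation should say so.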