Memory requirements unclear #185

@hendrikweisser

Description

I ran the ProteomicsLFQ workflow on some test data on a machine with 64 GB RAM. If I don't change the "--max_memory" setting (default: "128.GB"), the pipeline fails during the "proteomicslfq" step with the following message:

```
Error executing process > 'proteomicslfq (1)'

Caused by:
  Process requirement exceed available memory -- req: 64 GB; avail: 62.8 GB
```

However, if I set "--max_memory 48.GB", the error goes away and the step finishes. So this isn't a big problem, but I don't understand why "proteomicslfq" requests a large amount of memory that it apparently doesn't need, yet not the maximum amount specified by the parameter. In addition, if "--max_memory" needs to be adapted to the available RAM, this should be documented, and the parameter should not be hidden by default.
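For context, this is likely the standard nf-core resource-capping pattern rather than anything specific to this step (an assumption on my part; the base request of `32.GB` and the process name below are illustrative, not taken from this pipeline's actual config). A simplified sketch of how `--max_memory` interacts with per-process requests:

```groovy
// Simplified sketch of the nf-core `check_max` pattern in nextflow.config.
// Hypothetical values: the real base request for proteomicslfq may differ.
def check_max(obj, type) {
    if (type == 'memory') {
        try {
            // Cap the request at params.max_memory; otherwise pass it through.
            if (obj.compareTo(params.max_memory as nextflow.util.MemoryUnit) == 1)
                return params.max_memory as nextflow.util.MemoryUnit
            else
                return obj
        } catch (all) {
            println "WARNING: max_memory '${params.max_memory}' is not valid, using ${obj}"
            return obj
        }
    }
}

process {
    withName: proteomicslfq {
        // Base request scaled by attempt number: e.g. 32 GB on attempt 1,
        // 64 GB on attempt 2. check_max only trims the request if it
        // exceeds --max_memory, not if it exceeds the machine's RAM.
        memory        = { check_max( 32.GB * task.attempt, 'memory' ) }
        errorStrategy = 'retry'
        maxRetries    = 2
    }
}
```

Under this scheme a 64 GB request can arise from a retry doubling the base value, and `--max_memory` acts as a ceiling on requests rather than a description of the machine, which is why it must be set at or below the node's physical RAM.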
