Past evidence indicates that when there are more than 400 subjects in a single analysis/group, qmake chunk job memory requirements can exceed the 3 GB currently hard-coded in tableinput.R. Ideally, tableinput.R would automatically scale the memory setting based on the number of subjects in the largest analysis/group (or perhaps the gtx package is better placed to do this, since it has more knowledge of the analyses/groups). As a simpler fix, add a config key where the analyst can specify the number of subjects in the largest analysis/group, and have tableinput.R recognize it and scale the memory setting accordingly, e.g. `-l mt=NG` where N is X / 100 and X is the number of subjects. It is unclear whether N can be a float or should be rounded, so a sketch of the rounding approach is shown below.
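
A minimal sketch of what the simple fix could look like in R, assuming a hypothetical config key `max.subjects` and a hypothetical helper `qmake_memory_flag()`; the key name, the 3 GB floor, and the X / 100 scaling rule are illustrative assumptions, not the actual tableinput.R implementation:

```r
## Build the qmake memory resource string from an optional config key.
## If the key is absent or invalid, fall back to the current 3 GB default.
qmake_memory_flag <- function(config) {
  n_subjects <- suppressWarnings(as.numeric(config[["max.subjects"]]))
  if (is.na(n_subjects) || n_subjects <= 0) {
    mem_gb <- 3                                   # current hard-coded default
  } else {
    mem_gb <- max(3, ceiling(n_subjects / 100))   # round up, keep 3 GB floor
  }
  sprintf("-l mt=%dG", mem_gb)
}

## Example: 450 subjects in the largest analysis/group
qmake_memory_flag(list(max.subjects = 450))  # "-l mt=5G"
```

Rounding N up with `ceiling()` sidesteps the float question, since qmake memory requests are typically specified as whole gigabytes.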