Ensure that we don't skip any commands #2
slycordinator wants to merge 5 commits into Akianonymus:master from
Conversation
…sible by the chosen maximum
To make it clear: it misses 3 of the 10 tasks in the first command, because the eval call only happens when [[ ${job} -eq "${NO_OF_JOBS}" ]]. In the first one, the commands run when $job == 7, $job then gets reset to 0, and the remaining 3 commands get added to the cmds string but are never run, since the counter won't reach 7 again. Same idea for the second command: there are 10 tasks, yet it will only try to run them when $job == 100, so none of them are run because the index never reaches 100. With the modified version, all tasks complete.
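For reference, a minimal sketch of the pattern being described; this is not the actual parallel-bash source, and the echo tasks and NO_OF_JOBS=7 are stand-ins for the real input:

```bash
#!/usr/bin/env bash
# Minimal sketch of the skip described above (not the actual parallel-bash code).
NO_OF_JOBS=7
cmds="" job=0

for task in task{1..10}; do
    cmds+="echo ${task} & "          # queue each command into one string
    ((job += 1))
    if [[ ${job} -eq ${NO_OF_JOBS} ]]; then
        eval "${cmds} wait"          # runs task1..task7
        cmds="" job=0                # counter resets, so 7 is never reached again
    fi
done
# The loop ends with job == 3, so task8..task10 sit in ${cmds} and are never run.
```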
Refactor the job execution into its own function. This removes code duplication, since the code is now called in two sections of the script.
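Roughly, the refactor looks something like the sketch below; the helper name _run_queued_cmds is illustrative, not the name used in the script:

```bash
#!/usr/bin/env bash
# Illustrative sketch only; the helper name is hypothetical, not from the repo.
NO_OF_JOBS=10
cmds="" job=0

# Run whatever is queued in ${cmds}, then reset the queue and counter.
_run_queued_cmds() {
    [[ -n ${cmds} ]] || return 0
    eval "${cmds} wait"
    cmds="" job=0
}

for cmd in "echo a" "echo b" "echo c"; do
    cmds+="${cmd} & "
    ((job += 1))
    # Batch is full: flush it.
    [[ ${job} -eq ${NO_OF_JOBS} ]] && _run_queued_cmds
done

# Leftovers after the loop go through the same helper, so nothing is duplicated.
_run_queued_cmds
```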
Nice. Can you fix the formatting? Thanks
This reverts commit 8519188.
Refactor parallel-bash.bash to place process/task execution into a function. Because the code is now called twice, this removes code duplication.
Done. I was in the middle of fixing that when you commented.
fixes #1
In _execute::_process_arguments::parallel-bash()
We append each command to the cmds string, then stop and evaluate it (and run the commands) only once the job index reaches $NO_OF_JOBS.
But when the total number of requested commands is not evenly divisible by $NO_OF_JOBS, some of them remain unrun after looping through all of the commands, because the job index never reaches $NO_OF_JOBS for the final batch. This happens when, for example, there are 13 commands requested and $NO_OF_JOBS == 10, resulting in 3 commands that never get evaluated.
In that case, after iterating through the input arrays, $cmds still contains the commands that have yet to be processed.
So, this change has us run eval on $cmds in that case, so long as it's not empty.
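A rough sketch of the resulting flow under those assumptions; the dummy commands array and the exact loop shape stand in for what parallel-bash.bash actually does:

```bash
#!/usr/bin/env bash
# Sketch of the fixed flow with placeholder input; variable names follow the PR text.
NO_OF_JOBS=10
cmds="" job=0
commands=("echo "{1..13})        # 13 dummy commands, not a multiple of NO_OF_JOBS

for cmd in "${commands[@]}"; do
    cmds+="${cmd} & "
    ((job += 1))
    if [[ ${job} -eq ${NO_OF_JOBS} ]]; then
        eval "${cmds} wait"      # runs commands 1-10
        cmds="" job=0
    fi
done

# The fix: commands 11-13 are still queued in ${cmds}, so run them as well.
[[ -n ${cmds} ]] && eval "${cmds} wait"
```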