
I have a bash script with a function that needs to run in parallel with different arguments. I need to know if at least one of the executions failed (returned non-zero) - doesn't matter how many failed.

The command accepts one parameter per run, taken from an array. I need to limit concurrency to 4 simultaneous runs due to high load, and I also need the logs to be printed in the parent process (the one that runs the bash script).

This is the function I'm running:

function run_and_retry {
  EXIT_STATUS=0
  $COMMAND || EXIT_STATUS=$?

  if [ $EXIT_STATUS -ne 0 ]; then
    EXIT_STATUS=0
    $COMMAND || EXIT_STATUS=$?

  fi

  return $EXIT_STATUS
}

I've tried using GNU parallel and xargs and encountered issues with both.

With xargs (I couldn't get the exit status out of it, and it also didn't work when I ran it in TravisCI):

PARAMETERS=(first-parameter second-parameter third-parameter)
export -f run_and_retry
echo "${PARAMETERS[@]}" | xargs -P 4 -n 1 -I {} bash -c "run_and_retry {}"

With GNU parallel:

PARAMETERS=(first-parameter second-parameter third-parameter)
export -f run_and_retry
parallel -j 4 -k --lb 2 run_and_retry {} ::: echo "${PARAMETERS[@]}" 
  • The -j parallel option needs an argument, ex. parallel -j 4. What is COMMAND and how is it assigned? Commented Jan 3, 2019 at 9:42
  • thanks, fixed that Commented Jan 3, 2019 at 9:58
  • The run_and_retry function seems like a really roundabout way of saying $COMMAND || $COMMAND || return Commented Jan 3, 2019 at 10:37
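As the last comment hints, the retry logic boils down to "run, and on failure run once more". A minimal sketch of that one-liner form, using "$@" in place of the expanded $COMMAND and true/false as stand-in commands:

```shell
# Run the given command; if it fails, retry it once and
# return the status of the second attempt.
run_and_retry() {
    "$@" || "$@"
}

run_and_retry true  && echo "succeeded"      # succeeds on the first try
run_and_retry false || echo "failed twice"   # non-zero after both attempts
```

Because || short-circuits, the second invocation only happens when the first one fails, which is exactly what the original if-block does.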

2 Answers


You are so close to getting the syntax of GNU Parallel correct:

COMMAND=echo
PARAMETERS=(first-parameter second-parameter third-parameter)
parallel -j 4 -k --retries 2 "$COMMAND" {} ::: "${PARAMETERS[@]}" ||
  echo "$? commands failed (more than 99 failed if the status is 100)"

Or if you really insist on doing the retrying yourself:

PARAMETERS=(first-parameter second-parameter third-parameter)
export -f run_and_retry
parallel -j 4 -k run_and_retry {} ::: "${PARAMETERS[@]}" ||
  echo One or more commands failed

2 Comments

I am encountering an issue with the exit status - the script is failing because the command failed on the first try, but it passed on the second try. Do you know how to make GNU parallel fail only if a command failed twice? What I did is this: parallel -j 4 -k --retries 2 "$COMMAND{}" ::: "${PARAMETERS[@]}" || FAIL=1
@AviaEyal GNU Parallel tries twice. If the command fails both times it prints the failing output and returns with an error. Run with -u to see output from both failing attempts.

I need to know if at least one of the executions failed (returned non-zero)

From the POSIX xargs specification:

EXIT STATUS

1-125
A command line meeting the specified requirements could not be assembled, one or more of the invocations of utility returned a non-zero exit status, or some other error occurred.

The GNU xargs man page words it a bit differently:

EXIT STATUS

123 if any invocation of the command exited with status 1-125
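That behavior is easy to check directly; a quick sketch using false as a stand-in for an always-failing command (GNU xargs exits 123 here, and POSIX only guarantees a value in the 1-125 range):

```shell
# false always fails, so at least one invocation returns non-zero
# and xargs itself exits non-zero (123 with GNU xargs).
printf '%s\n' a b c | xargs -n1 false
echo "xargs exit status: $?"

# With a command that always succeeds, xargs exits 0.
printf '%s\n' a b c | xargs -n1 true
echo "xargs exit status: $?"
```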

But I would check the return status of the command inside the function and return a predefined value (e.g. 1) to signal failure.

parameters=(1 2 3 fail)

func() { 
    COMMAND=sleep
    # I guess OP intends to try running COMMAND twice
    if ! "$COMMAND" 0."$1" && ! "$COMMAND" 0."$1"; then
        return 1
    fi
}

export -f func
if printf "%s\0" "${parameters[@]}" | xargs -0 -P4 -n1 -t -- bash -c 'func "$1"' -- ; then
   echo "Success!"
else
   echo "Error!"
fi


Well, we can even count the number of children manually, and it gets pretty simple with wait -n. From stackoverflow - WAIT for “1 of many process” to finish:

bash 4.3 added a -n flag to the built-in wait command, which causes the script to wait for the next child to complete.

So we can just:

cnt=0
failed=false
for i in "${parameters[@]}"; do
    ( func "$i" ) &
    if (( cnt < 4 )); then
        cnt=$((cnt+1))
    else
        # more than 4 processes forked: wait for one to finish first
        if ! wait -n; then
           failed=true
        fi
    fi
done
# wait for the processes still running after all have been forked
for i in $(seq $cnt); do
    if ! wait -n; then
        failed=true
    fi
done

if "$failed"; then
    echo "One of the jobs failed!"
fi
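The failure propagation above can be stripped down to its core: start one failing and one succeeding background job, then collect both statuses with wait -n (bash 4.3 or later assumed):

```shell
# Two background jobs: one fails, one succeeds.
failed=false
( exit 1 ) &
( exit 0 ) &

# Collect both exit statuses as the children finish.
for _ in 1 2; do
    wait -n || failed=true
done

echo "failed=$failed"   # prints failed=true
```

Because wait -n returns the exit status of whichever child finished, a single failing job is enough to flip the flag, no matter how many others succeed.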

3 Comments

Thanks for your answer! The command I'm running is "./node_modules/protractor/bin/protractor ./protractor.conf.js --specs=test/e2e/login/login.spec.js" and I'm getting an error of No such file or directory. If I copy-paste the command into the shell it works, just not inside the script. Any idea why?
No such file or directory shouldn't be the full error; the shell should tell you which file or directory it can't find. You may want to read up on quoting. Probably you want to use bash arrays: COMMAND=(echo 1 2 3); "${COMMAND[@]}" "$additional_arg"
OK cool, when I removed the quotes the command ran. But the function is still failing even though the command itself passes. I still have this error: environment: line 2: Report: command not found and I don't know why, and I don't see the logs for the command itself.
