I'm writing a bash script which has many parts, and I want built-in resiliency and as much parallel processing as possible. An example is below:

    while true
    do
        var1=$(curl someurl) || continue
        var2=$(curl someurl2) || continue
        var3=$(curl someurl3) || continue
        dosomething "$var1" || continue
        dosomething2 "$var2" || continue
        dosomething3 "$var3" || continue
        break
    done

I'd like to run the three curl commands in parallel, and then the three "do something" commands in parallel as well. However, I'm not sure how this would be handled. I considered inserting the wait command after each block and appending & to each command, as follows:

    while true
    do
        var1=$(curl someurl) & || continue
        var2=$(curl someurl2) & || continue
        var3=$(curl someurl3) & || continue
        wait
        dosomething "$var1" & || continue
        dosomething2 "$var2" & || continue
        dosomething3 "$var3" & || continue
        wait
        break
    done

However, in this situation I'm not sure how the exit statuses and/or race conditions could be managed. Additionally, this doesn't seem like best coding practice. Is there a good way to do this?

  • I think parallelism, as basically a performance concern, is not best handled in bash; if performance becomes an issue, it's usually best to move to a more sophisticated language. Commented Oct 22, 2015 at 20:12
  • Depending on your requirements you could restructure it as (var1=...; dosomething $var1) &; ditto var2 & var3; wait; wait; wait. Commented Oct 23, 2015 at 2:16
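
For reference, a minimal sketch of the restructuring the second comment suggests, assuming each dosomethingN needs only its own curl result (the pidN names and the ok flag are illustrative additions, not from the comment). Each capture happens inside a backgrounded subshell, so the parent recovers the exit statuses by waiting on the recorded PIDs:

    while true
    do
        ( var1=$(curl someurl1) && dosomething1 "$var1" ) & pid1=$!   # fetch+process as one unit
        ( var2=$(curl someurl2) && dosomething2 "$var2" ) & pid2=$!
        ( var3=$(curl someurl3) && dosomething3 "$var3" ) & pid3=$!
        ok=true                      # assume success until a wait reports failure
        wait "$pid1" || ok=false
        wait "$pid2" || ok=false
        wait "$pid3" || ok=false
        "$ok" && break               # any failure retries the whole loop
    done

Note this parallelizes each fetch-and-process pair as one unit, rather than the two-phase fetch/process barrier in the question, which is why the comment says "depending on your requirements".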

1 Answer

Bash doesn't let you capture the output of something run in the background.

    $ unset var; var=$(sh -c 'sleep 5; echo foo; exit 42')
    $ echo $? $var
    42 foo
    $ unset var; var=$(sh -c 'sleep 5; echo foo; exit 42') & wait
    [1] 30250
    [1]+  Exit 42                 var=$(sh -c 'sleep 5; echo foo; exit 42')
    $ echo $? $var
    0
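
A common workaround, sketched below with the same toy command (the mktemp scratch file and the $! capture are my additions, not part of this answer), is to redirect the background job's output to a file and wait on its PID, which does preserve the exit status:

    tmp=$(mktemp)                                 # scratch file for the captured output
    sh -c 'sleep 5; echo foo; exit 42' > "$tmp" &
    pid=$!                                        # PID of the background job
    wait "$pid"                                   # returns the job's exit status (42 here)
    echo "status=$? output=$(< "$tmp")"           # prints: status=42 output=foo
    rm -f "$tmp"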

There is wait -n (added in bash 4.3), which returns when some background process has ended, and it exits with that process's exit status. Since you know how many background processes you have, you can wait -n that many times. Ugly, but...

    while true
    do
        curl someurl1 > file1 &
        curl someurl2 > file2 &
        curl someurl3 > file3 &
        for i in 1 2 3; do
            wait -n || { echo some curl failed; continue 2; }
        done
        dosomething "$(< file1)" &
        dosomething "$(< file2)" &
        dosomething "$(< file3)" &
        for i in 1 2 3; do
            wait -n || { echo some dosomething failed; continue 2; }
        done
        break
    done
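
If you need to know which job failed rather than just that some job failed, one variant (my sketch, not part of the original answer) records each PID and waits on them individually; this fragment would replace the first curl block inside the loop above:

    pids=()                                # one PID per background curl
    curl someurl1 > file1 & pids+=("$!")
    curl someurl2 > file2 & pids+=("$!")
    curl someurl3 > file3 & pids+=("$!")
    for pid in "${pids[@]}"; do
        wait "$pid" || { echo "curl with pid $pid failed"; continue 2; }
    done

As with wait -n, a failure here leaves the remaining jobs running when the outer loop restarts, so you may want to kill them before continuing.
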
  • Is there perhaps a better command or built-in function than wait that could be used? This seems ugly, as well as poor programming practice in general. That, or maybe there's a completely different code structure that could be used? Commented Oct 22, 2015 at 21:44
