Shell scripts in parallel

 · 3 min · torgeir

The other day we tried to controllably DDoS a colleague's web server using a shell script. Like every time I dig into the quirks of bash, I learned something new.


Sequences of numbers can be created using seq in bash.

seq 2 4
2
3
4
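
seq also accepts a step when given three arguments (first, step, last), so a variant that counts in twos would look like this:

seq 1 2 7
1
3
5
7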

A trailing & sends a command to the background, and jobs lists what's currently running. The following puts three sleep commands in the background and counts the running jobs.

sleep 1 & sleep 1 & sleep 1 & jobs -r | wc -l | grep -Eow '[0-9]+'
3

In the words of ChatGPT:

The jobs -r command in bash lists only the jobs that are currently running in the background.

The grep flags narrow its output down to just the count:
  • -E: Uses extended regex syntax instead of basic regex syntax.
  • -o: Only prints the matched portion of the line, rather than the whole line.
  • -w: Only matches whole words. For example, if you use -w to search for “cat” in a file, it won’t match “concatenate”.
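
On some systems (macOS included) wc -l pads its output with spaces, which is what the grep is there to strip. A quick way to see the flags in action:

echo "   3   " | grep -Eow '[0-9]+'
3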

The following will launch 4 commands in the background (&), using a subshell (the parens), each echoing <number> done when it finishes. To make a shell script wait for background jobs to finish, use the wait command. The whole loop takes ~2s, as each command sleeps for 2 seconds and they all run in parallel.

for i in $(seq 1 4); do
  (sleep 2 && echo "$i done") &
  echo waiting
done;
wait # takes ~2s
waiting
waiting
waiting
waiting
4 done
2 done
1 done
3 done
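
wait can also take a PID; $! holds the PID of the most recently backgrounded command, so a rough sketch of waiting for one specific job (and not the others) could look like this:

sleep 2 & pid=$!
sleep 5 &
wait "$pid" # returns after ~2s, the 5s job keeps running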

The following will read each line received from stdin into the variable i and echo its number.

seq 4 | while read i; do echo line $i; done
line 1
line 2
line 3
line 4
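
One caveat with the piped version: every part of a pipeline runs in its own subshell, so variables set inside the loop don't survive it. A small illustration:

count=0
seq 4 | while read i; do count=$((count+1)); done
echo $count
0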

Another way of accomplishing this is process substitution. It lets the output of a command be read as if it came from a file, by wrapping the command in <( ), like this:

while read i; do echo line $i; done < <(seq 4)
line 1
line 2
line 3
line 4
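
Since the loop now runs in the current shell rather than a subshell, the same counter from before survives:

count=0
while read i; do count=$((count+1)); done < <(seq 4)
echo $count
4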

We can combine the above into a while loop that waits if there are max_bg_jobs or more background jobs running before launching another one. wait -n waits for a single background job to finish. (For wait -n to work on macOS Ventura you need a newer bash than the bundled one, e.g. brew install bash, which installs 5.2 as of May 2023.)
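
A minimal demonstration of wait -n on its own; it returns as soon as any one of the background jobs finishes:

sleep 3 & sleep 1 &
wait -n # returns after ~1s, when the shortest sleep is done
echo "one job finished"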

The following will keep at most 2 jobs going at once. A final wait ensures we wait for the last background job (or jobs, depending on how many you run and max_bg_jobs) to finish before the program exits.

#!/usr/bin/env bash
max_bg_jobs=2
while read l; do
    P="$(jobs -r | wc -l | grep -Eow '[0-9]+')"
    if [[ $P -ge $max_bg_jobs ]]; then
        wait -n # wait for single job
    fi
    (echo "$(date +'%H:%M:%S') starting $l" \
        && sleep 2 \
        && echo "$(date +'%H:%M:%S') done $l") &
done < <(seq 5)
wait # wait for remaining jobs
13:01:33 starting 1
13:01:33 starting 2
13:01:35 done 1
13:01:35 done 2
13:01:35 starting 3
13:01:35 starting 4
13:01:37 done 3
13:01:37 done 4
13:01:37 starting 5
13:01:39 done 5

Change the echo commands that are sent to the background to e.g. curl, and you have yourself machinery that can limit the number of concurrent requests to a number of your choice.
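
For instance, a rough sketch of the same loop throttling curl requests (urls.txt and the curl flags here are just placeholders for whatever you actually want to hit):

#!/usr/bin/env bash
max_bg_jobs=2
while read url; do
    if [[ "$(jobs -r | wc -l | grep -Eow '[0-9]+')" -ge $max_bg_jobs ]]; then
        wait -n # wait for a single request before starting another
    fi
    (curl -s -o /dev/null -w "%{http_code} $url\n" "$url") &
done < urls.txt
wait # wait for remaining requests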

Resources