I've got a script where the functionality I need has outgrown bash (or at least outgrown my desire to maintain it in bash) so I'm looking at how to do this in python instead.
I've looked through the multiprocessing documentation as best I can understand it, and I'm still not entirely sure which combination of map-ping, starmap-ping, async-ing, and other python magic I need to use. The starting point for this block would be represented in bash like this:
```bash
command1 argument1a arg1b > output1 < /dev/null &
command2 argument2a arg2b > output2 < /dev/null &
command3 argument3a arg3b > output3 < /dev/null &
wait
do_something_else
```
Building out the command and argument lists is easy enough; where I'm stuck is on the job-control equivalents and getting the script to wait in the right places.
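For concreteness, here's roughly what I mean by the command list; the command names and output paths are just placeholders taken from the bash above:

```python
# Hypothetical job list: each entry pairs an argv list with its output file.
jobs = [
    (["command1", "argument1a", "arg1b"], "output1"),
    (["command2", "argument2a", "arg2b"], "output2"),
    (["command3", "argument3a", "arg3b"], "output3"),
]
```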
Some relevant things I'm trying to accomplish:
- each `command` may be different or another instance of the same command, but each is associated with a unique `output` file so I don't have to worry about locking/overwriting.
- none of the processes need to take any input.
- `stderr` should be printed to the terminal, although if for some reason it's way easier to put it into a separate file, that would be fine as well.
- I want all of the processes to start more or less simultaneously (the precise order/timing is not important here).
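Based on my reading so far, I suspect plain `subprocess.Popen` maps onto the bash job control more directly than anything in `multiprocessing`, but I'm not sure. Here's a minimal sketch of what I'm picturing, reusing the placeholder `jobs` list from above; am I on the right track, or is there a more idiomatic way?

```python
import subprocess

# `jobs` is the (argv, output file) list sketched above.
procs = []
for argv, outfile in jobs:
    # Equivalent of `command args > output < /dev/null &`: stdout goes
    # to the per-job file, stdin comes from /dev/null, and stderr is
    # left alone so it still prints to the terminal.
    out = open(outfile, "w")
    procs.append((subprocess.Popen(argv, stdout=out, stdin=subprocess.DEVNULL), out))

# Equivalent of `wait`: block until every child has exited.
for proc, out in procs:
    proc.wait()
    out.close()

# do_something_else happens here, after all the children have finished.
```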