How to download a list of URLs using more than one process (say, wget) at a time?
First, create a file with URLs – one URL per line. Let’s call the file url.txt. Then we need to spawn N wget processes, each downloading one URL at a time. Thanks to xargs this is trivial:

cat url.txt | xargs -n 1 -P 10 wget

-n 1 makes xargs run the command (wget) with only one argument at a time
-P 10 makes xargs keep up to 10 processes running in parallel
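To see how xargs splits the work without actually hitting the network, you can substitute echo for wget. This is a minimal sketch using a few hypothetical example.com URLs; each echo invocation receives exactly one URL (-n 1), with up to 4 running at once (-P 4):

```shell
# Build a sample URL list, one URL per line (hypothetical URLs)
printf '%s\n' \
  http://example.com/a \
  http://example.com/b \
  http://example.com/c > url.txt

# Each child process prints the single URL it was handed.
# With -P 4 the lines may appear in any order.
xargs -n 1 -P 4 echo < url.txt
```

Note that `< url.txt` feeds the file to xargs directly, so the `cat` in the pipeline above is optional.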