
I have multiple (over 100) wget commands similar to this:

wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2009_02//00031327004/auxil/
wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2009_01//00031327001/uvot/
wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2010_12//00031856009/uvot/
wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2008_01//00031043003/uvot/
wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2012_01//00032237004/uvot/

I have been told that this can be done quickly and easily with a bash script. Could someone give me an example for running multiple wget commands? What do I need to include within the script? Apologies for my n00b questions, but...I am one!

  • How similar? What's different between each call? Just the URL? Commented Dec 3, 2015 at 20:50
  • I'll update the answer with a few lines. Commented Dec 3, 2015 at 20:51
  • Do you have all these URLs in a text file? Commented Dec 3, 2015 at 20:57
  • Yeah, all in a normal .txt file Commented Dec 3, 2015 at 20:58
  • bash your-file-here.txt, if all you want is to run it as a script. Commented Dec 3, 2015 at 21:31

2 Answers


Assuming your URL list looks like this:

http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2009_02//00031327004/auxil/
http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2009_01//00031327001/uvot/
http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2010_12//00031856009/uvot/
http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2008_01//00031043003/uvot/
http://heasarc.gsfc.nasa.gov/FTP/swift/data/obs/2012_01//00032237004/uvot/

Just read your URLs from the text file using a while loop:

#!/bin/bash
# Read one URL per line from urls.txt and run the same wget command on each.
while IFS= read -r url; do
    wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks "$url"
done < urls.txt
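
If the only thing that changes between the commands is the URL, wget can also read the list itself via its -i option, which skips the loop entirely (assuming, as above, that urls.txt holds one URL per line):

# -i reads the URLs from the file; all other options stay the same.
wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks -i urls.txt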

Assuming your noob question is also about speeding up the overall run time: you can spawn multiple wget processes in the background with & and then wait for them all to finish:

wget firsturl &
wget secondurl &
...
wait
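
Putting both answers together, here is a minimal sketch that reads urls.txt (one URL per line, as in the other answer) and downloads in parallel. Note this is only a sketch: with 100+ URLs it starts every wget at once, so you may want to batch them or cap the parallelism (for example with xargs -P).

#!/bin/bash
# Start one background wget per URL in urls.txt, then wait for all of them.
while IFS= read -r url; do
    wget -q -nH --cut-dirs=5 -r -l0 -c -N -np -R 'index*' -erobots=off --retr-symlinks "$url" &
done < urls.txt
wait   # returns once every background download has finished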

4 Comments

  • I'd really prefer a bash script.
  • It's a bash script; just replace firsturl with whatever you are using.
  • And this can be placed in a text file to be run as a script?
  • Yes, just replace ... with all your URLs and end each line with &.
