I have a program which takes URLs as command-line arguments and outputs a PDF file. For example, I use command substitution to provide the input URLs from a file urlsfile:
wkhtmltopdf $(cat urlsfile) my.pdf
In urlsfile, each line is a URL.
Now I would like to group the lines of urlsfile into batches of 15 and feed one batch of URLs to the program at a time.
How can I do that in bash?
Note that it is acceptable to create one PDF file per 15 URLs, and then I will merge the PDF files into one. If the merge can be done by the program itself, that is even better.
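To make it concrete, this is the kind of batching I have in mind, written out by hand with split (the chunk_ and part_N.pdf names are just placeholders I made up, and the merge would still need a separate tool such as pdfunite):

split -l 15 urlsfile chunk_                  # one chunk_* file per 15 URLs
i=1
for f in chunk_*; do
    wkhtmltopdf $(cat "$f") "part_$i.pdf"    # one PDF per chunk
    i=$((i + 1))
done

But this leaves temporary chunk files around, so I suspect there is a cleaner way.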
Thanks.
cat urlsfile | xargs wkhtmltopdf my.pdf doesn't work, because I only know that the program takes its inputs from command-line arguments, and I am not sure how to make it take inputs from stdin. wkhtmltopdf provides a --read-args-from-stdin option that may be useful.
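If --read-args-from-stdin really does treat each input line as the arguments for a separate conversion, then maybe something along these lines could work, where each generated line is 15 URLs followed by its own output name (the out_N.pdf names are placeholders, and I have not tested this):

awk '{ printf "%s ", $0 }
     NR % 15 == 0 { printf "out_%d.pdf\n", ++n }
     END { if (NR % 15 != 0) printf "out_%d.pdf\n", ++n }' urlsfile |
  wkhtmltopdf --read-args-from-stdin

Is something like this the right approach, or is there a more idiomatic way in bash?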