
I am trying to execute a curl command with 300 requests at the same time, using an array. I do not know how to bring the contents of my file into the array. The code I wrote is below.

array=();
for i in {1..300}; do
  array+=( file.txt ) ; 
done; 
curl "${array[@]}";

The file.txt includes the following content:

--next 'https://d16.server.com/easy/api/OmsOrder' -H 'Connection: keep-alive' -H 'Pragma: no-cache' -H 'Cache-Control: no-cache' -H 'Accept: application/json, text/plain, */*' -H 'Sec-Fetch-Dest: empty' -H 'User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.132 Safari/537.36' -H 'Content-Type: application/json' -H 'Origin: https://d.server.com' -H 'Sec-Fetch-Site: same-site' -H 'Sec-Fetch-Mode: cors' -H 'Referer: https://d.server.com/' -H 'Accept-Language: en-US,en;q=0.9,fa;q=0.8' --data-binary '{"isin":"IRO3TPEZ0001","financeId":1,"quantity":50000,"price":5400}' --compressed
  • What's in "file.txt"? URLs? Do you want to pass the first 300 lines in the file to curl and ignore the rest? Commented Mar 17, 2020 at 19:13
  • @thatotherguy file.txt includes some JSON stuff and it is too long. I would like to post it to the server 300 times. When I bring the content of the file here, I get the error "-bash: /usr/bin/curl: Argument list too long". Commented Mar 17, 2020 at 19:19
  • There is no server in your example, so this can't work. You should figure out a complete command for posting to a server once before trying to put anything in a loop or in an array. Commented Mar 17, 2020 at 19:22
  • Your script does this: curl file.txt file.txt file.txt ..., with file.txt repeated 300 times. Commented Mar 17, 2020 at 19:22
  • @thatotherguy I have put the stuff in file.txt. The file includes the server, the headers, and the data. Commented Mar 17, 2020 at 19:31

3 Answers

array=()
for i in {1..300}; do
  # take line i of file.txt and word-split it into curl arguments
  array+=( $(head -n "$i" file.txt | tail -n 1) )
done
curl "${array[@]}"

1 Comment

Could be improved a bit with sed -n "${i}p". Also, surely there's a way that isn't O(n^2)? Perhaps stackoverflow.com/questions/11393817/… ?
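
For reference, a linear-time variant along the lines this comment suggests might look like the sketch below (assuming bash 4+ for mapfile and that file.txt has at least 300 lines; like the answer above, it deliberately word-splits each line):

mapfile -t -n 300 lines < file.txt   # read the first 300 lines in a single pass
array=()
for line in "${lines[@]}"; do
  array+=( $line )                   # unquoted on purpose: split the line into words
done
curl "${array[@]}"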

You have a file with shell-formatted words that you are trying to repeat over and over in a command.

Since the words are shell-formatted, you'll need to interpret them using e.g. eval:

contents=$(< file.txt)          # read the whole file into a string
eval "words=( $contents )"      # let the shell parse the quoted words into an array
arguments=()
for i in {1..300}
do
  arguments+=( "${words[@]}" )  # repeat the full word list 300 times
done
curl "${arguments[@]}"

A more robust design would be to not use shell quoting and instead format one argument per line:

--next
https://d16.server.com/easy/api/OmsOrder
-H
Connection: keep-alive
-H
Pragma: no-cache

You can then use the above code and replace the eval line with:

mapfile -t words < file.txt
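
Put together, that is the same code as above with the eval line swapped out (a sketch, assuming bash 4+ for mapfile and the one-argument-per-line format shown):

mapfile -t words < file.txt     # one array element per line, no shell quoting needed
arguments=()
for i in {1..300}; do
  arguments+=( "${words[@]}" )
done
curl "${arguments[@]}"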

8 Comments

This works for 100, but for 300 I again get the error "-bash: /usr/bin/curl: Argument list too long".
@Andrea What is the output of uname -sr and ulimit -s? And how big is your file.txt in total?
The file.txt is 2 kB.
What is the output of uname -sr and ulimit -s?
I don't understand what you mean by uname -sr and ulimit -s.
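
For reference, those diagnostics are just commands you run in the terminal (a sketch; on Linux, roughly a quarter of the stack limit is available for arguments plus environment, which is why ulimit -s matters here):

uname -sr        # kernel name and release
ulimit -s        # stack size limit in kB
getconf ARG_MAX  # the exec() argument-space limit in bytes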

The answer to this question should have been "put each request into a file, one option per line, and use -K/--config to include the file into the command line." That certainly should allow for 300 requests in a single curl command without exceeding the limit on the size of a shell command. (By "request" here, I mean "a URL with associated options". If you only want to use 300 URLs without modifying any other option, you can easily do that by just listing the URLs, on the command line if they aren't too long or otherwise in a file.)
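
For illustration, the intended invocation would look something like this (a sketch; req1.txt, req2.txt, ... are hypothetical per-request config files with one option per line, and urls.txt a hypothetical file of url = "..." lines):

# one config file per request (this is the combination that hits the bug below)
curl -K req1.txt -K req2.txt -K req3.txt

# URL-only case: list the URLs directly, or put them in a config file
curl 'https://example.com/a' 'https://example.com/b'
curl -K urls.txt   # urls.txt: one 'url = "..."' line per request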

Unfortunately, it doesn't work. I believe that it is supposed to work, and the fact that it doesn't is a bug. If you specify multiple -K options and each of them refers to a file which includes one request and the --next option, then curl will execute only the first and last file. If you instead put the --next options on the command-line in between the -K options, all the request options will be merged, and in addition curl will complain about a missing URL.

However, you can use the -K option by concatenating all 300 requests and passing them through stdin, using -K - to read from stdin. To test that, I created the file containing a single request:

$ cat post-req
--next
-H "Connection: keep-alive"
-H "Pragma: no-cache" 
-H "Cache-Control: no-cache" 
-H "Accept: application/json, text/plain, */*" 
-H "Sec-Fetch-Dest: empty" 
-H "User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.132 Safari/537.36" 
-H "Content-Type: application/json" 
-H "Origin: https://d.server.com" 
-H "Sec-Fetch-Site: same-site" 
-H "Sec-Fetch-Mode: cors" 
-H "Referer: https://d.server.com/" 
-H "Accept-Language: en-US,en;q=0.9,fa;q=0.8" 
--data-binary "{\"isin\":\"IRO3TPEZ0001\",\"financeId\":1,\"quantity\":50000,\"price\":5400}" 
--compressed
--url "http://localhost/foo"

and then set up a little webserver that just returns the requested path, and invoked curl with:

for i in $(seq 300); do cat post-req; done | curl -K -

Indeed, all three hundred requests are passed through.
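
If you'd rather not pipe through stdin, the same idea works with an intermediate file (a sketch; all-reqs.txt is a hypothetical name):

for i in $(seq 300); do cat post-req; done > all-reqs.txt
curl -K all-reqs.txt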

For what it's worth, I reported the bug as https://github.com/curl/curl/issues/5120; many thanks to Daniel Stenberg for being incredibly responsive and committing a fix in less than two days. So the issue will probably be resolved in the next curl release.

6 Comments

Thanks for the answer, but when I run for i in $(seq 300); do cat post-req; done | curl -K - in the terminal, I get errors like Warning: <stdin>:1: warning: '$' is unknown and Warning: <stdin>:3: warning: ''https' is unknown. I would be grateful if you could let me know why. And one more thing: how can I speed up these requests and make more of them?
@andrea: the line $ cat post-req is me typing a bash command which shows you the file. You're not supposed to put it into the file. The error about https is because you didn't follow the model in my file; in a config file the --url option is obligatory. You can't leave it out like you can on the command line. See curl's help for more details.
I have done it correctly and it works very well. Just wondering how I can speed this up and post 1000 in 1 second. Is there any way to do that? At the moment my computer sends 300 requests in 9 seconds.
For your other question, you could take a look at the --parallel option if your version of curl is recent enough. (You need 7.66.0, I believe.) Please try to avoid violating the terms of use of the webserver if it is not yours.
Just tested this in the terminal, but it only sends 175 requests and gets their responses. It is supposed to post 300 requests and get 300 responses; what is going wrong?
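
For the speed question in the comments above, the --parallel suggestion would look something like this (a sketch, assuming curl 7.66.0 or newer; --parallel-max caps the number of concurrent transfers, which defaults to 50):

for i in $(seq 300); do cat post-req; done | curl --parallel --parallel-max 100 -K -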
