
How can I create a data file with one column in which there will be 1000 rows with zero values?

something like:

output:

0
0
0
0
0
.
.
.

5 Answers


You might use yes(1) for that (piped into head(1)...):

yes 0 | head -n 1000 > data_file_with_a_thousand_0s.txt

and if you need a million zeros, replace the 1000 with 1000000

PS. In the old days, head -1000 was enough, being equivalent to today's head -n 1000.
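A quick sanity check (my addition, not part of the answer) confirms the generated file has the expected shape:

```shell
# Generate the file, then verify it: 1000 lines, every one of them "0".
yes 0 | head -n 1000 > data_file_with_a_thousand_0s.txt

# Both checks succeed silently if the file is correct.
[ "$(wc -l < data_file_with_a_thousand_0s.txt)" -eq 1000 ] && echo "line count OK"
[ "$(sort -u data_file_with_a_thousand_0s.txt)" = "0" ] && echo "content OK"
```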

  • This definitely feels like "the Unix way" - glue some simple elements together into a pipe. It's worth noting though that the bare "-1000" argument is deprecated and the standard form (as listed on the man page you linked) is now "-n 1000". See e.g. unix.com/man-page/posix/1p/head and gnu.org/software/coreutils/manual/html_node/head-invocation for confirmation of this history. Commented Aug 5, 2016 at 15:17
  • Brilliant. Didn't know yes existed. Commented Aug 5, 2016 at 18:57

Simply,

printf '0\n%.0s' {1..1000}

or using for loop,

for i in {1..1000}; do echo "0"; done

using awk,

awk 'BEGIN{for(c=0;c<1000;c++) print "0"}'

As @StéphaneChazelas pointed out, using {1..1000} requires zsh or recent versions of bash, yash or ksh93, and also means storing the whole range in memory (possibly several times). You'll find it becomes a lot slower (if it doesn't crash from OOM) than using awk or yes 0 | head ... for large ranges like {1..10000000}. In other words, it doesn't scale well. A possible workaround would be to use

for ((i=0; i<=10000000;i++)); do echo 0; done 

(ksh93/zsh/bash) wouldn't have the memory issue but would still be orders of magnitude slower than a dedicated tool or real programming language approach.
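If the count needs to come from a variable rather than a literal, the awk form above takes it without any brace expansion at all (a minor variation on the answer's awk command; n and file.txt are illustrative names):

```shell
# Count passed in as a shell variable; awk never materializes a
# {1..n} range, so memory use stays flat regardless of n.
n=1000
awk -v n="$n" 'BEGIN{for(c=0;c<n;c++) print "0"}' > file.txt
```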

  • The printf command might fail when 1000 gets replaced with 1000000 because the shell would fail execve(2) with E2BIG Commented Aug 5, 2016 at 5:45
  • Using {1..1000} requires zsh or recent versions of bash, yash or ksh93 and also means storing the whole range in memory (possibly several times). You'll find it becomes a lot slower (if it doesn't crash for OOM) than using awk or yes|head for large ranges like {1..10000000}. Or in other words it doesn't scale well. Commented Aug 5, 2016 at 10:39
  • @BasileStarynkevitch, the shells that support {x..y} (zsh, ksh93, bash and yash) all have printf builtin, so the E2BIG doesn't apply. Commented Aug 5, 2016 at 10:41
  • @StéphaneChazelas I completely agree with you. Is there any other workaround in this case? Commented Aug 5, 2016 at 10:44
  • for ((i=0; i<=10000000;i++)); do echo 0; done (ksh93/zsh/bash) wouldn't have the memory issue but would still be orders of magnitude slower than a dedicated tool or real programming language approach. Commented Aug 5, 2016 at 10:58
perl -e 'print "0\n" x 1000' > file.txt


As @StéphaneChazelas notes, this is fast for large numbers but can run into memory issues (use the yes | head approach in that case).

Performance comparison (best of 3 consecutive runs):

$ time perl -e 'print "0\n" x 100000000' > /dev/null
real    0m0.117s

$ time python -c 'import sys; sys.stdout.write("0\n" * 100000000)' > /dev/null
real    0m0.184s

$ time yes 0 | head -n 100000000 > /dev/null
real    0m0.979s

$ time awk 'BEGIN{for(c=0;c<100000000;c++) print "0"}' > /dev/null
real    0m12.933s

$ time seq 0 0 0 | head -n 100000000 > /dev/null
real    0m19.040s
  • I find it's by far the fastest for large numbers of rows (even faster than its python -c 'import sys; sys.stdout.write("0\n" * 1000)' equivalent or the yes|head approach). Commented Aug 5, 2016 at 11:02
  • However, it means storing the whole output in memory before printing it, so it also has scalability issues. Commented Aug 5, 2016 at 11:07

python2 -c 'print "0\n" * 1000' > file.txt
  • That will print an extra blank line; you would have to use a trailing comma (,) at the end of the command to prevent this. Commented Aug 5, 2016 at 9:56
  • This only works in the old Python 2; in Python 3, print is no longer a statement. Commented Aug 5, 2016 at 10:17
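Given both comments above, a Python 3 equivalent that sidesteps the extra blank line and the print-statement issue might look like this (file.txt is an illustrative name):

```shell
# sys.stdout.write adds nothing beyond the string itself, so the file
# contains exactly 1000 "0" lines and no trailing blank line.
python3 -c 'import sys; sys.stdout.write("0\n" * 1000)' > file.txt
```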

seq could be used:

seq 0 0 0 | head -n 1000
