135

I have a long text file with a list of file masks I want to delete

Example:

/tmp/aaa.jpg
/var/www1/*
/var/www/qwerty.php

I need to delete them. I tried rm `cat 1.txt` but it says the argument list is too long.

I found this command, but when I check folders from the list, some of them still contain files: xargs rm < 1.txt. A manual rm call removes files from those folders, so it is not a permissions issue.

2
  • 19
Even though it's six years later: Would you mind accepting one of the answers? This'll mark the question as resolved and help other users as well. Commented May 14, 2017 at 11:06
  • He's all like, "nah" Commented Oct 2, 2024 at 22:22

12 Answers

158

This is not very efficient, but it will work if you need glob patterns (as in /var/www/*):

# Glob patterns and word splitting are applied to the unquoted $(cat 1.txt) by the shell itself
for f in $(cat 1.txt) ; do
  rm "$f"
done

If you don't have any patterns and are sure the paths in the file contain no whitespace or other weird characters, you can use xargs like so:

xargs rm < 1.txt
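
Before running the destructive version, a minimal sketch for double-checking (same assumption: no whitespace or globs in the paths) is to let xargs ask for confirmation:

xargs -p rm < 1.txt

The -p flag prints each rm invocation and waits for a y/n answer before running it.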

5 Comments

What will happen with the first solution (using CAT) if the path contains a space?
This didn't work for me using MINGW64 Bash with a text file containing paths with spaces.
@804b18f832fb419fb142 Yep, same for me. Any alternatives?
To deal with whitespace, I suggest using a while loop with the read builtin: filename='123.txt'; while read line ; do rm -r "$line" ; done < $filename. Pay attention to the quote marks around $line; it is a gotcha point in shell scripting.
Performance hint: our (average) web server can remove around 500 files per second with xargs rm
81

Assuming that the list of files is in the file 1.txt, then do:

xargs rm -r <1.txt

The -r option causes recursion into any directories named in 1.txt.

If any files are read-only, use the -f option to force the deletion:

xargs rm -rf <1.txt

Be cautious with input to any tool that does programmatic deletions. Make certain that the files named in the input file are really to be deleted. Be especially careful about seemingly simple typos. For example, if you enter a space between a file and its suffix, it will appear to be two separate file names:

file .txt

is actually two separate files: file and .txt.

This may seem not so dangerous, but if the typo is something like this:

myoldfiles *

Then instead of deleting all files that begin with myoldfiles, you'll end up deleting myoldfiles and all non-dot-files and directories in the current directory. Probably not what you wanted.
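
One way to catch such typos before anything is deleted (a rough sketch, not part of the original answer) is to flag suspicious entries in the list first:

grep -En '[[:space:]]|[*?[]' 1.txt

This prints, with line numbers, every entry that contains whitespace or a glob character, so those lines can be reviewed by hand before the file is fed to xargs rm -r.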

3 Comments

"myoldfiles /" or "/ tmp" even more disastrous with -rf. Note the space character next to the "/".
I think this is a dangerous answer, why use recurse if it's a list of files, coupled with forced deletion makes this a footgun
@CervEd Because sometimes a man needs to shoot himself in the foot to get the job done.
35

Use this:

while IFS= read -r file ; do rm -- "$file" ; done < delete.list

If you need glob expansion you can omit quoting $file:

IFS=""
while read -r file ; do rm -- $file ; done < delete.list

But be warned that file names can contain "problematic" content, so I would be very careful with the unquoted version. Imagine this pattern in the file:

*
*/*
*/*/*

This would delete quite a lot from the current directory! I would encourage you to prepare the delete list in a way that glob patterns aren't required anymore, and then use quoting like in my first example.
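
A minimal sketch of that preparation step, assuming the entries that contain globs have no whitespace of their own (delete.list and expanded.list are just example names):

# Expand glob patterns into concrete paths; entries that match nothing are skipped
while IFS= read -r pattern ; do
  for f in $pattern ; do
    [ -e "$f" ] && printf '%s\n' "$f"
  done
done < delete.list > expanded.list

# Review expanded.list, then delete with the whitespace-safe quoted loop
while IFS= read -r file ; do rm -- "$file" ; done < expanded.list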

1 Comment

This is currently the only answer that handles all of /My Dir With Spaces/ /Spaces and globs/*.txt, --file-with-leading-dashes, and as a bonus it's POSIX and works on GNU/Linux, macOS and BSD.
20

You can use '\n' to define the newline character as the delimiter:

xargs -d '\n' rm < 1.txt

Be careful with -rf because it can delete things you don't want if 1.txt contains paths with spaces. That's why the newline delimiter is a bit safer.

On BSD systems (including macOS), xargs has no -d option. There the -0 option expects NUL-delimited input rather than newlines, so convert the newlines to NULs first, for example:

tr '\n' '\0' < 1.txt | xargs -0 rm

7 Comments

On OS X this gets me xargs: illegal option -- d
Good point. It seems there is no other option than -0 on OS X.
xargs -I_ rm _ also works in OS X :) see stackoverflow.com/a/39335402/266309
you can also first run xargs -d '\n' ls < 1.txt or xargs -d '\n' stat < 1.txt to verify what will be deleted and avoid nasty surprises
@Ray using stat or find you get errors if the files don't exist, you don't have permission, etc., which I find helpful to verify that the paths are correct
19

xargs -I{} sh -c 'rm "{}"' < 1.txt should do what you want. Be careful with this command as one incorrect entry in that file could cause a lot of trouble.

This answer was edited after @tdavies pointed out that the original did not do shell expansion.
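
If filenames may contain single quotes (see the comments below), a variant that passes the name to the inner shell as data rather than pasting it into the command string avoids that problem, at the cost of losing glob expansion (a sketch, not part of the original answer):

xargs -I{} sh -c 'rm -- "$1"' _ {} < 1.txt

Here the filename arrives as $1, so quotes and spaces inside it are harmless.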

5 Comments

Thanks, but no need to delete folders, only files
The globs won't get expanded, as they are never 'seen' by the shell.
@tdavies wow - you're right. This command (albeit a bit uglier) will work: xargs -I{} sh -c 'rm {}' < 1.txt
doesn't work if filename has single quotes
Edit: Now it also supports spaces
17

You can use this one-liner:

cat 1.txt | xargs echo rm | sh

Which does shell expansion but executes rm the minimum number of times.
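
To see what will actually be executed, a simple audit step (same assumptions as the answer) is to drop the final | sh stage and read the generated command line first:

cat 1.txt | xargs echo rm

This prints something like rm /tmp/aaa.jpg /var/www1/* /var/www/qwerty.php without running it; the globs are only expanded once the line is passed to sh.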

3 Comments

As long as any expansion isn't too long?
True, a glob could produce an argument list which is too long -- you can add the -n <n> argument to xargs to reduce the number of arguments passed to each rm, but that will still not protect you from a single glob which exceeds the limit.
It would be great to elaborate on supposed expansion here. I'd think this would cause one rm invocation per line in the source file?
7

Just to provide another way, you can also simply use the following command:

$ cat to_remove
/tmp/file1
/tmp/file2
/tmp/file3
$ rm $( cat to_remove )

2 Comments

Works like a charm, also with rm -r. Deleted about 5TB of data this way. Can also be used with other commands, such as ls or du.
what about paths with spaces?
4

Run cat 1.txt | xargs rm -f to delete the listed files only.

Run cat 1.txt | xargs rm -rf to also remove directories recursively.

Comments

3

In this particular case, due to the dangers cited in other answers, I would

  1. Edit in e.g. Vim and :%s/\s/\\\0/g, escaping all space characters with a backslash.

  2. Then :%s/^/rm -rf /, prepending the command. With -r you don't have to worry to have directories listed after the files contained therein, and with -f it won't complain due to missing files or duplicate entries.

  3. Run all the commands: $ source 1.txt
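
After those two substitutions, each line of 1.txt is a complete command. For the paths from the question, plus a hypothetical path containing a space, the file would look roughly like:

rm -rf /tmp/aaa.jpg
rm -rf /var/www1/*
rm -rf /var/www/qwerty.php
rm -rf /backup/old\ dir/cache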

1 Comment

Spaces are not the only characters which need escaping, popular in filenames are also brackets of any kind which can lead to unwanted expansion.
0

Here's another looping example. This one also contains an if-statement that checks whether the entry is a file (a directory variant is sketched below the code):

for f in $(cat 1.txt); do if [ -f "$f" ]; then rm "$f"; fi; done
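
A directory variant of the same check (a sketch with -f swapped for -d and rm for rm -r):

for f in $(cat 1.txt); do if [ -d "$f" ]; then rm -r "$f"; fi; done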

Comments

-1

Here you can use the set of folders from deletelist.txt while excluding some patterns as well (csh syntax):

foreach f (`cat deletelist.txt`)
    rm -rf `ls | egrep -v "needthisfile|*.cpp|*.h"`
end

Comments

-1

In case somebody prefers sed and removing without wildcard expansion:

sed -e "s/^\(.*\)$/rm -f -- \'\1\'/" deletelist.txt | /bin/sh

Reminder: use absolute pathnames in the file or make sure you are in the right directory.

And for completeness the same with awk:

awk '{printf "rm -f -- '\''%s'\''\n",$1}' deletelist.txt | /bin/sh

Wildcard expansion will work if the single quotes are removed, but this is dangerous in case a filename contains spaces; you would then need to add quotes around everything except the wildcards.
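
A variant of the sed approach that keeps wildcard expansion but backslash-escapes spaces instead of quoting (a sketch; it does not handle other special characters such as quotes or brackets):

sed -e 's/ /\\ /g' -e 's/^/rm -f -- /' deletelist.txt | /bin/sh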

Comments
