
I have a small workflow driven by a single run.sh script that automates a few other scripts.

In run.sh I have a function that prints INFO and ERROR messages and saves them to a log.txt file, but the scripts it calls also generate their own output. I would like to save that output in another log file and still see it in the terminal while the pipeline runs.
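
For context, a minimal sketch of what such a logging function might look like; the log_msg name, the log.txt path, and the step1.sh call are illustrative assumptions, not taken from the real script:

    #!/usr/bin/env bash
    # Hypothetical logging helper; names and paths are illustrative only.
    LOGFILE="log.txt"

    log_msg() {
        # $1 = level (INFO or ERROR), $2 = message
        local line="[$(date '+%Y-%m-%d %H:%M:%S')] $1: $2"
        echo "$line"                 # print to the terminal
        echo "$line" >> "$LOGFILE"   # append to log.txt
    }

    log_msg INFO "Starting pipeline"
    ./step1.sh    # this script's own output is what I also want to capture
    log_msg ERROR "Something went wrong"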

I run the pipeline with this command:

run.sh -f1 a_file -f2 other_file -d a/directory

I have seen that I can do this as explained in this link.

But as far as I know, this will not show me the output in the terminal.

How can I get the output in the terminal and also save it to a file? I am working on a computing cluster, and the terminal output is lost if I lose the connection or log off from my PC.


1 Answer


You know about tee? Something like...

run.sh -f1 a_file -f2 other_file -d a/directory | tee output.txt

...would run your script and show its standard output, while at the same time storing it in output.txt.
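
If the scripts also write to standard error and you want that captured too (an assumption about your setup, not something the question states), you can merge stderr into stdout before the pipe; tee -a appends instead of overwriting:

    # Merge stderr into stdout so tee sees both streams; -a appends to the file
    run.sh -f1 a_file -f2 other_file -d a/directory 2>&1 | tee -a output.txt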


3 Comments

If the directory where I want to save the output.txt file is not created until the run starts, what can I do?
Save to a temporary file, or manually create the directory before the run. There is no way to create a directory simply by redirecting.
I'd just store it somewhere else: it's a log of what happens, so make a folder for those logs and name the file with the date and time instead of output.txt (e.g. ... | tee $(date +"%Y-%m-%d_%H:%M:%S").txt). Maybe make a function that takes the two files and the directory as parameters and then runs that command? (See the sketch after these comments.)
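
A minimal sketch of that suggestion, assuming a logs/ folder and a wrapper name run_pipeline that are purely illustrative:

    # Hypothetical wrapper; the logs/ folder and run_pipeline name are assumptions.
    run_pipeline() {
        local f1="$1" f2="$2" dir="$3"
        mkdir -p logs    # create the log folder before tee tries to open the file
        run.sh -f1 "$f1" -f2 "$f2" -d "$dir" 2>&1 \
            | tee "logs/$(date +"%Y-%m-%d_%H:%M:%S").txt"
    }

    run_pipeline a_file other_file a/directory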
