I have a Python script (e.g. test.py) and a commands.txt file which contains calls to a custom bash function (e.g. my_func) along with their parameters, e.g.
my_func arg1 arg2; my_func arg3 arg4; my_func arg5 arg6;
This function is included in the ~/.bash_profile.
What I have tried to do is:
subprocess.call(['.', 'path/to/commands.txt'], shell=True)
I know this is not best practice, in terms of setting the shell argument to True, but I cannot seem to make it work even this way. What I get when I run test.py is:
my_func: command not found
Comment: Did you export -f my_func in your .bash_profile? If you don't do that then it is only visible to the interactive session you define it in, i.e. it does not get passed to child processes.

Comment (asker): my_func uses other bash functions in its definition. Does this affect anything?

Answer:

subprocess.call(['for arg; do . "$arg"; done', '_', os.path.expanduser('~/.bashrc'), '/path/to/commands.txt'], shell=True)

...will work, if and only if your scripts (commands.txt and .bashrc) are both compatible with /bin/sh.

If the function has been exported (with export -f my_func), then

subprocess.Popen(['bash', '-c', 'funcname "$@"', '_'] + args)

will be able to run it (with args being a Python list of arguments to pass to the function, i.e. args=['foo', 'bar', 'baz'] to run funcname foo bar baz). Using bash instead of shell=True makes sure you get a shell that actually knows how to read those exported functions, and passing arguments out-of-band from code is a security precaution: you don't want to pass a freeform field and have someone put $(rm -rf ~) in it.
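To make the sourcing approach concrete, here is a minimal, self-contained sketch. It stands in for the real setup with a hypothetical temp file defining my_func (instead of .bash_profile), sources it inside a single bash -c invocation, and passes the function's arguments out-of-band as positional parameters:

```python
import os
import subprocess
import tempfile

# Hypothetical stand-in for .bash_profile: a file that defines my_func.
with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
    f.write('my_func() { echo "got: $1 $2"; }\n')
    func_file = f.name

args = ["foo", "bar"]
# $0 is "_", $1 is the file to source; shift drops $1 so that
# "$@" holds only the function's real arguments.
result = subprocess.run(
    ["bash", "-c", '. "$1"; shift; my_func "$@"', "_", func_file] + args,
    capture_output=True,
    text=True,
)
print(result.stdout.strip())  # got: foo bar
os.unlink(func_file)
```

The same pattern works with the real commands.txt: source the file that defines the function first, then invoke it, all within one bash process, since sourced definitions do not survive across separate subprocess calls.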
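The security point about out-of-band arguments can be demonstrated with a small sketch (the payload string here is an illustrative assumption, using a harmless echo rather than rm):

```python
import subprocess

# A hostile-looking value: if interpolated into shell code, it would run.
payload = "$(echo INJECTED)"

# Passed as a positional parameter, it is plain data and is never evaluated.
safe = subprocess.run(
    ["bash", "-c", 'printf "%s" "$1"', "_", payload],
    capture_output=True,
    text=True,
).stdout
print(safe)  # $(echo INJECTED)

# By contrast, interpolating it into a shell=True command string executes it.
unsafe = subprocess.run(
    f'printf "%s" "{payload}"',
    shell=True,
    capture_output=True,
    text=True,
).stdout
print(unsafe)  # INJECTED
```

This is why the answer passes args as extra list elements after the bash -c script rather than formatting them into the command string.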