
I have a script that looks something like this:

export foo=/tmp/foo                                          
export bar=/tmp/bar

Every time I build I run 'source init_env' (where init_env is the above script) to set up some variables.

To accomplish the same in Python I had this code running,

import os
import re

reg = re.compile(r'export (?P<name>\w+)(=(?P<value>.+))?')
for line in open(file):
    m = reg.match(line)
    if m:
        name = m.group('name')
        value = ''
        if m.group('value'):
            value = m.group('value')
        os.putenv(name, value)

But then someone decided it would be nice to add a line like the following to the init_env file:

export PATH="/foo/bar:/bar/foo:$PATH"     

Obviously my Python script fell apart. I could modify it to handle this line, but it would just break again the next time someone uses a new Bash feature in the init_env file.

The question is: is there an easy way to run a Bash command and let it modify my os.environ?

6 Answers

The problem with your approach is that you are trying to interpret Bash scripts. First you interpret just the export statement. But when people think they can use Bash syntax, they will: variable expansion, conditionals, process substitutions. In the end you will have a full-blown Bash script interpreter with a gazillion bugs. Don't do that.

Let Bash interpret the file for you and then collect the results.

Here is a minimal example of how to do so:

#!/usr/bin/env python

import os
import pprint
import shlex
import subprocess

command = shlex.split("bash -c 'source init_env && env'")
proc = subprocess.Popen(command, stdout=subprocess.PIPE, text=True)  # text=True: decode stdout to str
for line in proc.stdout:
    key, _, value = line.rstrip("\n").partition("=")
    os.environ[key] = value
proc.communicate()

pprint.pprint(dict(os.environ))

Make sure that you handle errors. See here for how: "subprocess.Popen" - checking for success and errors

Also read the documentation on subprocess.
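A hedged sketch of the same idea with basic error handling, using subprocess.run (Python 3.5+; text= needs 3.7+) so that a failing source aborts instead of half-updating the environment. The helper name source_env is my own, not from the original answer:

```python
import os
import subprocess

def source_env(path="init_env"):
    # check=True raises CalledProcessError when the sourced script
    # fails, so we never apply a half-written environment.
    proc = subprocess.run(
        ["bash", "-c", "source '%s' && env" % path],
        stdout=subprocess.PIPE,
        check=True,
        text=True,  # decode stdout to str (Python 3.7+)
    )
    for line in proc.stdout.splitlines():
        key, _, value = line.partition("=")
        os.environ[key] = value
```

The same multi-line caveat noted below applies here: a value containing a newline will confuse the line-by-line parse.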

This will only capture variables set with the export statement, since env only prints exported variables. You can add set -a to treat all variables as exported:

command = shlex.split("bash -c 'set -a && source init_env && env'")
                                ^^^^^^

Note that this code will not handle multi-line variables. It will also not handle Bash function definitions.
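If multi-line values matter, one workaround is to ask env for NUL-separated output and split on the NUL byte instead of on newlines. This is a sketch, and it assumes a GNU env that supports the -0 flag:

```python
import os
import subprocess

# Only parse if the file exists (same guard used in a later answer).
if os.path.isfile("init_env"):
    # env -0 separates entries with NUL bytes, so values that
    # contain newlines survive intact.
    out = subprocess.check_output(
        ["bash", "-c", "set -a && source init_env && env -0"]
    )
    for entry in out.split(b"\0"):
        if entry:
            key, _, value = entry.decode().partition("=")
            os.environ[key] = value
```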


Perhaps better than sourcing the file from inside Python is to let Bash source it first and then run the Python script:

#!/bin/bash
source init_env
/path/to/python_script.py

Here Bash will source init_env with all the power, glory, and quirks of Bash. The Python script will inherit the updated environment.

Note that, again, only exported variables will be inherited. You can force all variable assignments to be exported with set -a:

#!/bin/bash
set -a
source init_env
/path/to/python_script.py

Another approach is to tell the users that they may only write plain key=value lines, without any Bash power, and then use Python's configparser.

This has the advantage of a simple init_env syntax and a rigorously tested config parser, but the disadvantage that init_env will no longer be as expressive as Bash config files can be.
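A sketch of that configparser variant. configparser insists on a section header, so a dummy one can be prepended before parsing; the function name load_plain_env is my own invention:

```python
import configparser
import os

def load_plain_env(path):
    cp = configparser.ConfigParser()
    cp.optionxform = str  # keep keys case-sensitive (the default lowercases them)
    with open(path) as fh:
        # configparser requires a section header; inject a dummy one
        # so a bare key=value file parses.
        cp.read_string("[env]\n" + fh.read())
    for key, value in cp["env"].items():
        os.environ[key] = value
```

Note that configparser's default interpolation treats % specially, so values containing a literal % would need ConfigParser(interpolation=None).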


12 Comments

If you do care about non-exported variables and the script is outside of your control, you can use set -a to mark all variables as exported. Just change the command to: ['bash', '-c', 'set -a && source init_env && env']
Note that this will fail on exported functions. I'd love it if you could update your answer showing parsing that works for functions too. (e.g. function fff() { echo "fff"; }; export -f fff)
Note: this does not support multiline environment variables.
In my case, iterating over proc.stdout yields bytes, so I was getting a TypeError on line.partition(). Converting to string with line.decode().partition("=") solved the problem.
This was super helpful. I executed ['env', '-i', 'bash', '-c', 'source .bashrc && env'] to give myself only the environment variables set by the rc file

Using pickle (note: this version uses Python 2 syntax):

import os, pickle
# For clarity, I moved this string out of the command
source = 'source init_env'
dump = '/usr/bin/python -c "import os,pickle;print pickle.dumps(os.environ)"'
penv = os.popen('%s && %s' %(source,dump))
env = pickle.loads(penv.read())
os.environ = env

Updated:

This uses json, subprocess, and explicitly /bin/bash (for Ubuntu support, where /bin/sh is dash):

import os, subprocess as sp, json
source = 'source init_env'
dump = '/usr/bin/python -c "import os, json; print(json.dumps(dict(os.environ)))"'
pipe = sp.Popen(['/bin/bash', '-c', '%s && %s' % (source, dump)], stdout=sp.PIPE)
env = json.loads(pipe.stdout.read())
os.environ.update(env)  # update in place; rebinding os.environ would break propagation to subprocesses

2 Comments

This one has a problem on Ubuntu - the default shell there is /bin/dash, which does not know the source command. In order to use it on Ubuntu, you have to run /bin/bash explicitly, e.g. by using penv = subprocess.Popen(['/bin/bash', '-c', '%s && %s' %(source,dump)], stdout=subprocess.PIPE).stdout (this uses the newer subprocess module which has to be imported).
Extracting the env as json works great, but I had the issue that assigning os.environ = env does not really update the environment for called subprocesses and makes os.environ a plain dict without its special features. So I replaced it with for key in env: os.environ[key] = env[key]. This does not delete any variables, but I only need to add new or extend existing variables.
30

Rather than having your Python script source the bash script, it would be simpler and more elegant to have a wrapper script source init_env and then run your Python script with the modified environment.

#!/bin/bash
source init_env
/run/python/script.py
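On the Python side nothing special is required; whatever the wrapper exported simply shows up in os.environ of the child process. A minimal sketch (the variable name foo is taken from the question's init_env; the fallback default is my own):

```python
import os

# Variables exported before this script was launched are inherited
# through the process environment.
foo = os.environ.get("foo", "/tmp/foo-default")
print(foo)
```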

3 Comments

It may solve the problem in some circumstances, but not all of them. For example, I am writing a Python script that needs to do something like sourcing the file (actually it loads modules, if you know what I'm talking about), and it needs to load a different module depending on the circumstances, so this would not solve my problem at all.
This does answer the question in most cases, and I would use it wherever possible. I had a hard time making this work in my IDE for a given project. One possible modification might be to run the whole thing in a shell with the environment bash --rcfile init_env -c ./script.py
This is my preferred solution as well. However, many debuggers like Visual Studio Code or PyCharm do not allow to launch debuggable Python script this way.

Updated @lesmana's answer for Python 3. Notice the use of env -i which prevents extraneous environment variables from being set/reset (potentially incorrectly given the lack of handling for multiline env variables).

import os, subprocess

if os.path.isfile("init_env"):
    command = 'env -i bash -c "source init_env && env"'  # bash, not sh: dash has no source builtin
    for line in subprocess.getoutput(command).splitlines():
        if "=" not in line:
            continue
        key, _, value = line.partition("=")  # partition: safe if the value itself contains '='
        os.environ[key] = value

1 Comment

Using this gives me "PATH: undefined variable" because env -i unsets the path. But it does work without the env -i. Also be careful that the line might have multiple '='

Example wrapping @Brian's excellent answer in a function:

import json
import subprocess

# returns a dictionary of the environment variables resulting from sourcing a file
def env_from_sourcing(file_to_source_path, include_unexported_variables=False):
    source = '%ssource %s' % ("set -a && " if include_unexported_variables else "", file_to_source_path)
    dump = '/usr/bin/python -c "import os, json; print(json.dumps(dict(os.environ)))"'
    pipe = subprocess.Popen(['/bin/bash', '-c', '%s && %s' % (source, dump)], stdout=subprocess.PIPE)
    return json.loads(pipe.stdout.read())

I'm using this utility function to read aws credentials and docker .env files with include_unexported_variables=True.

Comments


The best workaround I found is this:

  • Write a wrapper Bash script that calls your Python script
  • In that wrapper script, source the file first; the Python script you then call inherits the environment

Comments
