
I'm trying to write a bash script that takes an environment variable and passes it along to a command.

So if I had something like:

export OUT="-a=arg1 -b=\"arg2.0 arg2.1\""

In my bash script, I want to do something like:

<command> -a=arg1 '-b=arg2.0 arg2.1'

I have one approach that seems to do this, but it involves using eval:

eval <command> ${OUT}

If I include set -x right above the command, I see:

+ eval <command> -a=arg1 '-b="arg2.0' 'arg2.1"'
++ <command> -a=arg1 '-b=arg2.0 arg2.1'

However, I've read about the dangers of using eval, and since this will be taking its arguments from user input, it's less than ideal.
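For illustration, here's a minimal sketch of that risk (the $(...) payload is hypothetical, just standing in for attacker-controlled input):

# Anything embedded in $OUT gets re-parsed, and therefore executed, by eval.
export OUT='-a=arg1 $(echo "this runs as code" >&2)'
eval echo ${OUT}    # the $(...) executes during eval's second parse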

Since this is bash, I've also considered storing my arguments in an array and simply calling <command> "${ARRAY[@]}". I've been trying to use IFS, but I'm not sure what I should be splitting on.
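For the simple space-separated case, a minimal sketch of the array idea I have in mind, assuming the default IFS (note it still doesn't honor the embedded \"...\" quoting):

#!/bin/bash
# Split $OUT on the default IFS (space, tab, newline) into an array.
export OUT='-a=arg1 -b="arg2.0 arg2.1"'
read -r -a args <<< "$OUT"
printf '<%s>\n' "${args[@]}"
# prints, one per line: <-a=arg1> <-b="arg2.0> <arg2.1">  -- the quoted group is still split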

  • This is generally when I whip out a higher level scripting language, like ruby/python/perl/etc. Commented May 29, 2014 at 1:03
  • You can't export arrays, so if you insist on using a bash script file, that won't work. You could use an array with a bash function, though. Commented May 29, 2014 at 1:08
  • alternative: use set ...args... ; <more code>; command "$@" (see the sketch after these comments) Commented May 29, 2014 at 1:28
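
A minimal sketch of that set suggestion, with the arguments written out by hand to match the question's example (printf stands in for the placeholder <command>):

# Load the arguments into the positional parameters, then pass them on
# quoted so the embedded space in the -b argument survives.
set -- -a=arg1 "-b=arg2.0 arg2.1"
# ... more code ...
printf '<%s>\n' "$@"      # stand-in for: <command> "$@"
# prints: <-a=arg1> and <-b=arg2.0 arg2.1>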

3 Answers


For the simplified problem described in the other answer below (i.e., turning the following environment variable into three arguments inside a bash script):

export OPTS="a=arg1 b=arg2.0 b=arg2.1"

Just do the following:

#!/bin/bash
opts=( $OPTS )    # deliberately unquoted: word-split $OPTS on whitespace
my-command "${opts[@]}"

# Use this for debugging:
echo "number of opts = ${#opts[@]}; opts are: ${opts[@]}"


4

If you're not completely inflexible about the format of $OUT, one possibility would be to repeat the option name (b= here) for each value and let the script concatenate the values. Then you'd write:

export OUT="a=arg1 b=arg2.0 b=arg2.1"

If that is acceptable, the following script will work:

#!/bin/bash

# Parse $OUT into an associative array.
# Instead of using $OUT, it would be cleaner to use "$@".
declare -A args
for arg in $OUT; do
  if [[ "$arg" =~ ^([[:alnum:]]+)=(.*)$ ]]; then   # match key=value
    key=${BASH_REMATCH[1]}
    val=${BASH_REMATCH[2]}
    if [[ -z ${args[$key]} ]]; then
      args[$key]=-$key="$val"     # first occurrence: start "-key=value"
    else
      args[$key]+=" $val"         # repeated key: append this value
    fi
  fi
done

# Test, approximately as specified
command() { :; }
set -x
command "${args[@]}"
set +x

I can't say I like it much, but it's the closest I've been able to come.

Here's a sample run:

$ export OUT="a=foo b=bar  b=glitch s9= s9=* "
$ ./command-runner
+ command -a=foo '-b=bar glitch' '-s9= *'
+ :
+ set +x

If you define a bash function (for example, in your bash startup file), you can make much better use of arrays. Here's one approach:

# This goes into your bash startup file:
declare -a SAVED_ARGS
save_args() {
  SAVED_ARGS=("$@")
}

do_script() {
  /path/to/script.sh "${SAVED_ARGS[@]}" "$@"
}

For expository purposes, script.sh:

#!/bin/bash
command() { :; }

set -x
command "${@/#/-}"
set +x

Example:

$ save_args x=3 y="a few words from our sponsor"
$ do_script a=3 b="arg2.0 arg2.1"
+ command -x=3 '-y=a few words from our sponsor' -a=3 '-b=arg2.0 arg2.1'
+ :
+ set +x
$ do_script a=42
+ command -x=3 '-y=a few words from our sponsor' -a=42
+ :
+ set +x

In case it's not obvious:

command() { :; }

defines a bash function called command which does almost nothing (except invoke the builtin : which does nothing), and

"${@/#/-}"

expands to the positional parameters, inserting a dash at the beginning of each one using a find-and-replace substitution. The pattern # is actually an empty pattern which only matches at the beginning of the string.
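A quick way to see that substitution in isolation (the values here are just placeholders):

$ set -- a=1 "b=2 3"
$ printf '<%s>\n' "${@/#/-}"
<-a=1>
<-b=2 3>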

6 Comments

Hrm, that's a pretty interesting solution. But I'm trying to make it more intuitive for the user, so they would treat it as simply appending to the list of arguments passed to <command>. One of the reasons I was interested in the array approach was that I found that if I did something like ./script.sh -arg1="foo bar" and in the script itself I do <command> "$@", it works surprisingly well.
@EdY: The array solution is possible if you're ok with defining a bash function which calls your script which executes your command which lay in the house that jack built. Do you want that solution?
What do you mean "lay in the house that jack built"? Are you talking about my_script.sh -> bash function in my_script.sh -> calls command with arguments?
@EdY: Sorry, it was a very culture-specific joke. en.wikipedia.org/wiki/This_Is_the_House_That_Jack_Built . And yes, I'm talking about a bash function which interprets the array and calls your my_script.sh which then calls the command.
Ohh, so is it something like my_script -> grab the environment variable -> my_script env_variable_formatted_for_cmd_line -> pass <command> "${@}"? Would it be like some if statement to check to see if it was the first or second time called? Or why does it need to be in a bash function?

Set your environment variable as: export abc=123

Then, when executing any script that needs abc as an argument, pass it like this: ./testing.sh "$abc"
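
A minimal sketch of what testing.sh might look like on the receiving end (the script body is assumed, not given in this answer):

#!/bin/bash
# The exported value arrives as the first positional parameter.
abc="$1"
echo "received: $abc"

Running export abc=123 followed by ./testing.sh "$abc" would then print received: 123.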
