
I need to read these bash variables into my JSON string and I am not familiar with bash. Any help is appreciated.

#!/bin/sh

BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar

JSON_STRING='{"bucketname":"$BUCKET_NAME"","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}'


echo $JSON_STRING 

21 Answers


You are better off using a program like jq to generate the JSON, if you don't know ahead of time if the contents of the variables are properly escaped for inclusion in JSON. Otherwise, you will just end up with invalid JSON for your trouble.

BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar

JSON_STRING=$( jq -n \
                  --arg bn "$BUCKET_NAME" \
                  --arg on "$OBJECT_NAME" \
                  --arg tl "$TARGET_LOCATION" \
                  '{bucketname: $bn, objectname: $on, targetlocation: $tl}' )
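As a quick check of why this matters (a hypothetical value; assumes jq is installed), a value containing double quotes comes out properly escaped:

```shell
# Hypothetical value containing embedded double quotes
BUCKET_NAME='bucket "with quotes"'

# -c for compact output, -n for no input
JSON_STRING=$(jq -cn --arg bn "$BUCKET_NAME" '{bucketname: $bn}')
echo "$JSON_STRING"   # {"bucketname":"bucket \"with quotes\""}
```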

9 Comments

Excellent advice: use the right parser/generator for the task at hand. Applies to JSON, CSV, XML, ...
It would be probably nice for future readers, if jq's args are explained.
Normally, jq processes input; but with -n, there is none. --arg <var> <string> replaces $<var> with the string value <string>. --argjson <var> <json> replaces $<var> with the Json blob <json>. The combination of --arg and --argjson provides powerful mechanisms to construct arbitrary Json structures.
If you want oneline minified output, add -c / --compact-output.
If you simply want to compose an object with the given args as key-value pairs, you can replace the template portion at the end with '$ARGS.named' which is a special jq variable. E.g. jq -n --arg foo bar '$ARGS.named' will produce {"foo":"bar"}. Edit: I see now this is also mentioned as an answer below :D

You can use printf:

JSON_FMT='{"bucketname":"%s","objectname":"%s","targetlocation":"%s"}\n'
printf "$JSON_FMT" "$BUCKET_NAME" "$OBJECT_NAME" "$TARGET_LOCATION"

Much clearer and simpler.
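To capture the result in a variable instead of printing it (command substitution strips the trailing newline):

```shell
BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar

JSON_FMT='{"bucketname":"%s","objectname":"%s","targetlocation":"%s"}\n'
JSON_STRING=$(printf "$JSON_FMT" "$BUCKET_NAME" "$OBJECT_NAME" "$TARGET_LOCATION")
echo "$JSON_STRING"
```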

3 Comments

nice, especially if you don't or can't have jq installed. this lets you save it to a variable too: JSON_STRING=$(printf "$JSON_FMT" "$BUCKET_NAME"...)
This will give invalid JSON if any of variables contains a double-quote. OBJECT_NAME='This "will not" work well' yields {"objectname":"This "will not" work well"}
"Simpler" but wrong.

A possibility:

#!/bin/bash 

BUCKET_NAME="testbucket"
OBJECT_NAME="testworkflow-2.0.1.jar"
TARGET_LOCATION="/opt/test/testworkflow-2.0.1.jar"

# one line
JSON_STRING='{"bucketname":"'"$BUCKET_NAME"'","objectname":"'"$OBJECT_NAME"'","targetlocation":"'"$TARGET_LOCATION"'"}'

# multi-line
JSON_STRING="{
\"bucketname\":\"${BUCKET_NAME}\",
\"objectname\":\"${OBJECT_NAME}\",
\"targetlocation\":\"${TARGET_LOCATION}\"
}"

# [optional] validate the string is valid json
echo "${JSON_STRING}" | jq

4 Comments

Clever. I always forget you can mix single and double quote strings.
The jq library example is great for complicated/long implementations, but this is great for a 1-liner here and there.
This comes especially handy when you need to pass the JSON string as an option in a command, say elasticdump. Handling it via jq might complicate it.
This will give invalid JSON if any of variables contains a double-quote. OBJECT_NAME='This "will not" work well' yields {"objectname":"This "will not" work well"}

In addition to chepner's answer, it's also possible to construct the object completely from args with this simple recipe:

BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar

JSON_STRING=$(jq -n \
                  --arg bucketname "$BUCKET_NAME" \
                  --arg objectname "$OBJECT_NAME" \
                  --arg targetlocation "$TARGET_LOCATION" \
                   '$ARGS.named')

Explanation:

  • --null-input | -n disables reading input. From the man page: Don't read any input at all! Instead, the filter is run once using null as the input. This is useful when using jq as a simple calculator or to construct JSON data from scratch.
  • --arg name value passes values to the program as predefined variables: value is available as $name. All named arguments are also available as $ARGS.named

Because the format of $ARGS.named is already an object, jq can output it as is.
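A minimal check of the recipe (assumes jq 1.6 or newer, since $ARGS is not available in jq 1.5, as a comment below notes):

```shell
# Build an object purely from named args (requires jq >= 1.6)
JSON_STRING=$(jq -cn --arg foo bar '$ARGS.named')
echo "$JSON_STRING"   # {"foo":"bar"}
```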

4 Comments

This must be a newer feature (which would be nice to use) but jq version jq-1.5-1-a5b5cbe (default on Google CloudShell) barfs on the above with jq: error: ARGS/0 is not defined at <top-level>, line 1: $ARGS.named jq: 1 compile error
I just verified it to be working with jq-1.6 on Arch Linux, which is the latest release (November 2018).
This works very nicely if you want a single-level object with all the values at the top level. However, for some use cases (eg communicating with APIs using curl) args may need to be more deeply nested. In that case, the original version of @chepner's answer is needed.
It needs to be exactly - '$ARGS.named', double quotes don't work

I had to work out all the possible ways to handle JSON strings in a command request. Look at the following code to see why using single quotes can fail if used incorrectly.

# Create Release and Tag commit in Github repository

# returns string with in-place substituted variables 

json=$(cat <<-END
    {
        "tag_name": "${version}", 
        "target_commitish": "${branch}", 
        "name": "${title}", 
        "body": "${notes}", 
        "draft": ${is_draft}, 
        "prerelease": ${is_prerelease} 
    }
END
)

# returns raw string without any substitutions
# single or double quoted delimiter - check HEREDOC specs

json=$(cat <<-"END"   # or 'END'
    {
        "tag_name": "${version}", 
        "target_commitish": "${branch}", 
        "name": "${title}", 
        "body": "${notes}", 
        "draft": ${is_draft}, 
        "prerelease": ${is_prerelease} 
    }
END
)
# prints fully formatted string with substituted variables as follows:

echo "${json}"  
{ 
    "tag_name" : "My_tag", 
    "target_commitish":"My_branch"
    ....
}
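A condensed, self-contained version of the substituting heredoc (hypothetical values):

```shell
version="v1.0"
branch="main"

# Unquoted delimiter: variables are substituted inside the heredoc
json=$(cat <<END
{"tag_name": "${version}", "target_commitish": "${branch}"}
END
)
printf '%s\n' "$json"   # {"tag_name": "v1.0", "target_commitish": "main"}
```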

Note 1: Use of single vs double quotes

# enclosing in single quotes means no variable substitution 
# (treats everything as raw char literals)

echo '${json}'   
${json} 

echo '"${json}"'   
"${json}" 
# enclosing in single quotes and outer double quotes causes
# variable expansion surrounded by single quotes(treated as raw char literals).

echo "'${json}'" 
'{ 
    "tag_name" : "My_tag", 
    "target_commitish":"My_branch"
    ....
}'

Note 2: Caution with Line terminators

  • Note the json string is formatted with line terminators such as LF \n
  • or carriage return \r (if it's encoded on Windows it contains CRLF \r\n)
  • using the tr (translate) utility from the shell we can remove the line terminators, if any

# following code serializes json and removes any line terminators 
# in substituted value/object variables too

json=$(echo "$json" | tr -d '\n' | tr -d '\r' )
# string enclosed in single quotes are still raw literals

echo '${json}'   
${json} 

echo '"${json}"'   
"${json}" 
# After CRLF/LF are removed

echo "'${json}'" 
'{ "tag_name" : "My_tag", "target_commitish":"My_branch" .... }'

Note 3: Formatting

  • while manipulating a json string with variables, we can use a combination of ' and " such as the following, if we want to protect some raw literals while using outer double quotes for in-place substitution/string interpolation:
# mixing ' and " 

username=admin
password=pass

echo "$username:$password"
admin:pass

echo "$username"':'"$password"
admin:pass

echo "$username"'[${delimiter}]'"$password"
admin[${delimiter}]pass

Note 4: Using in a command

  • The following curl request already removes existing \n (i.e. serializes the json)
response=$(curl -i \
            --user ${username}:${api_token} \
            -X POST \
            -H 'Accept: application/vnd.github.v3+json' \
            -d "$json" \
            "https://api.github.com/repos/${username}/${repository}/releases" \
            --output /dev/null \
            --write-out "%{http_code}" \
            --silent
          )

So when using it for command variables, validate if it is properly formatted before using it :)

4 Comments

code only Answers are discouraged on SO. Please consider adding an explanation highlighting the important bits & how/why it works to force the OP's issue, for long term value & quick understanding. most upvotes are accumulated over time, as users learn something from your post that they can apply to their own coding issues. Explanations go a long way to "ah ha" learning moments. Words also helps overcome the "wall of code" hesitation that may prompt a user to skip to a different post instead.
Thanks for pointing out, I have updated my answers, my issues were due to using incorrect mix of single and double quotes in the json string and when using the json variable. Also I had issues with CR/LF when using it on posix/windows bash shell and unix bash shell. So removing the newline/carriage return terminators were key points in resolving my issues. Happy to contribute!
I found your examples to be by far the easiest to read and write, especially for more complex JSON documents. Thank you!
This is great. The option for jq on my system would have been quite difficult. This will help me out.

First, don't use ALL_CAPS_VARNAMES: it's too easy to accidentally overwrite a crucial shell variable (like PATH)

Mixing single and double quotes in shell strings can be a hassle. In this case, I'd use printf:

bucket_name=testbucket
object_name=testworkflow-2.0.1.jar
target_location=/opt/test/testworkflow-2.0.1.jar
template='{"bucketname":"%s","objectname":"%s","targetlocation":"%s"}'

json_string=$(printf "$template" "$bucket_name" "$object_name" "$target_location")

echo "$json_string"

For homework, read this page carefully: Security implications of forgetting to quote a variable in bash/POSIX shells


A note on creating JSON with string concatenation: there are edge cases. For example, if any of your strings contain double quotes, you can get broken JSON:

$ bucket_name='a "string with quotes"'
$ printf '{"bucket":"%s"}\n' "$bucket_name"
{"bucket":"a "string with quotes""}

To do this more safely with bash, we need to escape that string's double quotes:

$ printf '{"bucket":"%s"}\n' "${bucket_name//\"/\\\"}"
{"bucket":"a \"string with quotes\""}

1 Comment

using all caps or not depends on code style, and usually people do use ALL_CAPS_ENV for variables. e.g: google.github.io/styleguide/…

If you need to build a JSON representation where members mapped to undefined or empty variables should be omitted, then jo can help.

#!/bin/bash

BUCKET_NAME=testbucket
OBJECT_NAME=""

JO_OPTS=()

if [[ ! "${BUCKET_NAME}x" = "x" ]] ; then
        JO_OPTS+=("bucketname=${BUCKET_NAME}")
fi

if [[ ! "${OBJECT_NAME}x" = "x" ]] ; then
        JO_OPTS+=("objectname=${OBJECT_NAME}")
fi

if [[ ! "${TARGET_LOCATION}x" = "x" ]] ; then
        JO_OPTS+=("targetlocation=${TARGET_LOCATION}")
fi

jo "${JO_OPTS[@]}"

The output of the commands above would be just (note the absence of objectname and targetlocation members):

{"bucketname":"testbucket"}

2 Comments

Found this question and its answers, all from 2018. But then weirdly, down here at the bottom is your answer, which you wrote only 4 hours ago. So +1 for this great-looking utility, but also +1 for the weird coincidence.
This is one of the few correct answers. Others don’t correctly encode strings that contain quotes or backslashes.

It can be done the following way:

JSON_STRING='{"bucketname":"'$BUCKET_NAME'","objectname":"'$OBJECT_NAME'","targetlocation":"'$TARGET_LOCATION'"}'


For Node.js Developer, or if you have node environment installed, you can try this:

JSON_STRING=$(node -e "console.log(JSON.stringify({bucketname: $BUCKET_NAME, objectname: $OBJECT_NAME, targetlocation: $TARGET_LOCATION}))")

Advantage of this method is you can easily convert very complicated JSON Object (like object contains array, or if you need int value instead of string) to JSON String without worrying about invalid json error.

Disadvantage is it's relying on Node.js environment.

2 Comments

Neat idea, but it's missing some form of quotes around the bash variables, which means node will interpret them as JavaScript variables instead of strings (so the command will error). Also, there's a potential security vulnerability in using "injected" bash variables like this - see my answer
Like the parent said, this is not going to work. See an attempt of mine for a more robust nodejs version: stackoverflow.com/a/73907425/245966

You could use envsubst:

  export VAR="some_value_here"
  echo '{"test":"$VAR"}' | envsubst > json.json

It can also be used with a "template" file:

//json.template
{"var": "$VALUE", "another_var":"$ANOTHER_VALUE"}

So after you could do:

export VALUE="some_value_here"
export ANOTHER_VALUE="something_else"
cat  json.template | envsubst > misha.json


For the general case of building JSON from bash with arbitrary inputs, many of the previous responses (even the highly voted ones with jq) omit cases where the variables contain a " double quote or a \n newline escape, and where you need complex string concatenation of the inputs.

When using jq you need to printf %b the input first to get the \n converted to real newlines, so that once you pass through jq you get \n back and not \\n.

I found this version with nodejs quite easy to reason about if you know javascript/nodejs well:

TITLE='Title'
AUTHOR='Bob'
JSON=$( TITLE="$TITLE" AUTHOR="$AUTHOR" node -p 'JSON.stringify( {"message": `Title: ${process.env.TITLE}\n\nAuthor: ${process.env.AUTHOR}`} )' )

It's a bit verbose due to process.env. but allows to properly pass the variables from shell, and then format things inside (nodejs) backticks in a safe way.

This outputs:

printf "%s\n" "$JSON"
{"message":"Title: Title\n\nAuthor: Bob"}

(Note: when having a variable with \n always use printf "%s\n" "$VAR" and not echo "$VAR", whose output is platform-dependent! See here for details)

A similar thing with jq would be:

TITLE='Title'
AUTHOR='Bob' 
MESSAGE="Title: ${TITLE}\n\nAuthor: ${AUTHOR}"
MESSAGE_ESCAPED_FOR_JQ=$(printf %b "${MESSAGE}")
JSON=$( jq '{"message": $jq_msg}' --arg jq_msg "$MESSAGE_ESCAPED_FOR_JQ" --null-input --compact-output --raw-output --monochrome-output )

(the last two params are not necessary when running in a subshell, but I just added them so that the output is then same when you run the jq command in a top-level shell).


To build upon Hao's answer using NodeJS: you can split up the lines, and use the -p option which saves having to use console.log.

JSON_STRING=$(node -pe "
  JSON.stringify({
    bucketname: process.env.BUCKET_NAME,
    objectname: process.env.OBJECT_NAME,
    targetlocation: process.env.TARGET_LOCATION
  });
")

An inconvenience is that you need to export the variables beforehand, i.e.

export BUCKET_NAME=testbucket
# etc.

Note: You might be thinking, why use process.env? Why not just use single quotes and have bucketname: '$BUCKET_NAME', etc so bash inserts the variables? The reason is that using process.env is safer - if you don't have control over the contents of $TARGET_LOCATION it could inject JavaScript into your node command and do malicious things (by closing the single quote, e.g. the $TARGET_LOCATION string contents could be '}); /* Here I can run commands to delete files! */; console.log({'a': 'b. On the other hand, process.env takes care of sanitising the input.


These solutions come a little late, but I think they are inherently simpler than the previous suggestions (avoiding the complications of quoting and escaping).

    BUCKET_NAME=testbucket
    OBJECT_NAME=testworkflow-2.0.1.jar
    TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar
    
    # Initial unsuccessful solution
    JSON_STRING='{"bucketname":"$BUCKET_NAME","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}'
    echo $JSON_STRING 
    
    # If your substitution variables have NO whitespace this is sufficient
    JSON_STRING=$(tr -d '[:space:]' <<JSON
    {"bucketname":"$BUCKET_NAME","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}
    JSON
    )
    echo $JSON_STRING 
    
    # If your substitution variables are more general and maybe have whitespace this works
    JSON_STRING=$(jq -c . <<JSON
    {"bucketname":"$BUCKET_NAME","objectname":"$OBJECT_NAME","targetlocation":"$TARGET_LOCATION"}
    JSON
    )
    echo $JSON_STRING 
    
    #... A change in layout could also make it more maintainable
    JSON_STRING=$(jq -c . <<JSON
    {
       "bucketname" : "$BUCKET_NAME",
       "objectname" : "$OBJECT_NAME",
       "targetlocation" : "$TARGET_LOCATION"
    }
    JSON
    )
    echo $JSON_STRING

1 Comment

None of this solves the need for quoting. Test if OBJECT_NAME='Steve "Jobs" McQueen' -- to be valid JSON you'd need to have "Jobs" changed to \"Jobs\"; jq can do that, bash's heredoc code doesn't understand the need.

Bash will not expand variables inside a single-quoted string. To get the variables expanded, bash needs a double-quoted string: use a double-quoted string for the JSON and escape the double-quote characters inside it. Example:

#!/bin/sh

BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar

JSON_STRING="{\"bucketname\":\"$BUCKET_NAME\",\"objectname\":\"$OBJECT_NAME\",\"targetlocation\":\"$TARGET_LOCATION\"}"


echo $JSON_STRING 


if you have node.js and have minimist installed globally:

jc() {
    node -p "JSON.stringify(require('minimist')(process.argv), (k,v) => k=='_'?undefined:v)" -- "$@"
}
jc --key1 foo --number 12 --boolean \
    --under_score 'abc def' --'white space' '   '
# {"key1":"foo","number":12,"boolean":true,"under_score":"abc def","white space":"   "}

you can post it with curl or what:

curl --data "$(jc --type message --value 'hello world!')" \
    --header 'content-type: application/json' \
    http://server.ip/api/endpoint

be careful that minimist will parse dots:

jc --m.room.member @gholk:ccns.io
# {"m":{"room":{"member":"@gholk:ccns.io"}}}


Used this for AWS Macie configuration:

JSON_CONFIG=$( jq -n \
   --arg bucket_name "$BUCKET_NAME" \
   --arg kms_key_arn "$KMS_KEY_ARN" \
   '{"s3Destination":{"bucketName":$bucket_name,"kmsKeyArn":$kms_key_arn}}'
)

aws macie2 put-classification-export-configuration --configuration "$JSON_CONFIG"


You can simply make a call like this to print the JSON.

#!/bin/sh
BUCKET_NAME=testbucket

OBJECT_NAME=testworkflow-2.0.1.jar

TARGET_LOCATION=/opt/test/testworkflow-2.0.1.jar

echo '{ "bucketName": "'"$BUCKET_NAME"'", "objectName": "'"$OBJECT_NAME"'", "targetLocation": "'"$TARGET_LOCATION"'" }'

or

JSON_STRING='{ "bucketName": "'"$BUCKET_NAME"'", "objectName": "'"$OBJECT_NAME"'", "targetLocation": "'"$TARGET_LOCATION"'" }'
echo "$JSON_STRING"


You can do it with jo:

#!/bin/sh

BUCKET_NAME="testbucket"
OBJECT_NAME="testworkflow-2.0.1.jar"
TARGET_LOCATION="/opt/test/testworkflow-2.0.1.jar"

jo bucketname="${BUCKET_NAME}" objectname="${OBJECT_NAME}" targetlocation="${TARGET_LOCATION}"
# {"bucketname":"testbucket","objectname":"testworkflow-2.0.1.jar","targetlocation":"/opt/test/testworkflow-2.0.1.jar"}


With jq's @sh we can translate a jq string array into a bash array:

#!/bin/bash

set -eufo pipefail
#set -x

# $1 should calculate an json array from input
jq_array() {
  jq "$1"
}

# $1 is an array name
# $2 should calculate an json array from input
read_jq_array() {
  eval unset "$1"
  eval "$(jq -r '(["'"$1"'=("] + ['"$2"' | @sh ] + [")"]) | join(" ")')"
}

dump_array() {
  printf "[%s]\n" "$@"
}

input_string='["a'\''  b", "2\"\nnew line", "1\t2\t3"]'

# view the input
jq_array '.' <<< "$input_string"

arr=()

# transform to bash array
read_jq_array arr . <<< "$input_string"

# view the result
dump_array "${arr[@]}"

The bash function read_jq_array parses a json array and eval-assigns it to the target bash array. All special characters, like newlines, tabs, spaces, and quotes, are kept raw as expected.


If you don't want to use jq or other external binary, you can use this escape_json_string function in Bash:

#!/bin/bash

function escape_json_string() {
  local input=$1
  for ((i = 0; i < ${#input}; i++)); do
    local char="${input:i:1}"
    local escaped="${char}"
    case "${char}" in
      $'"' ) escaped="\\\"";;
      $'\\') escaped="\\\\";;
      *)
        if [[ "${char}" < $'\x20' ]]; then
          case "${char}" in 
            $'\b') escaped="\\b";;
            $'\f') escaped="\\f";;
            $'\n') escaped="\\n";;
            $'\r') escaped="\\r";;
            $'\t') escaped="\\t";;
            *) escaped=$(printf "\u%04X" "'${char}")
          esac
        fi;;
    esac
    echo -n "${escaped}"
  done
}

BUCKET_NAME=testbucket
OBJECT_NAME=testworkflow-2.0.1.jar
TARGET_LOCATION='/opt/My "Test"/testworkflow-2.0.1.jar'

JSON_STRING="{
\"bucketname\":\"$(escape_json_string "${BUCKET_NAME}")\",
\"objectname\":\"$(escape_json_string "${OBJECT_NAME}")\",
\"targetlocation\":\"$(escape_json_string "${TARGET_LOCATION}")\"
}"

echo "${JSON_STRING}"

Output:

{
"bucketname":"testbucket",
"objectname":"testworkflow-2.0.1.jar",
"targetlocation":"/opt/My \"Test\"/testworkflow-2.0.1.jar"
}


The answers here are good when the object properties are known up front. If it's a list of properties of arbitrary length, however, I find this utility a better approach:

build_json() {
  local args=("$@")
  local arg_length=${#args[@]}
  local obj="{}"

  for ((i = 0; i < "$arg_length"; i++)); do
    if [[ ${args[$i]} == --* ]] && [[ $((i + 1)) -lt $arg_length ]]; then
      local key="${args[$i]#--}"
      local value="${args[$((i + 1))]}"
      obj=$(echo "$obj" | jq "(.\"${key}\"|=\"$value\")")
    fi
  done

  echo "$obj"
}

An example use will be:

$ build_json --asdf 1 --fdsa 2
{
  "asdf": "1",
  "fdsa": "2"
}

# or

$ build_json --prop1 val1 --prop2 val2 --prop3 val3 --prop4 val4
{
  "prop1": "val1",
  "prop2": "val2",
  "prop3": "val3",
  "prop4": "val4"
}

