
I'd like to pass a JSON object as a command line argument to node. Something like this:

node file.js --data { "name": "Dave" }

What's the best way to do this, or is there another, more advisable way to accomplish the same thing?

  • Do you want it exactly like in your question? I'd expect it to have single-quotes around it at least. Commented Sep 30, 2015 at 22:49
  • No that would be fine actually. If I use JSON.stringify on an object first and then put it in single quotes is that safe to be passed on the command line? Commented Sep 30, 2015 at 22:55
  • Depends on if the stringified JSON had single-quotes in the content, then you'd have to escape them like any other command-line argument. Commented Sep 30, 2015 at 22:56
  • Ok that works great - thanks! Commented Sep 30, 2015 at 23:17
  • 1
    Depends on the "command line" you are using too. If this is *nix with any of the common shells you will need to escape or otherwise protect shell meta-characters from interpretation by the shell. --data '{ "name": "Dave" }' with single-quotes would do it, or --data { \"name\": \"Dave\" } but the spaces in this second example breaks that JSON into several arguments. What you probably want is --data is an arg, and { "name": "Dave" } is an arg, and that's mostly a shell problem (or command.com or cmd.exe on windows) Commented Sep 30, 2015 at 23:57
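Putting the comments' quoting advice together: single-quote the JSON so the shell passes it as one argument, then read it from process.argv. A minimal sketch (the readDataArg helper name is my own, not from the question):

```javascript
// Invoked as: node file.js --data '{ "name": "Dave" }'
// Returns the parsed object following a --data flag, or null if absent.
function readDataArg(argv) {
  const idx = argv.indexOf('--data');
  if (idx === -1 || idx + 1 >= argv.length) return null;
  return JSON.parse(argv[idx + 1]);
}

// In a real script: const data = readDataArg(process.argv.slice(2));
```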

3 Answers


If it's a small amount of data, I'd use https://www.npmjs.com/package/minimist, a command line argument parser for Node.js. It's not JSON, but you can simply pass options like

--name=Foo 

or

-n Foo

I think this is better suited to a command line tool than JSON.

If you have a large amount of data, you're better off creating a JSON file and passing only the file name as a command line argument, so that your program can load and parse it.

Big objects as command line arguments are most likely not a good idea.


1 Comment

Thanks. I think it's a good idea - I was considering that but ideally I'd avoid the file system for performance reasons.

This works for me:

$ node foo.js --json-array='["zoom"]'

then in my code I have:

  import * as _ from 'lodash';

  // cliOpts is the result of dashdash's parse(); --json-array arrives as cliOpts.json_array.
  // The fallback must be the string '[]', since JSON.parse expects a string.
  const parsed = JSON.parse(cliOpts.json_array || '[]');

  _.flattenDeep([parsed]).forEach(item => console.log(item));

I use dashdash, which I think is the best choice when it comes to command line parsing.

To do the same thing with an object, just use:

$ node foo.js --json-object='{"bar": true}'
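For reference, the --flag=value form used above can be sketched without any library (parseJsonFlag is a hypothetical helper, not part of dashdash):

```javascript
// Extracts the value of a --<flag>=<value> argument and parses it as JSON.
// Returns undefined when the flag is not present.
function parseJsonFlag(argv, flag) {
  const prefix = '--' + flag + '=';
  const arg = argv.find(a => a.startsWith(prefix));
  return arg === undefined ? undefined : JSON.parse(arg.slice(prefix.length));
}

// node foo.js --json-object='{"bar": true}'  (the shell strips the single quotes)
```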

3 Comments

Can you provide more detail? I mean, what did you use for the option — or did you create a custom type?
@Rocky With dashdash, --json-array is just a string option, same with --json-object; the string gets parsed with JSON.parse manually in your program.
No problem. By the way, if you need variables in your JSON string, check out heredocs (google bash+heredoc). If you don't use a heredoc, you have to escape every double quote, which is annoying and error-prone.

This might be a bit overkill and not appropriate for what you're doing because it renders the JSON unreadable, but I found that a robust way (as in "works on any OS") to do this is to use base64 encoding.

I wanted to pass around lots of options via JSON between parts of my program (a master node routine calling a bunch of small slave node routines). My JSON was quite big, with annoying characters like quotes and backslashes so it sounded painful to sanitize that (particularly in a multi-OS context).

In the end, my code (TypeScript) looks like this:

in the calling program:

// Buffer.from replaces the deprecated `new Buffer(...)` constructor
const buffer: Buffer = Buffer.from(JSON.stringify(myJson));
const command: string = 'node slave.js --json "' + buffer.toString('base64') + '" --b64';
const slicing: child_process.ChildProcess = child_process.exec(command, ...)

in the receiving program:

let inputJson: string;
if (commander.json) {
  inputJson = commander.json;
  if (commander.b64) {
    // Buffer.from replaces the deprecated constructor; utf8 preserves any non-ASCII characters
    inputJson = Buffer.from(inputJson, 'base64').toString('utf8');
  }
}

(that --b64 flag allows me to still choose between manually entering a normal JSON, or use the base64 version, also I'm using commander just for convenience)
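The round trip this answer describes, condensed into plain Node (no external libraries; the sample payload is my own):

```javascript
// Sender side: stringify, then base64-encode so quotes, spaces and backslashes
// never reach the shell.
const payload = { name: 'Dave', note: 'has "quotes" and \\backslashes\\' };
const encoded = Buffer.from(JSON.stringify(payload)).toString('base64');

// Receiver side: decode, then parse.
const decoded = JSON.parse(Buffer.from(encoded, 'base64').toString('utf8'));
```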

2 Comments

Why do you choose to use buffer for passing around json strings? Are there any advantages or disadvantages?
As far as I remember, it was as I said to get rid of special characters which otherwise may cause OS-dependent issues. The buffer is just how I did the base64 conversion. Also, I added some zip compression on top of it making it more compact, avoiding hitting the command size limit (which is sometimes pretty short). The other alternative, for commands still too large, is just to write the command in a json file and pass that file's name around.
