
Is there a best practice for handling JSON documents with duplicated keys in PowerShell?

Ideally, I would like to compile the values of such duplicated keys into a single array mapped to the key.

For example:

{
  "column01" : "id1",
  "column02" : "id2",
  "column03" : "id3",
  "column03" : "id4"
}

Transformed to:

{
  "column01" : "id1",
  "column02" : "id2",
  "column03" : [
                 "id3",
                 "id4"
               ]
}

I have been exploring options with the ConvertTo-Json cmdlet, but have not found a solution.

Appreciate the help!

That is not valid JSON. What generated it? Perhaps that should be fixed. Commented Nov 30, 2018 at 20:28

1 Answer


While duplicate keys are not strictly forbidden by the JSON specification, they are not recommended, and I suggest handling this by normalizing the JSON. You can certainly pass your JSON with the duplicated key to ConvertFrom-Json, but that won't produce your desired output; rather than merging the values into an array, it will typically either report a duplicated-keys error or keep just one of the values, depending on the PowerShell version.
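For illustration, here is one way to see that behavior for yourself (on Windows PowerShell 5.1 this typically fails with a duplicated-keys error; newer versions may instead silently keep one of the values):

$raw = @"
{
  "column01" : "id1",
  "column02" : "id2",
  "column03" : "id3",
  "column03" : "id4"
}
"@

try {
    # Attempt to parse the JSON that still contains the duplicated key.
    $raw | ConvertFrom-Json
}
catch {
    # Windows PowerShell 5.1 usually lands here with a duplicated-keys error.
    Write-Warning "ConvertFrom-Json could not handle the duplicate key: $_"
}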

It should be...

$obj = @"
{
  "column01" : "id1",
  "column02" : "id2",
  "column03" : ["id3", "id4"]
}
"@

Then use $json = $obj | ConvertFrom-Json to convert it into a PowerShell object.
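
For example, a quick check on the parsed object (using the $obj here-string above):

$json = $obj | ConvertFrom-Json

# column03 comes back as an array holding both values.
$json.column03        # id3, id4
$json.column03.Count  # 2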

Going the other way, you can build the same structure as a PowerShell hashtable and convert it to JSON:

$obj = [ordered]@{
  "column01" = "id1";
  "column02" = "id2";
  # The duplicated values are grouped into an array up front.
  "column03" = @("id3", "id4")
}

Using [ordered] keeps the keys in the order shown; a plain @{ } hashtable does not guarantee key order in the resulting JSON.

$json = $obj | ConvertTo-Json

$json

If you want to know how to normalize the data, I suggest you either edit your question or ask a new one.
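
That said, here is a minimal sketch of one way such a normalization could look. It assumes the input is a flat JSON object whose values are simple quoted strings, as in your example, and it pairs keys and values with a regular expression rather than a real JSON parser, so treat it as an illustration only:

# Merge duplicated keys of a flat JSON object into arrays.
# Assumes simple "key" : "value" pairs with no nesting or escaped quotes.
$raw = @"
{
  "column01" : "id1",
  "column02" : "id2",
  "column03" : "id3",
  "column03" : "id4"
}
"@

$merged = [ordered]@{}
foreach ($m in [regex]::Matches($raw, '"([^"]+)"\s*:\s*"([^"]*)"')) {
    $key   = $m.Groups[1].Value
    $value = $m.Groups[2].Value
    if ($merged.Contains($key)) {
        # Duplicate key: promote the existing value to an array and append.
        $merged[$key] = @($merged[$key]) + $value
    }
    else {
        $merged[$key] = $value
    }
}

$merged | ConvertTo-Json

ConvertTo-Json on the resulting ordered dictionary then produces the shape you described, with column03 emitted as an array.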

