This is valid JSON (I've run it against two JSON validators and also parsed it with PowerShell):
{
"actionCD": "error",
"NotesTXT": "\"Exception call timeout\""
}
This is not valid JSON:
{
"actionCD": "error",
"NotesTXT": "\\"Exception call timeout\\""
}
However, Snowflake's PARSE_JSON function fails on the first example:
SELECT '{ "actionCD": "error", "NotesTXT": "\"Exception call timeout\"" }' as json_str
,PARSE_JSON(json_str) as json;
Error parsing JSON: missing comma, pos 38
And, unexpectedly, Snowflake's PARSE_JSON function accepts the invalid JSON:
SELECT '{ "actionCD": "error", "NotesTXT": "\\"Exception call timeout\\"" }' as json_str
,PARSE_JSON(json_str) as json;
<No Errors>
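For what it's worth, the two results become consistent if the backslash is consumed by Snowflake's single-quoted string literal before PARSE_JSON ever runs (that's my working assumption): \" in the literal collapses to a bare ", so the parser actually receives two adjacent quotes, while \\" collapses to \". A quick simulation of what the parser would see, using Python's json module:

```python
import json

# Simulate what PARSE_JSON receives AFTER the single-quoted SQL literal
# has processed backslash escapes (my assumption about the cause).
# '\"' in the SQL literal collapses to a bare quote:
seen_case_1 = '{ "actionCD": "error", "NotesTXT": ""Exception call timeout"" }'
# '\\"' in the SQL literal collapses to backslash + quote:
seen_case_2 = '{ "actionCD": "error", "NotesTXT": "\\"Exception call timeout\\"" }'

try:
    json.loads(seen_case_1)
except json.JSONDecodeError as e:
    # Fails with a "missing comma"-style error, like Snowflake's message
    print("case 1 is invalid JSON:", e)

print(json.loads(seen_case_2)["NotesTXT"])  # → "Exception call timeout" (quotes included)
```

So the first SELECT hands PARSE_JSON broken JSON, and the second hands it the valid JSON, which would explain both results.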
This leaves me thoroughly flummoxed and unsure how to proceed. I'm using PowerShell to programmatically create valid JSON, then trying to insert that JSON into Snowflake with an INSERT INTO ()...SELECT ... statement.
Here is the insert statement I'm trying to build in powershell:
INSERT INTO DBNAME.SCHEMANAME.TABLENAME(
RunID
,jsonLogTXT
) SELECT
'$RunID'
,parse_json('$($mylogdata | ConvertTo-Json)')
;
# where $($mylogdata | ConvertTo-Json) outputs valid JSON that, from time to time, includes \" to escape double quotes.
# But Snowflake fails because Snowflake wants \\" to escape the double quotes.
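If the single-quoted literal really is the culprit, the transformation I'd need before splicing the JSON into the statement is: double every backslash, and escape any single quotes so they don't terminate the literal. A sketch of that logic in Python (the transformation itself is language-agnostic); escape_for_snowflake_literal is a hypothetical helper name, not a Snowflake API:

```python
# Hypothetical helper: prepare a JSON string for splicing into a
# Snowflake single-quoted literal. Doubling backslashes turns the
# JSON escape \" into \\", which survives literal processing as \".
# Single quotes are doubled, which Snowflake also accepts in '...'.
def escape_for_snowflake_literal(json_str: str) -> str:
    return json_str.replace("\\", "\\\\").replace("'", "''")

# A payload whose value contains an escaped quote (\" in the raw JSON):
raw = '{"NotesTXT": "\\"Exception call timeout\\""}'
print(escape_for_snowflake_literal(raw))  # → {"NotesTXT": "\\"Exception call timeout\\""}
```

In PowerShell the equivalent chain would be something like $json.Replace('\','\\').Replace("'","''") (using .NET's literal String.Replace rather than the regex -replace operator). The cleaner route, if the connector I'm using supports it, would presumably be to bind the JSON as a query parameter so no literal escaping is needed at all.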
Is this expected? (Obviously I find it unexpected :-) ). What is the recommended approach here? Should I search my JSON-stored-as-a-string in PowerShell for " and replace it with \" before sending it on to Snowflake? That feels really hacky, though.
(I suspect Snowflake treats \ as an escape character in its '...' strings.)