
I am trying to import this CSV file:

HEADER
"{
  ""field1"": ""123"",
  ""field2"": ""456""
}"
"{
  ""field1"": ""789"",
  ""field2"": ""101""
}"

into my Postgres table.

However, it seems that the \copy my_table(my_text) from 'my_file.csv' command is creating a row for each line of the file.
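For reference, a minimal table that reproduces this (the DDL here is an assumption; only the table and column names come from the command above):

CREATE TABLE my_table (my_text text);  -- assumed: a single text column
\copy my_table(my_text) from 'my_file.csv'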

This is what I get:

                       my_text
-----------------------------------------------------
 HEADER
 "{
   ""field1"": ""123"",
   ""field2"": ""456""
 }"
 "{
   ""field1"": ""789"",
   ""field2"": ""101""
 }"
(9 rows)

and what I expect:

{""field1"": ""123"", ""field2"": ""456""}
{""field1"": ""789"", ""field2"": ""101""}
(2 rows)

2 Answers


Escaping the new line might do the trick:

\copy my_table(my_text) from my_file.csv csv header escape E'\n' quote '"'

2 Comments

Using this method I get 2 rows with '{' as the value.
Are you sure? Keep in mind that the newline is part of the string, so if you're viewing the result set with a tool like pgAdmin you might need to resize the row to see the whole thing. On my machine it works just fine @JamieA

The default format for \copy is "text", not "csv". All you have to do is tell \copy to use the csv format, and that there is a header line.

\copy my_table(my_text) from 'my_file.csv' csv header

Changing the escape character as the other answer suggests is unnecessary and in fact will break things. Your data will load, but will not be valid JSON.
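A quick way to check, assuming the column is still text: casting to json raises an error at the first value that is not valid JSON.

SELECT my_text::json FROM my_table;  -- fails on the first row whose value is not valid JSON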

You probably want to make this column type JSON or JSONB, not text. That will automatically validate the data as valid JSON, and in the case of JSONB will make future parsing faster.
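A sketch of that variant, assuming the same file and names as in the question:

CREATE TABLE my_table (my_text jsonb);  -- jsonb validates on load and is cheaper to query later
\copy my_table(my_text) from 'my_file.csv' csv header

SELECT my_text->>'field1' FROM my_table;  -- e.g. 123 and 789 for the sample file above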

