
I would like to import a large R data.frame object into Postgres. I am saving the object as a CSV file using these commands:

> out_file <- paste(input_path, "data.csv", sep="")
> con<-file(out_file, encoding="UTF-8")
> write.csv(df, out_file)

No error messages are shown. Then switching to psql I issue the import with COPY, which results in this error:

# COPY data_in FROM 'data.csv' DELIMITER ',' CSV HEADER;
ERROR:  invalid byte sequence for encoding "UTF8": 0xf8
CONTEXT:  COPY data_in, line 74358

Which piece of software is at fault here? Or do I need to do something differently to get the correct encoding?

  • Try write.csv(df, out_file, fileEncoding = "UTF-8") or write.csv(df, con). You are using the file path despite creating a connection object. Commented Jun 26, 2018 at 8:30
  • @Rohit Well spotted. Would you like to post that as an answer? Commented Jun 27, 2018 at 12:22

1 Answer

From my comment:

write.csv(df, out_file, fileEncoding = "UTF-8")
# write.csv(df, con)

Either of the above will work. Note that fileEncoding expects an encoding name such as "UTF-8", not a logical value. Also, setting an encoding on the connection object has no effect in the original code: write.csv() is given the file path rather than the connection, so the connection is never used.
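To make this concrete, here is a minimal self-contained sketch. The sample df below is invented for illustration; the point is only that fileEncoding takes an encoding name, which forces the CSV bytes on disk to be UTF-8 regardless of the session's native encoding:

```r
# Invented example data containing non-ASCII characters.
df <- data.frame(name = c("Søren", "Ångström"), stringsAsFactors = FALSE)

out_file <- tempfile(fileext = ".csv")

# fileEncoding expects an encoding name, not a logical:
write.csv(df, out_file, fileEncoding = "UTF-8", row.names = FALSE)

# The bytes on disk should now be valid UTF-8, so COPY ... CSV HEADER
# in Postgres will no longer reject them.
stopifnot(all(validUTF8(readLines(out_file, encoding = "UTF-8"))))
```

With the file written this way, the original COPY command in psql should succeed as-is.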
