I am writing a function for a Python environment which needs to import data from a CSV file into an existing PostgreSQL table. The scope in which the function executes does not allow a db SUPERUSER to run it, so the server-side COPY command is out of the question.

What other options are efficient?

The files range from 100,000 to 1,000,000 rows.

This question has also been posted on dba.stackexchange.com/q/154670/7788.

2 Comments
  • x-posted dba.stackexchange.com/q/154670/7788 Commented Nov 9, 2016 at 3:52
  • This question has also been posted here: dba.stackexchange.com/q/154670/7788 Commented Nov 9, 2016 at 6:50

1 Answer


Use psycopg2's COPY ... FROM STDIN support.
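For illustration, a minimal sketch of that approach, assuming a hypothetical table my_table with columns col1, col2, col3, a CSV file data.csv with a header row, and placeholder connection parameters. psycopg2's copy_expert() streams the file from the client with COPY ... FROM STDIN, which needs only ordinary INSERT privilege on the target table rather than SUPERUSER:

    import psycopg2

    # Connection string, table, and column names are placeholders.
    conn = psycopg2.connect("dbname=mydb user=myuser")
    try:
        with conn.cursor() as cur, open("data.csv", "r") as f:
            # COPY ... FROM STDIN reads the file on the client side,
            # so no server file access (and no SUPERUSER) is required.
            cur.copy_expert(
                "COPY my_table (col1, col2, col3) FROM STDIN WITH (FORMAT csv, HEADER)",
                f,
            )
        conn.commit()  # COPY runs inside the current transaction
    finally:
        conn.close()

For files in the 100,000-to-1,000,000-row range this is typically far faster than running INSERT statements via executemany(), since all rows travel in a single streamed command.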


6 Comments

I will look into that immediately. Thanks a lot!
x-posted. I am not familiar with this term. Is this relating to my posting the exact same question on multiple forums?
Right, "cross-post". It helps people avoid wasting time answering it in multiple places because they don't know it's answered elsewhere. It also helps other people looking at the same question later find an answer. I find it quite rude to post in many places like that without cross-referencing them; it shows disrespect for others' time and for the quality of the knowledge base. If you posted in other places, please edit your question to link to the appropriate forums, list archives, etc.
Not a problem. Thanks for letting me know.
The function requests a buffer size parameter. How do you determine an appropriate buffer size?
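On the buffer size question: in psycopg2, copy_from() and copy_expert() accept an optional size argument, the number of bytes read from the file object per chunk; it defaults to 8192, which is usually adequate. A sketch, reusing the hypothetical cur and f from the example above:

    # size is the read-buffer length in bytes (default 8192); a larger
    # buffer means fewer reads from the file at the cost of memory.
    cur.copy_expert(
        "COPY my_table (col1, col2, col3) FROM STDIN WITH (FORMAT csv, HEADER)",
        f,
        size=64 * 1024,  # e.g. a 64 KiB buffer
    )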
