
I have to import a compressed sql dump into a PostgreSQL database. The dump gets created by:

pg_dump -F c

So I get a compressed file that I can't simply parse line by line and feed to psycopg2 to load into my database.

The compressed dumps I'm dealing with are quite big (up to 10GB). What would be an efficient way to import them?

  • Is there a reason you can't re-import using the command line? Doing it through Python is going to be inefficient. Commented Feb 10, 2011 at 21:48
  • If you have to "control" it via Python, you can always execute the necessary command line command from within Python. Commented Feb 10, 2011 at 21:50
  • Why not just use postgres's own tools, such as pg_restore? Commented Feb 10, 2011 at 22:34
  • Do you know what compression algorithm it uses? Commented Feb 10, 2011 at 22:35
  • @Keith: Its own format, as the man page says. The file is 'src/bin/pg_dump/pg_backup_archiver.c' in the source tree. It uses zlib, but it has its own structure; it's not just zipped. Your best bet would probably be interfacing with it in C using the pg headers, if you really have to do it this way. Commented Feb 10, 2011 at 23:26

1 Answer


You basically can't do that unless you reimplement pg_restore in your Python project. Consider instead calling pg_restore from your Python program.
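A minimal sketch of that approach: build a pg_restore command line and invoke it with the standard-library subprocess module. The dump path, database name, and option choices here are placeholders, not anything from the question; adjust them for your setup. The --jobs flag (parallel restore) is what makes multi-GB custom-format dumps restore reasonably fast.

```python
import os
import shutil
import subprocess

def restore_cmd(dump_path: str, dbname: str, jobs: int = 4) -> list[str]:
    """Build a pg_restore invocation for a custom-format (-F c) dump.

    dump_path and dbname are placeholders; adjust for your environment.
    """
    return [
        "pg_restore",
        f"--dbname={dbname}",  # target database to restore into
        "--no-owner",          # skip ownership commands (handy when restoring as another user)
        f"--jobs={jobs}",      # parallel workers; speeds up multi-GB restores
        dump_path,
    ]

cmd = restore_cmd("backup.dump", "mydb")
# Only attempt the restore when pg_restore and the dump file are actually present.
if shutil.which("pg_restore") and os.path.exists("backup.dump"):
    subprocess.run(cmd, check=True)
```

Authentication is handled the same way as for any PostgreSQL client tool, e.g. via a ~/.pgpass file or the PGPASSWORD environment variable, so the Python side stays a thin wrapper around the command line.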

