I have an original database [1] from which I accidentally deleted a lot of data with a bad query.
I have a backup database [2] that is a copy of the original database [1], taken before anything was deleted. Now I want to move the wrongfully deleted data from my backup database [2] back into my original database [1].
I need to make sure that no duplicates are created in this process, as some of the data is still in my original database [1].
My databases have the structure:
-----------------------------------------------------
| id (serial - auto incrementing int) | - primary key
| did (varchar) |
| sid (int) |
| timestamp (bigint) |
| data (json) |
| db_timestamp (bigint) |
-----------------------------------------------------
I have tried to find a solution by Googling, but to no avail. Based on my SQL knowledge I don't think this can be done directly (from one database to another), but I am quite ready to write a Python script if that is what it takes (I am well versed in Python). I am running PostgreSQL 9.6 and I am using pgAdmin 3 to write queries to my DB.
The table I need to transfer from is called datastore, and I hope that someone has a good idea of how to perform this data transfer without creating duplicates.
I hope I got every detail nailed down, if not, let me know and I will provide it.
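For context, something along these lines might work entirely in SQL using the dblink extension, since the primary key makes duplicates easy to reject. The connection string, database name, and the sequence fix-up are assumptions based on the schema described above; this is a sketch, not a tested solution:

```sql
-- Run in the original database [1]; requires: CREATE EXTENSION dblink;
-- Connection string below is a placeholder -- adjust dbname/host/user.
INSERT INTO datastore (id, did, sid, "timestamp", data, db_timestamp)
SELECT *
FROM dblink('dbname=backup_db host=localhost user=postgres',
            'SELECT id, did, sid, "timestamp", data, db_timestamp FROM datastore')
     AS backup(id int, did varchar, sid int, "timestamp" bigint,
               data json, db_timestamp bigint)
ON CONFLICT (id) DO NOTHING;  -- skip rows whose id already exists (9.5+)

-- After inserting rows with explicit ids, realign the serial sequence
-- so future inserts don't collide:
SELECT setval(pg_get_serial_sequence('datastore', 'id'),
              (SELECT max(id) FROM datastore));
```

ON CONFLICT (id) DO NOTHING works here because id is the primary key, so any row still present in the original is silently skipped.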
UPDATE
I guess it would be a good idea to mention that the problem I encountered on my original database [1] was that I deleted a little too much data. The rows I want transferred are the rows that I accidentally deleted.
So I assume that it would be good enough to check whether the id of each row already exists. If it does, I should skip the row in question; if not, I should transfer it.
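If it ends up being a Python script, that id-exists check is exactly the pattern below. A minimal sketch using stdlib sqlite3 in-memory databases as a stand-in (for PostgreSQL you would open two psycopg2 connections instead, but the logic is identical; table contents here are made up):

```python
import sqlite3

def make_db(rows):
    # In-memory stand-in for a datastore table; schema simplified to id + data.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE datastore (id INTEGER PRIMARY KEY, data TEXT)")
    conn.executemany("INSERT INTO datastore VALUES (?, ?)", rows)
    return conn

# original [1] lost rows 2 and 3; backup [2] still has everything
original = make_db([(1, "a"), (4, "d")])
backup = make_db([(1, "a"), (2, "b"), (3, "c"), (4, "d")])

# Collect ids that survived in the original, then copy over only the
# backup rows whose id is not already present.
existing = {row[0] for row in original.execute("SELECT id FROM datastore")}
missing = [row for row in backup.execute("SELECT id, data FROM datastore")
           if row[0] not in existing]
original.executemany("INSERT INTO datastore VALUES (?, ?)", missing)
original.commit()

print(sorted(r[0] for r in original.execute("SELECT id FROM datastore")))
# -> [1, 2, 3, 4]
```

For a real datastore table you would select all six columns and fetch the backup rows in batches rather than all at once if the table is large.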
Any suggestions are welcome! I am not the brightest SQL hawk ;)
Comments:
- COPY from db to CSV and then from CSV to another db, or use dblink for it. But from your description I have a feeling you speak of a table when you say database. Please update the post with the backup and restore commands you used.
- select and insert. Depending on how the data ended up in your tables, your timestamp column might be a good candidate for a unique constraint, rejecting any duplicates.
- copy and dblink would work for you.
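The COPY-via-CSV route suggested in the comments could look roughly like this; the file path is a placeholder, and \copy is a psql meta-command (from pgAdmin you would need server-side COPY with sufficient permissions instead). A sketch under those assumptions:

```sql
-- In the backup database [2]: dump the table to CSV
\copy datastore TO '/tmp/datastore_backup.csv' WITH (FORMAT csv)

-- In the original database [1]: load into a staging table, then
-- insert only the rows whose id is not already present
CREATE TEMP TABLE datastore_staging (LIKE datastore);
\copy datastore_staging FROM '/tmp/datastore_backup.csv' WITH (FORMAT csv)

INSERT INTO datastore
SELECT * FROM datastore_staging s
WHERE NOT EXISTS (SELECT 1 FROM datastore d WHERE d.id = s.id);
```

The staging table avoids duplicate-key errors during the load, since the anti-join filters out surviving rows before they touch the real table.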