I have two servers (say x.x.x.x, the main one, and y.y.y.y, the secondary one).
On the main server I have a Django application running that stores its data in a Postgres database; the secondary server is completely empty.
Every minute the application inserts about 900+ rows into one table, so the table has eventually grown to over 3M rows, and processing all those objects (filtering, sorting) has become really slow because of the volume. However, I only need the rows written within the last 3 days. Still, I cannot simply delete the older data, because I need it for analysis in the future, so I have to keep it somewhere.
What I'm thinking of is creating another database on the secondary server and keeping all the extra data there. So I need to transfer all data older than 3 days from the local (primary) server to the remote (secondary) server.
The regularity can be achieved with cron, which is a trivial task.
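For illustration, the scheduling part might be a crontab entry along these lines (the database name and script path are placeholders, and the actual archiving statement is the open question below):

    # run the archiving job once a day at 03:00
    0 3 * * *  psql -d mydb -f /path/to/archive_old_rows.sql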
What's not trivial is the command I need to execute from cron. I don't think there is a built-in SQL command for this, so I'm wondering whether it's possible at all.
I imagine the command would look something like this (pseudo-SQL):
INSERT INTO remote_server:table
SELECT * FROM my_table;
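From what I've read, something along these lines might work using the postgres_fdw extension, which lets one Postgres server query another. This is only a sketch: `archive_server`, `archive_db`, `archive_user`, `archive_my_table`, and the column list are all made-up names, and the column definitions would have to mirror the real table.

    -- On the primary server; assumes postgres_fdw is installed on it
    -- and a database archive_db already exists on y.y.y.y.
    CREATE EXTENSION IF NOT EXISTS postgres_fdw;

    CREATE SERVER archive_server
        FOREIGN DATA WRAPPER postgres_fdw
        OPTIONS (host 'y.y.y.y', dbname 'archive_db', port '5432');

    CREATE USER MAPPING FOR CURRENT_USER
        SERVER archive_server
        OPTIONS (user 'archive_user', password 'secret');

    -- Foreign table mirroring the local table's columns (columns invented here)
    CREATE FOREIGN TABLE archive_my_table (
        id         bigint,
        created_at timestamptz,
        payload    text
    ) SERVER archive_server OPTIONS (table_name 'my_table');

    -- Move rows older than 3 days in a single statement:
    -- DELETE locally and INSERT the returned rows into the foreign table.
    WITH moved AS (
        DELETE FROM my_table
        WHERE created_at < now() - interval '3 days'
        RETURNING *
    )
    INSERT INTO archive_my_table SELECT * FROM moved;

The last statement could then be the body of the script that cron runs. I'm not sure whether this is the idiomatic approach, or whether something like dblink or a dump-and-restore pipeline would be better, which is essentially my question.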
It's also worth mentioning that the table I'm having trouble with is constantly being written to, as described above, so those concurrent writes may themselves be contributing to the slowness of filter and sort queries.