
I am dealing with a great number of inserts per second into a Postgres DB (and a lot of reads too). A few days ago I heard about Redis and started thinking about sending all these INSERTs to Redis first, to avoid a lot of open/insert/close operations in Postgres every second. Then, after some short period, I could group that data from Redis into a single INSERT SQL statement and run it against Postgres with only one connection open. The system stores GPS data, and an online map reads it in real time. Any suggestions for that scenario? Thanks!
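For what it's worth, here is a minimal sketch of that buffering idea in Python, assuming the redis and psycopg2 packages; the gps:buffer key and gps_points table are made-up names. Incoming fixes are appended to a Redis list, and a periodic job drains the list and writes one multi-row INSERT over a single Postgres connection:

```python
import json

import redis
import psycopg2
from psycopg2.extras import execute_values

r = redis.Redis(host="localhost", port=6379)

def record_fix(device_id, lon, lat, ts):
    """Called on every incoming GPS fix; a cheap O(1) append in Redis."""
    r.rpush("gps:buffer", json.dumps([device_id, lon, lat, ts]))

def flush_to_postgres(conn):
    """Called every few seconds: drain the buffer and batch-insert."""
    # redis-py pipelines wrap commands in MULTI/EXEC by default, so the
    # read-and-delete pair is atomic and no fix is lost or duplicated.
    pipe = r.pipeline()
    pipe.lrange("gps:buffer", 0, -1)
    pipe.delete("gps:buffer")
    raw, _ = pipe.execute()
    if not raw:
        return
    rows = [tuple(json.loads(item)) for item in raw]
    with conn.cursor() as cur:
        # One multi-row INSERT instead of thousands of single-row ones.
        execute_values(
            cur,
            "INSERT INTO gps_points (device_id, lon, lat, ts) VALUES %s",
            rows,
        )
    conn.commit()
```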

  • The latest unstable Redis version (3.2) includes built-in GeoSpatial indexes. See Introducing the GEO API. Commented Dec 7, 2015 at 20:45
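Since the map in question reads positions in real time, a hedged sketch of the GEO commands that comment refers to (Redis 3.2+), again in Python; the key and member names are invented, and raw execute_command is used so the snippet does not depend on any particular redis-py version's geoadd signature:

```python
import redis

r = redis.Redis()

# GEOADD key longitude latitude member -- index one vehicle position
r.execute_command("GEOADD", "vehicles", -122.2711, 37.8044, "bus:42")

# GEORADIUS key lon lat radius unit [WITHDIST] -- vehicles within 5 km
nearby = r.execute_command(
    "GEORADIUS", "vehicles", -122.2711, 37.8044, 5, "km", "WITHDIST"
)
print(nearby)  # e.g. [[b'bus:42', b'0.0001']]
```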

1 Answer


I do not know how important it is in your case to have the data available to your users in near real time. But from what you listed above, I do not see anything that cannot be solved with PostgreSQL configuration/replication.

  • You have a lot of writes to your database: before reaching for a different technology, consider that PostgreSQL is battle-tested, and I am sure you can get it to handle more writes once it is properly tuned (see the illustrative settings after this list).

  • You have a lot of reads to your database: master-slave replication lets you direct all read traffic to the DB slaves, and you can scale horizontally as much as you need (the replication-related settings appear in the same sketch below).
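As an illustration of both points, a few commonly adjusted postgresql.conf settings (example values only, not a recommendation; the right numbers depend on your hardware and on how much durability you can trade for throughput):

```
# postgresql.conf -- illustrative write-throughput settings
synchronous_commit = off            # batch WAL flushes; risks losing the
                                    # last few transactions on a crash
wal_buffers = 16MB                  # larger WAL buffer for bursty writes
checkpoint_completion_target = 0.9  # spread checkpoint I/O over time
max_wal_size = 4GB                  # fewer, larger checkpoints (9.5+)

# Settings a streaming-replication primary needs so read-only
# standbys can be attached (9.x era; wal_level = replica on 9.6+)
wal_level = hot_standby
max_wal_senders = 5
```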

