
I have some expensive queries (each takes around 90 seconds). The good news is that my queries do not change much, so most of them are duplicates. I am looking for a way to cache query results in PostgreSQL. I have searched for an answer but could not find one (some answers are outdated and others are unclear).

  • I use an application that is connected to Postgres directly.

  • The query is a simple SQL query that returns thousands of rows:

SELECT * FROM Foo WHERE field_a < 100

Is there any way to cache a query result for at least a couple of hours?

2 Answers


It is possible to cache expensive queries in Postgres using a technique called a "materialized view". However, given how simple your query is, I'm not sure this will gain you much.

You may be better off caching this information directly in your application, in memory, or, if possible, caching a further-processed set of data rather than the raw rows.

ref:
https://www.postgresql.org/docs/current/rules-materializedviews.html
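For reference, caching this particular query with a materialized view would look roughly like the following sketch (the view name `foo_cache` is illustrative):

```sql
-- Cache the query result as a materialized view; Postgres stores the rows on disk.
CREATE MATERIALIZED VIEW foo_cache AS
SELECT * FROM Foo WHERE field_a < 100;

-- Consumers then read the cached copy instead of re-running the 90-second query:
-- SELECT * FROM foo_cache;

-- Re-run the underlying query on whatever schedule you like (e.g. from cron):
REFRESH MATERIALIZED VIEW foo_cache;
```

Note that the view's contents are frozen between refreshes, which matches the "cache for a couple of hours" requirement.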


7 Comments

I am connected to Postgres through Grafana (a time-series dashboard application), so I am not able to cache the query in the application.
Normally, in a similar situation, I would cache the queries using Redis, but in this case I am not able to do so.
What do your rows look like?
You may wish to consider ETLing the data from Postgres into something like influxdata.com if it is high-frequency time-series data that you wish to display in Grafana. Alternatively, consider whether you can use the materialized view approach to aggregate the data in some way, so that fewer rows need to be returned.
Does your Grafana widget run aggregate functions over the data? If so, you could probably move the aggregations into Postgres onto a materialized view. I can't really offer any better suggestions without more context about what the data looks like and what you are trying to do with it.
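The aggregation idea suggested in the comments could be sketched as follows. This is an assumption-heavy example: the `ts` timestamp column and the hourly bucketing are invented for illustration, since the real schema isn't shown.

```sql
-- Hypothetical: pre-aggregate per-hour averages so Grafana reads few rows.
-- Assumes Foo has a timestamp column ts, which is not shown in the question.
CREATE MATERIALIZED VIEW foo_hourly AS
SELECT date_trunc('hour', ts) AS bucket,
       avg(field_a)           AS avg_field_a
FROM Foo
GROUP BY 1;

-- A unique index is required to refresh without locking out readers:
CREATE UNIQUE INDEX ON foo_hourly (bucket);
REFRESH MATERIALIZED VIEW CONCURRENTLY foo_hourly;
```

Grafana would then point its panel query at `foo_hourly` instead of `Foo`.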

Depending on what your application looks like, a TEMPORARY TABLE might work for you. It is only visible to the connection that created it, and it is automatically dropped when the database session closes.

CREATE TEMPORARY TABLE tempfoo AS
  SELECT * FROM Foo WHERE field_a < 100;

The downside to this approach is that you get a snapshot of Foo when you create tempfoo. You will not see any new data that gets added to Foo when you look at tempfoo.

Another approach: if you have access to the database, you may be able to significantly speed up your queries by adding an index on field_a.
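The index suggestion above would look like this (the index name is illustrative; it assumes the query filters on `field_a` as shown in the question):

```sql
-- A b-tree index lets Postgres find rows matching field_a < 100
-- without scanning the whole table.
CREATE INDEX idx_foo_field_a ON Foo (field_a);
```

Whether this helps depends on selectivity: if most rows satisfy `field_a < 100`, the planner may still choose a sequential scan.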

1 Comment

I am using Grafana, and in my dashboard I have a raw query (the query string) that is executed. I believe a temporary table is not the solution to my problem. I am looking for something like Redis caching, but I would like Postgres to handle it.
