I have a large PostgreSQL database that is effectively read-only, except for very infrequent batch updates. Are there any performance optimizations I can make to take advantage of this? For example, can/should I disable the visibility check?
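One concrete thing I have been considering (a sketch only, not something I have benchmarked) is freezing the table after each batch load so the visibility map stays all-visible and index-only scans never need to consult the heap:

```
-- Assumption: run once after each infrequent batch update, while nothing
-- else is writing. FREEZE marks all tuples frozen and sets the visibility
-- map, so index-only scans can skip heap visibility checks; ANALYZE
-- refreshes planner statistics for the newly loaded data.
VACUUM (FREEZE, ANALYZE) gene_measurements;
```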
The largest table:
CREATE TABLE "gene_measurements" (
"gene" INTEGER NOT NULL REFERENCES "genes" ON DELETE CASCADE,
"sample" INTEGER NOT NULL REFERENCES "samples" ON DELETE CASCADE,
"value" REAL NOT NULL
);
CREATE UNIQUE INDEX "gene_measurements_unique_1" ON "gene_measurements" ("sample", "gene") INCLUDE ("value");
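Because the data only changes during batch loads, I have also been considering physically reordering the table to match this index after each load, purely for locality of the (sample, gene) lookups. A sketch of the idea (I am not sure whether CLUSTER accepts a covering INCLUDE index, and it takes an exclusive lock, which is acceptable in my maintenance window):

```
-- Assumption: run during the batch-load maintenance window.
-- CLUSTER rewrites the table in the order of the given index, which may
-- improve cache/IO locality for lookups on (sample, gene).
CLUSTER gene_measurements USING gene_measurements_unique_1;
ANALYZE gene_measurements;
```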
A typical query:
SELECT value FROM gene_measurements WHERE gene = 1 AND sample = 2;
And the plan:
-------------------------------------------------------------------------------------------------------------------------------------------------------
Index Only Scan using gene_measurements_gene_index on gene_measurements (cost=0.57..4.59 rows=1 width=4) (actual time=63.621..63.621 rows=0 loops=1)
  Index Cond: ((sample = 2) AND (gene = 1))
  Heap Fetches: 0
Planning Time: 0.156 ms
Execution Time: 63.674 ms
(5 rows)
Please post the plan generated with explain (analyze, buffers, format text) (not just a "simple" explain) as formatted text, and make sure you keep the indentation of the plan. Paste the text, then put ``` on the line before the plan and on a line after it. Please also include complete create index statements for all indexes.
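In other words, the request is to capture and post the output of something along these lines (using the typical query from above, with the FROM clause spelled out):

```
EXPLAIN (ANALYZE, BUFFERS, FORMAT TEXT)
SELECT value
FROM gene_measurements
WHERE gene = 1 AND sample = 2;
```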