When it comes to database optimization, developers often reach for the same old tools: rewrite the query slightly differently, slap an index on a column, denormalize, analyze, vacuum, cluster, repeat. Conventional techniques are effective, but sometimes being creative can really pay off!
In this article, I present unconventional optimization techniques in PostgreSQL.
Imagine you have this table of users:
```sql
db=# CREATE TABLE users (
    id INT PRIMARY KEY,
    username TEXT NOT NULL,
    plan TEXT NOT NULL,
    CONSTRAINT plan_check CHECK (plan IN ('free', 'pro'))
);
CREATE TABLE
```
For each user you keep their name and which payment plan they're on. There are only two plans, "free" and "pro", so you add a check constraint.
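To see the constraint in action, any other value is rejected. Here is a minimal sketch (the row values are made up for illustration, and the exact error wording may differ slightly across PostgreSQL versions):

```sql
db=# INSERT INTO users (id, username, plan) VALUES (0, 'test-user', 'basic');
ERROR:  new row for relation "users" violates check constraint "plan_check"
```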
Generate some data and analyze the table:
```sql
db=# INSERT INTO users
SELECT
    n,
    uuidv4(),
    (ARRAY['free', 'pro'])[ceil(random() * 2)]
FROM generate_series(1, 100_000) AS t(n);
INSERT 0 100000

db=# ANALYZE users;
ANALYZE
```
You now have 100K users in the system.
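Since each row picks its plan at random, the two plans should each end up with roughly half the rows. A quick way to check the distribution (query only; the exact counts will differ on every run):

```sql
db=# SELECT plan, count(*) FROM users GROUP BY plan;
```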