I have a Postgres table that I need to store arbitrary key/value pairs in, and some of the values could be arrays. I see that Postgres supports JSONB and array column types; I was considering JSONB and have been reviewing its documentation on indexing.

What I'm trying to understand is whether the field can be indexed effectively, given that the JSON schemas stored in it will become increasingly varied over time.
For example, say I have a table to store plugin data, and those plugins will each be allowed to store as many keys as they want, per user.
Can I effectively store data in this field, and have its indexes perform well, given that:
- the number of plugins will increase, but will probably be < 100
- the json field values will mostly be simple json objects, one level deep
- I am in full control of what the JSON schema will look like for each plugin, if that helps.
plugin_data
| key | value (JSONB) | plugin_type | userId |
|---|---|---|---|
| plugin.one | {"id": 1, "test": "two"} | one | 100 |
| plugin.one | {"id": 32, "test": "my title"} | one | 102 |
| plugin.two | {"schedule": 52, "count": 5, "increment": 2} | two | 200 |
| plugin.three | {"duration": "10s"} | three | 200 |
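For reference, here is a rough sketch of the table and the kind of index I'm considering (names follow the example above; `user_id` is just the snake_case form of `userId`, and the GIN syntax is the standard Postgres jsonb indexing form):

```sql
CREATE TABLE plugin_data (
    key         text    NOT NULL,
    value       jsonb   NOT NULL,
    plugin_type text    NOT NULL,
    user_id     integer NOT NULL
);

-- A single GIN index on the whole jsonb column supports containment (@>)
-- and key-existence (?) queries across all plugin schemas:
CREATE INDEX plugin_data_value_idx
    ON plugin_data USING gin (value);

-- Example containment query that the index can serve:
SELECT *
FROM plugin_data
WHERE plugin_type = 'one'
  AND value @> '{"test": "my title"}';

-- Alternative: jsonb_path_ops builds a smaller, faster index,
-- but it only supports the @> operator:
-- CREATE INDEX plugin_data_value_path_idx
--     ON plugin_data USING gin (value jsonb_path_ops);
```

Would a single GIN index like this keep working well as the number of distinct schemas in `value` grows, or would I be better off with per-plugin expression indexes on specific keys?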