Ensure array column values are disjoint sets in Postgres - postgresql

If I have a table with a column containing an array of strings, how can I create a constraint to ensure that the arrays in all rows are disjoint sets (i.e. no string is present in the array of more than one row)?
This is for Postgres; I assume it would need a GIN index and the && array-overlap operator.
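One approach I'm aware of (a sketch, with illustrative table and column names, not taken from the question): an exclusion constraint using the && overlap operator. Note that exclusion constraints need a GiST-capable operator class rather than GIN; for integer arrays, the intarray extension provides one:

```sql
CREATE EXTENSION IF NOT EXISTS intarray;

CREATE TABLE groups (
    id      serial PRIMARY KEY,
    members int[]  NOT NULL,
    -- Reject any new row whose members array overlaps (&&) the
    -- members array of any existing row
    EXCLUDE USING gist (members gist__int_ops WITH &&)
);
```

For text arrays there is, as far as I know, no built-in GiST operator class supporting &&, so a common workaround is a normalized child table with one row per element and a plain UNIQUE constraint on the element column.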

Related

PostgreSQL get array column value from second index

I have a table with a column that contains an array of numbers, like: ['1','3','45'].
I'm looking for a way to select the column value from the second index onward, so that I get ['3','45'].
Is there any way to do this in PostgreSQL?
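A sketch using array slice syntax (table and column names are illustrative, not from the question):

```sql
-- Slice from the second element to the end. PostgreSQL 9.6+ allows
-- omitting the upper bound of a slice:
SELECT nums[2:] FROM t;

-- On older versions, spell the upper bound out explicitly:
SELECT nums[2:array_length(nums, 1)] FROM t;
```

Note that Postgres arrays are 1-based by default, so index 2 is the second element.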

PostgreSQL: Index JSONB array that is queried with the `@?` operator

My table (table) has a JSONB field (data) that contains an array field where I store tags (tags).
I query that table with an expression like:
SELECT * FROM table WHERE data->'tags' @? '$[*] ? (@ like_regex ".*(foo|bar).*" flag "i")';
With this use case, is there a way for me to index the data->'tags' array to speed up the query? Or should I instead move the tags array out of the JSONB field into a TEXT[] field and index that?
I've already tried:
CREATE INDEX foo ON tbl USING GIN ((data->'tags') jsonb_path_ops);
but it doesn't work: https://gist.github.com/vkaracic/a62ac917d34eb6e975c4daeefbd316e8
The index you built can be used (if you set enable_seqscan=off, you will see that it does get used), but it is generally not chosen because it is pretty useless for this query. The only rows it would rule out through the index are the ones that don't have the 'tags' key at all, and even that is poorly estimated, so it probably won't be used without drastic measures.
You could try converting to text[] and then using parray_gin, but it would probably be better to convert to a child table with a text column and then use pg_trgm.
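A hedged sketch of the child-table-plus-pg_trgm route the answer suggests (table and column names are illustrative, and an id primary key on tbl is assumed):

```sql
CREATE EXTENSION IF NOT EXISTS pg_trgm;

-- One row per tag, flattened out of the JSONB array
CREATE TABLE tbl_tags (
    tbl_id int  NOT NULL REFERENCES tbl (id),
    tag    text NOT NULL
);

INSERT INTO tbl_tags (tbl_id, tag)
SELECT id, jsonb_array_elements_text(data->'tags') FROM tbl;

-- Trigram GIN index supports regex and LIKE matching on the tags
CREATE INDEX tbl_tags_trgm ON tbl_tags USING GIN (tag gin_trgm_ops);

-- The original query then becomes an EXISTS over the child table:
SELECT *
FROM tbl t
WHERE EXISTS (
    SELECT 1 FROM tbl_tags x
    WHERE x.tbl_id = t.id AND x.tag ~* '(foo|bar)'
);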

Ideal postgres index for non unique varchar column

I need to create a varchar category column in a table and search for rows that belong to a particular category.
i.e. ALTER TABLE items ADD COLUMN category VARCHAR(30)
The number of categories is very small (repeated across the table),
and the intention is to only use = in the WHERE clause.
e.g. select * from items where category = 'food'
What kind of index would be ideal in Postgres,
especially if the table is never expected to grow large (always fewer than 5,000 rows)?
This is a textbook use case for a hash index: you have a very small number of distinct values and only use the equality operator to query them. A hash index stores a relatively small hash of each value, which allows for faster equality lookups.
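The answer's suggestion in SQL form (index name is illustrative):

```sql
-- Hash index on the low-cardinality category column
CREATE INDEX items_category_hash ON items USING HASH (category);

-- Equality lookups like this can then use the hash index:
SELECT * FROM items WHERE category = 'food';
```

One caveat worth noting: hash indexes are crash-safe and replicated only from PostgreSQL 10 onward, so on older versions a plain B-tree index is the safer choice. For a table that stays under 5,000 rows, a sequential scan may well be just as fast either way.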

Set partial unique index on table creation in PostgreSQL

When defining unique indexes in PostgreSQL you have two options:
Define the unique index inline, as a constraint in the CREATE TABLE statement
Define the index later, after the table is created, using a separate CREATE INDEX statement
However, when it comes to partial unique indexes, is it possible to create one together with the table in a single statement, or do I have to use a separate statement?
I would prefer to define all the indexes with the table.
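For reference (table and column names are illustrative): as far as I know, a UNIQUE table constraint accepts no WHERE clause, so a partial unique index requires its own CREATE INDEX statement:

```sql
CREATE TABLE orders (
    id          serial  PRIMARY KEY,
    customer_id int     NOT NULL,
    active      boolean NOT NULL DEFAULT true
);

-- Partial unique indexes cannot be declared inline in CREATE TABLE;
-- they need a separate statement with the WHERE predicate:
CREATE UNIQUE INDEX orders_one_active_per_customer
    ON orders (customer_id) WHERE active;
```

This enforces at most one active order per customer while allowing any number of inactive ones.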

Zend_Db_Table_Abstract - should we use our table column names?

Zend_Db_Table_Abstract's insert method accepts an array of $data containing column/value pairs.
Should the column names inside that array correspond exactly to our database table column names?
Yes.
We need to make sure that the keys of the array passed correspond exactly to our table column names.