On 7/20/16 1:14 PM, Mark Lybarger wrote:
> This leads me to think I need to create 2^5 or 32 unique constraints to
> handle the various combinations of data that I can store.
Another option would be to create a unique index on a bit varying field
that sets a bit for each field that is NULL, with a WHERE <bit varying
field> != 0 predicate.
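As a sketch of that idea (the table and column names here are made up, and whether a unique index keyed this way matches the exact constraint semantics you need would have to be checked):

```sql
-- Hypothetical three-column table standing in for the real one.
CREATE TABLE t (a int, b int, c int);

-- A bit varying "null bitmap": one bit per column, set when that
-- column is NULL. Used both as the indexed expression and in the
-- partial-index predicate.
CREATE UNIQUE INDEX t_null_bitmap_uniq ON t ((
       (a IS NULL)::int::bit
    || (b IS NULL)::int::bit
    || (c IS NULL)::int::bit
))
WHERE (   (a IS NULL)::int::bit
       || (b IS NULL)::int::bit
       || (c IS NULL)::int::bit ) <> B'000';
```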
Let me know if you want to go that route, I could probably add that to
http://pgxn.org/dist/count_nulls/ without much difficulty. Though a
better way to accomplish that would probably be to add a function to
count_nulls that returns an array of the fields that are NULL; you
could then create a unique index on that array WHERE array != '{}'.
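That could look roughly like the following. Note that null_fields() is hypothetical here — it is the function being proposed for count_nulls, imagined as returning a text[] naming the arguments that are NULL:

```sql
-- null_fields(...) does not exist yet; it is the proposed addition,
-- assumed to return a text[] of which arguments are NULL.
CREATE UNIQUE INDEX t_null_fields_uniq
    ON t ((null_fields(a, b, c)))
 WHERE null_fields(a, b, c) <> '{}'::text[];
```

For use in an index expression like this, the function would need to be marked IMMUTABLE.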
Maybe a less obtuse option would be to use a boolean array. Storage
would be ~8x larger, but since there should be very few rows I doubt
that matters.
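The boolean-array variant needs no extension at all, since the array can be built inline (again with made-up column names):

```sql
-- Index the array of "is NULL" flags directly; only rows with at
-- least one NULL participate, via the partial-index predicate.
CREATE UNIQUE INDEX t_null_bools_uniq ON t ((
    ARRAY[a IS NULL, b IS NULL, c IS NULL]
))
WHERE ARRAY[a IS NULL, b IS NULL, c IS NULL]
      <> ARRAY[false, false, false];
```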
--
Jim Nasby, Data Architect, Blue Treble Consulting, Austin TX
Experts in Analytics, Data Architecture and PostgreSQL
Data in Trouble? Get it in Treble! http://BlueTreble.com
855-TREBLE2 (855-873-2532) mobile: 512-569-9461