database - Switching from a table with many columns to an auxiliary table with multiple rows
We have a Postgres database with a table "foo", and auxiliary tables "bar_1" ... "bar_k" that hold additional information for specific types of foos. Basically, you can think of "foo" as a parent class holding the common attributes, and of each "bar_i" table as a subclass of "foo" that adds additional properties.
We are thinking of redesigning the schema so that, instead of several "bar" tables, we have a "foo_metadata" table that lists the columns of each kind of "bar" table, and a "foo_attributes" table that holds the values currently stored in the "bar" tables.
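A minimal sketch of the proposed generic schema, using sqlite3 as a stand-in for Postgres. The table names "foo", "foo_metadata", and "foo_attributes" come from the question; every column name and sample value here is an assumption for illustration only.

```python
# Sketch of the proposed generic schema (sqlite3 standing in for Postgres).
# Column names and sample data are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Shared attributes of every foo (the "parent class").
    CREATE TABLE foo (
        id   INTEGER PRIMARY KEY,
        kind TEXT NOT NULL,           -- which "subclass" this row belongs to
        name TEXT NOT NULL
    );

    -- One row per (kind, column) that a bar_i table used to declare.
    CREATE TABLE foo_metadata (
        kind      TEXT NOT NULL,
        attr_name TEXT NOT NULL,
        PRIMARY KEY (kind, attr_name)
    );

    -- One row per (foo, attribute) value; everything stored as text.
    CREATE TABLE foo_attributes (
        foo_id    INTEGER NOT NULL REFERENCES foo(id),
        attr_name TEXT NOT NULL,
        value     TEXT,
        PRIMARY KEY (foo_id, attr_name)
    );
""")

# What used to be one wide row in "bar_1" becomes k narrow rows here.
conn.execute("INSERT INTO foo VALUES (1, 'bar_1', 'example')")
conn.executemany(
    "INSERT INTO foo_attributes VALUES (?, ?, ?)",
    [(1, "color", "red"), (1, "size", "large"), (1, "weight", "3.5")],
)

rows = conn.execute(
    "SELECT attr_name, value FROM foo_attributes WHERE foo_id = 1 ORDER BY attr_name"
).fetchall()
print(rows)  # three attribute rows instead of one bar_1 row
```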
The advantage of such a scheme is that it is generic: by putting the metadata for each class in the database, applications would not need to be updated, and tests would not need to be rewritten, each time a class is changed or added.
I imagine there would be a performance hit: instead of reading one row from "foo" and one row from a "bar" table, we'd read one row from "foo" and several rows from the "foo_metadata" and "foo_attributes" tables. Likewise for updating (although we'd only be updating values and timestamps, which aren't indexed).
My question is: how much of a performance hit would we get? Are there ways to minimize it? What metrics can we use to estimate the performance hit before we commit a lot of developer time to redesigning the system?
(We're not concerned about Postgres enforcing types on the values, since they are all strings, and the application needs to validate values before they are inserted into the database anyway...)
The model you describe is the generic EAV model (entity–attribute–value model).
There is a ton of information about it on the internet; google "EAV". You can start with the Wikipedia article on the entity–attribute–value model.
The biggest problems with EAV:
- It can be hard to write queries (multiple joins against the same value table instead of multiple columns in a single table).
- You can't apply database constraints to the model (so no referential integrity and no check constraints).