Sunday, 15 January 2012

sql - Database - How to manage high quantity of data over time with different refresh rates?

I need to design a SQL Server database capable of logging different information on different time scales.

Basically, I have to log information about the battery cells of lots of batteries, at any time.

Here is the basic model of the database. The data types shown on the image are not the ones I use.

I use smallint for most of them (2 bytes); time takes 3 bytes and date 2 bytes.
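
Something like this, for reference (the column names and exact types are only placeholders, not my real model):

    -- Hypothetical per-cell log row; names and types are placeholders.
    CREATE TABLE dbo.CellLog
    (
        CellId   int      NOT NULL,  -- which cell reported
        LogDate  date     NOT NULL,  -- refreshed once a day
        LogTime  time(0)  NOT NULL,  -- refreshed every second (3 bytes at precision 0)
        Amps     smallint NULL,      -- refreshed every second
        Temp1    smallint NULL,      -- refreshed every minute
        CONSTRAINT PK_CellLog PRIMARY KEY (CellId, LogDate, LogTime)
    );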

So imagine:

One cell report file is emitted every 24 hours, but:

each attribute of a cell doesn't refresh at the same frequency.

For instance:

- the time attribute refreshes every second
- the amps attribute refreshes every second
- the temp1 attribute refreshes every minute
- the date refreshes every day

And the cells report 24/7, over years.

If there are 1000 batteries around the world linked to the database, and each battery has, let's say, 20 cells,

that makes 20 000 cells reporting 24/7.

So here is the problem:

If one attribute changes, I don't want the whole row to be stored again; with 20 000 cells that would need about 1 TB per year (with NULL stored instead of the values that were not refreshed).

I hope this explanation is clear enough; don't hesitate to ask for further information.

As usual, I apologize for my English :/

Thank you.

You need 20k inserts per second. That is doable, but it shows how much data this is. Bulk-inserting at that rate is no problem, but you have to keep that data around over time. That's going to be a lot of rows and terabytes.
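
To put rough numbers on it: 20 000 cells reporting once per second is 20 000 rows/sec, i.e. about 20 000 × 86 400 ≈ 1.7 billion rows per day, or on the order of 600 billion rows per year; even at only a few tens of bytes per row, that is terabytes of raw data per year before any indexes.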

I'd consider storing the data in a custom format: store one binary blob per battery and per hour. Inside the blob you are free to encode the changes any way you want. You can store non-changing columns efficiently by simply not storing them at all. You can also compress the blob before storing it.
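
As a rough sketch (the table and column names are mine, just for illustration):

    -- One row per battery per hour; the varbinary blob holds the encoded
    -- readings of all that battery's cells for that hour.
    CREATE TABLE dbo.BatteryLogHour
    (
        BatteryId int            NOT NULL,
        HourStart smalldatetime  NOT NULL,  -- the hour bucket, e.g. 2012-01-15 13:00
        Payload   varbinary(max) NOT NULL,  -- custom-encoded, optionally compressed
        CONSTRAINT PK_BatteryLogHour PRIMARY KEY (BatteryId, HourStart)
    );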

This scheme still gives you indexing on battery and time; only the time resolution of the index is decreased to one hour. The application has to decode the blobs and extract the needed information from them.
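
Reading the data back is then a simple indexed range scan, for example (against the sketch table above, with @BatteryId passed in as a parameter):

    -- Fetch all hourly blobs for one battery for one day;
    -- the application decodes Payload into per-cell values.
    SELECT HourStart, Payload
    FROM dbo.BatteryLogHour
    WHERE BatteryId = @BatteryId
      AND HourStart >= '2012-01-15T00:00:00'
      AND HourStart <  '2012-01-16T00:00:00'
    ORDER BY HourStart;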

The data is very compressible because it is redundant. For compression there are fast schemes like LZO, LZ4 and QuickLZ. You can get even more compression with things like 7z.
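
Those fast compressors would run in the application before the INSERT. If you would rather compress inside the database and are on SQL Server 2016 or later, there are built-in COMPRESS/DECOMPRESS functions (GZip, so slower than LZ4-class schemes but with no extra code); a sketch against the same assumed table:

    -- GZip-compress the application-encoded blob on insert (SQL Server 2016+).
    INSERT INTO dbo.BatteryLogHour (BatteryId, HourStart, Payload)
    VALUES (@BatteryId, @HourStart, COMPRESS(@EncodedPayload));

    -- Decompress when reading it back.
    SELECT HourStart, DECOMPRESS(Payload) AS Payload
    FROM dbo.BatteryLogHour
    WHERE BatteryId = @BatteryId;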

sql sql-server database-design logging model
