Thursday, 15 September 2011

php - How to approach multi-million data selection -

I have a table that stores specific updates for customers.

A sample of the table:

record_id | customer_id | unit_id | time_stamp | data1 | data2 | data3 | data4 | more

When I created the application, I did not realize how much the table would grow -- it now has over 10 million records after 1 month. I am facing issues where PHP stops executing due to the amount of time the queries take. The queries produce top-1 results, based on time_stamp + customer_id + unit_id.
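
For illustration, here is a minimal sketch of the kind of query I'm running (the table name "updates" and the connection details are placeholders; the columns are from the sample above):

    <?php
    // Hypothetical version of the slow top-1 query: the latest record for
    // one customer/unit pair, ordered by time_stamp. The table name
    // "updates" and the connection details are placeholders.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    $customerId = 123; // placeholder values
    $unitId     = 45;

    $stmt = $pdo->prepare(
        'SELECT * FROM updates
         WHERE customer_id = ? AND unit_id = ?
         ORDER BY time_stamp DESC
         LIMIT 1'
    );
    $stmt->execute([$customerId, $unitId]);
    $latest = $stmt->fetch(PDO::FETCH_ASSOC);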

How would you suggest handling this type of issue? For example, I could create a new table for each customer, although I don't think that is the solution.

I am stuck, with no solution in mind.

If you're on the cloud (where you're charged for moving data between the server and the DB), ignore this answer.

Move the logic to the server.

The fastest query is a SELECT that WHEREs on the primary key. It won't matter how big the database is; it will come back as fast as from a table of 1 row (as long as the hardware isn't unbalanced).
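
As a minimal sketch (assuming, as below, that record_id is the primary key, plus the same placeholder table and connection details):

    <?php
    // A lookup that WHEREs on the primary key stays fast no matter how
    // large the table grows. "updates" and the connection details are
    // placeholders; record_id being the primary key is an assumption.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    $stmt = $pdo->prepare('SELECT * FROM updates WHERE record_id = ?');
    $stmt->execute([42]); // 42 is a placeholder record_id
    $row = $stmt->fetch(PDO::FETCH_ASSOC);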

I can't tell what you're doing with your query, but first download just the sorting and limiting information into PHP. Once you've got what you need, select the data straight, WHERE-ing on record_id (I assume that's the primary key).
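
Something along these lines (a sketch under the same placeholder assumptions, not a drop-in solution):

    <?php
    // Step 1: pull only the narrow sorting/limiting columns into PHP.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $rows = $pdo->query(
        'SELECT record_id, customer_id, unit_id, time_stamp FROM updates'
    )->fetchAll(PDO::FETCH_ASSOC);

    // Step 2: compute the top-1 result per (customer, unit) in PHP.
    $latest = [];
    foreach ($rows as $r) {
        $key = $r['customer_id'] . ':' . $r['unit_id'];
        if (!isset($latest[$key]) || $r['time_stamp'] > $latest[$key]['time_stamp']) {
            $latest[$key] = $r;
        }
    }

    // Step 3: fetch the full rows straight, WHERE-ing on the primary key.
    $stmt = $pdo->prepare('SELECT * FROM updates WHERE record_id = ?');
    foreach ($latest as $r) {
        $stmt->execute([$r['record_id']]);
        $full = $stmt->fetch(PDO::FETCH_ASSOC);
        // ... process $full ...
    }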

Since the on-demand data looks pretty computationally intensive and huge, I recommend using a faster language: http://blog.famzah.net/2010/07/01/cpp-vs-python-vs-perl-vs-php-performance-benchmark/

Also, once you start sorting and limiting on the server rather than in the DB, you can start identifying shortcuts to speed things up further.
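
One hypothetical shortcut of that kind: remember the highest record_id already processed (here in a flat file), so each later run only scans the rows added since the previous one:

    <?php
    // Incremental scan: only look at rows newer than the last run.
    // "updates", the file name, and the connection details are placeholders.
    $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
    $lastSeen = (int) @file_get_contents('last_record_id.txt');

    $stmt = $pdo->prepare(
        'SELECT record_id, customer_id, unit_id, time_stamp
         FROM updates WHERE record_id > ?'
    );
    $stmt->execute([$lastSeen]);

    foreach ($stmt as $r) {
        $lastSeen = max($lastSeen, (int) $r['record_id']);
        // ... merge $r into the cached top-1 results here ...
    }

    file_put_contents('last_record_id.txt', (string) $lastSeen);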

That's what the server is for.

php mysql database-performance
