On 2010-10-18 05:42, Jason Feng wrote:

I am using Perl and MySQL to maintain a database of mobile network
configuration: about 30 tables and millions of rows. Every day I
import new configuration data into the database.

I’d like to create a delta report showing which rows and columns were
modified, which rows were deleted, and which rows were added. Comparing
today's data against yesterday's row by row and column by column would
be time-consuming.

Can anyone give me some good suggestions on this? Thanks in advance!
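For small tables, the naive delta can at least be stated precisely: load each snapshot keyed by primary key and take set differences. A minimal sketch (table contents and column values here are invented for illustration; in practice the dicts would be filled from SELECTs against the two copies of the table):

```python
# Hypothetical snapshots of one table, keyed by primary key.
yesterday = {
    1: ("cell-A", "lac-10"),
    2: ("cell-B", "lac-11"),
    3: ("cell-C", "lac-12"),
}
today = {
    1: ("cell-A", "lac-10"),   # unchanged
    2: ("cell-B", "lac-99"),   # modified
    4: ("cell-D", "lac-13"),   # added; PK 3 was deleted
}

# Dict key views support set operations in Python 3.
added    = sorted(today.keys() - yesterday.keys())
deleted  = sorted(yesterday.keys() - today.keys())
modified = sorted(pk for pk in today.keys() & yesterday.keys()
                  if today[pk] != yesterday[pk])

print(added, deleted, modified)  # [4] [3] [2]
```

This is exactly the approach the poster calls time-consuming at millions of rows, which is what motivates the chunked-checksum answer below it.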

If you still have yesterday's data as well, for example in a stopped slave, then you can use the Maatkit tools (written in Perl).

mk-table-checksum uses techniques like chunking/nibbling and BIT_XOR checksum aggregation to quickly find out in which regions of the data there were changes.

It helps to have an integer as the first column in the PK of each table.
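A minimal sketch of the chunked-checksum idea, with invented data (Maatkit pushes the BIT_XOR(CRC32(...)) aggregation into SQL so the server does the scan; here it is done client-side purely to show why only mismatching chunks need a row-by-row diff):

```python
import zlib

def chunk_checksum(rows, lo, hi):
    """XOR of CRC32 over rows whose integer PK falls in [lo, hi).

    XOR is order-independent, which is why the aggregation can be
    computed in whatever order the server scans the chunk.
    """
    acc = 0
    for pk, row in rows.items():
        if lo <= pk < hi:
            acc ^= zlib.crc32(repr((pk, row)).encode())
    return acc

# Hypothetical snapshots keyed by integer PK (values invented).
yesterday = {pk: ("cfg-%d" % pk,) for pk in range(1, 1001)}
today = dict(yesterday)
today[437] = ("cfg-changed",)          # one modified row

# Compare per-chunk checksums; only mismatching chunks need a
# detailed row-by-row comparison afterwards.
CHUNK = 100
dirty = [lo for lo in range(1, 1001, CHUNK)
         if chunk_checksum(yesterday, lo, lo + CHUNK)
            != chunk_checksum(today, lo, lo + CHUNK)]
print(dirty)  # only the chunk starting at 401, which contains PK 437
```

The integer-PK advice above is what makes the `[lo, hi)` chunk ranges cheap to express as indexed range scans.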

The alternative is what Jeff suggests: analyze the binlog.
Also pretty straightforward.
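As a rough sketch of the binlog route: after decoding the log with `mysqlbinlog`, the resulting SQL text can be scanned to classify changes per table. The statements below are invented, and this assumes statement-based logging (row-based logs need `mysqlbinlog --verbose` and a different parse):

```python
import re

# A few lines as they might look after decoding a binlog with
# mysqlbinlog (statement-based logging; contents invented).
decoded = """\
INSERT INTO cells (id, name) VALUES (4, 'cell-D')
UPDATE cells SET lac = 'lac-99' WHERE id = 2
DELETE FROM cells WHERE id = 3
UPDATE neighbours SET weight = 5 WHERE id = 7
"""

# Tally (table, operation) pairs from the decoded statements.
counts = {}
for line in decoded.splitlines():
    m = re.match(r"(INSERT INTO|UPDATE|DELETE FROM)\s+(\w+)", line)
    if m:
        verb, table = m.group(1).split()[0], m.group(2)
        counts[(table, verb)] = counts.get((table, verb), 0) + 1

print(counts)
```

A real report would also pull out the WHERE clause or row image to say *which* row changed, but the tally already gives the added/modified/deleted breakdown per table.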

--
Ruud

--
To unsubscribe, e-mail: beginners-unsubscr...@perl.org
For additional commands, e-mail: beginners-h...@perl.org
http://learn.perl.org/

