Hi all,

I am looking for some tools to manipulate large
database dumps.

Currently, I am trying to manipulate the Wikipedia
databases, which contain a huge number of records.
Wikipedia provides the entire (huge) MySQL dump at
[1], but I don't need all the fields in the various
tables.
E.g. the categorylinks table consists of (int, varchar,
varchar, timestamp) - I am only interested in the
first two columns.

I am writing a script to remove unnecessary fields from
the dumps, but I wonder if there are any free software
"data loaders" out there.
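For reference, the kind of trimming I have in mind looks
roughly like this - a quick Python sketch that splits one
value tuple from an INSERT statement and keeps the first
two fields (it handles quoted strings and backslash
escapes, but I haven't tested it against every corner of
the real dumps):

```python
def split_fields(tup):
    """Split a MySQL value tuple like "(1,'a,b','c')" into raw
    fields, respecting single-quoted strings with \\ escapes."""
    s = tup.strip()
    assert s.startswith('(') and s.endswith(')')
    s = s[1:-1]
    fields, buf, in_str, i = [], [], False, 0
    while i < len(s):
        c = s[i]
        if in_str:
            buf.append(c)
            if c == '\\':
                # escaped character inside a string: copy it verbatim
                i += 1
                if i < len(s):
                    buf.append(s[i])
            elif c == "'":
                in_str = False
        elif c == "'":
            in_str = True
            buf.append(c)
        elif c == ',':
            # field separator outside any quoted string
            fields.append(''.join(buf))
            buf = []
        else:
            buf.append(c)
        i += 1
    fields.append(''.join(buf))
    return fields

def trim_tuple(tup, n=2):
    """Keep only the first n fields of one value tuple."""
    return '(' + ','.join(split_fields(tup)[:n]) + ')'
```

So trim_tuple("(12,'Category,name','sortkey','20070101000000')")
would give "(12,'Category,name')". The fiddly part is splitting
the long INSERT lines into individual tuples in the first place,
which is why a ready-made tool would be nicer.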

Any suggestions / comments welcome.

Regards,
Devendra.

 [1]
http://en.wikipedia.org/wiki/Wikipedia:Database_download


--
______________________________________________________________________
Pune GNU/Linux Users Group Mailing List:      ([email protected])
List Information:  http://plug.org.in/cgi-bin/mailman/listinfo/plug-mail
Send 'help' to [EMAIL PROTECTED] for mailing instructions.