Re: InnoDB #sql files

2011-11-04 Thread Reindl Harald
Am 04.11.2011 19:12, Ian Rubado wrote:
> Hi there,
>
> I had the same issue as you posted about at the bottom of:
>
> http://bugs.mysql.com/bug.php?id=20867
>
> I was curious if you ever found a solution. I ended up converting tables to
> MyISAM and flushing my innodb files to resolve.
no,

Re: Deleting Records in Big tables

2011-11-04 Thread Andy Wallace
Excellent point... replication makes many things trickier

On 11/4/11 9:54 AM, Derek Downey wrote:
> Be careful deleting with limit. If you're replicating, you're not guaranteed the same order
> of those you've deleted. Perhaps a better way to delete in smaller chunks is to increase the id valu

Re: Deleting Records in Big tables

2011-11-04 Thread Derek Downey
Be careful deleting with limit. If you're replicating, you're not guaranteed the same order of those you've deleted. Perhaps a better way to delete in smaller chunks is to increase the id value:

DELETE FROM my_big_table WHERE id > 5000;
DELETE FROM my_big_table WHERE id > 4000;

etc

-- Derek

On
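A hedged sketch of Derek's range-based chunking, written against SQLite rather than MySQL so it runs standalone: because each DELETE is bounded by id *values* instead of LIMIT, every statement removes a deterministic set of rows, which is what makes it replication-safe. The table name, row counts, and band edges (5000, 4000, ...) are illustrative, not from the thread's actual data.

```python
import sqlite3

# Illustration of chunked deletes bounded by id value (not LIMIT):
# each statement deletes a deterministic set of rows on any server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_big_table (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO my_big_table (id, payload) VALUES (?, ?)",
                 [(i, "row") for i in range(1, 6001)])

# Work down toward the real threshold (2474) in bands, so each delete
# is a short transaction instead of one huge one.
for lower in (5000, 4000, 3000, 2474):
    conn.execute("DELETE FROM my_big_table WHERE id > ?", (lower,))
    conn.commit()

remaining = conn.execute("SELECT COUNT(*) FROM my_big_table").fetchone()[0]
print(remaining)  # → 2474 (rows with id <= 2474 survive)
```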

Re: Deleting Records in Big tables

2011-11-04 Thread Andy Wallace
I've had some luck in the past under similar restrictions deleting in chunks:

delete from my_big_table where id > 2474 limit 1000

But really, the best way is to buy some more disk space and use the new table method

On 11/4/11 1:44 AM, Adarsh Sharma wrote:
> Thanks Anand, Ananda Kumar wrote: W
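Andy's approach can be sketched as a loop that repeats the limited delete until nothing is left above the threshold. MySQL accepts DELETE ... LIMIT n directly; stock SQLite (used here so the sketch is runnable) does not, so it is emulated with a rowid subquery. Table name and chunk size are hypothetical.

```python
import sqlite3

# LIMIT-based chunked deletion, emulated for SQLite with a rowid
# subquery (MySQL would just use DELETE ... LIMIT ?).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_big_table (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO my_big_table (id) VALUES (?)",
                 [(i,) for i in range(1, 10001)])

CHUNK = 1000
while True:
    cur = conn.execute(
        "DELETE FROM my_big_table WHERE rowid IN "
        "(SELECT rowid FROM my_big_table WHERE id > 2474 LIMIT ?)",
        (CHUNK,))
    conn.commit()
    if cur.rowcount == 0:  # nothing left above the threshold
        break

left = conn.execute("SELECT COUNT(*) FROM my_big_table").fetchone()[0]
print(left)  # → 2474
```

Each iteration is its own short transaction, which keeps undo/lock pressure low; as Derek notes earlier in the thread, though, LIMIT without ORDER BY is not deterministic under statement-based replication.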

Re: Deleting Records in Big tables

2011-11-04 Thread Johan De Meersman
- Original Message -
> From: "Reindl Harald"
>
> well i guess you have to sit out add the key
> wrong table design having an id-column without a key or
> something weird in the application not using the primary
> key for such operations

For high-volume insert-only tables the lack of a ke

Re: Deleting Records in Big tables

2011-11-04 Thread Reindl Harald
PLEASE do not top-post after you got a reply at the bottom of your quote

sorry, but i can not help you with your application. if it for whatever reason uses the field 'id' in a where-statement and your table has no key on this column, your table-design is wrong and you have to add the key. yes this

Re: Deleting Records in Big tables

2011-11-04 Thread Ananda Kumar
Create PROCEDURE qrtz_purge()
BEGIN
declare l_id bigint(20);
declare NO_DATA INT DEFAULT 0;
DECLARE LST_CUR CURSOR FOR select id from table_name where id > 123;
DECLARE CONTINUE HANDLER FOR NOT FOUND SET NO_DATA = -1;
OPEN LST_CUR;
SET NO_DATA = 0;
FETCH LST_CUR INTO l_id;
WH
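The procedure above is cut off at its WHILE loop, so here is a hedged sketch of the same cursor-and-loop idea in Python over SQLite: walk the matching ids, delete them one at a time, and commit periodically so no single transaction grows huge. Table name, cutoff, and batch size follow the thread but are illustrative.

```python
import sqlite3

# Cursor-style purge: select the matching ids, then delete row by row
# with periodic commits (the role COMMIT would play inside the proc's
# WHILE loop). Batch size of 100 is an arbitrary illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table_name (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO table_name (id) VALUES (?)",
                 [(i,) for i in range(1, 1001)])

ids = [row[0] for row in
       conn.execute("SELECT id FROM table_name WHERE id > 123")]

for n, row_id in enumerate(ids, start=1):
    conn.execute("DELETE FROM table_name WHERE id = ?", (row_id,))
    if n % 100 == 0:  # commit every 100 deletes
        conn.commit()
conn.commit()

purged = conn.execute("SELECT COUNT(*) FROM table_name").fetchone()[0]
print(purged)  # → 123 rows remain (id <= 123)
```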

Re: Deleting Records in Big tables

2011-11-04 Thread Reindl Harald
Am 04.11.2011 08:22, Adarsh Sharma wrote:
> delete from metadata where id>2474;
> but it takes hours to complete.
>
> CREATE TABLE `metadata` (
> `meta_id` bigint(20) NOT NULL AUTO_INCREMENT,
> `id` bigint(20) DEFAULT NULL,
> `url` varchar(800) DEFAULT NULL,
> `meta_field` varchar(200) DEF
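Reindl's point elsewhere in the thread is that `id` here has no key, so the DELETE must scan the whole table. In MySQL the fix would be along the lines of `ALTER TABLE metadata ADD KEY (id);` (hypothetical, not quoted from the thread). A hedged SQLite sketch of the effect, using EXPLAIN QUERY PLAN to show the plan switching from a full scan to an index search:

```python
import sqlite3

# Show the query plan for the slow DELETE before and after indexing
# the filter column. Plans come from EXPLAIN QUERY PLAN, whose exact
# wording varies by SQLite version.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE metadata (
    meta_id INTEGER PRIMARY KEY,
    id INTEGER,
    url TEXT)""")

plan_before = conn.execute(
    "EXPLAIN QUERY PLAN DELETE FROM metadata WHERE id > 2474").fetchall()
conn.execute("CREATE INDEX idx_metadata_id ON metadata (id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN DELETE FROM metadata WHERE id > 2474").fetchall()

print(plan_before[-1][-1])  # full table SCAN
print(plan_after[-1][-1])   # SEARCH ... USING INDEX
```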

Re: Deleting Records in Big tables

2011-11-04 Thread Adarsh Sharma
Thanks Anand,

Ananda Kumar wrote:
> Why don't you create a new table where id < 2474, rename the original table to "_old" and the new table to actual table name.

I need to delete rows from 5 tables each > 50 GB, & I don't have sufficient space to store extra data. My application loads 2 GB dat

Re: Deleting Records in Big tables

2011-11-04 Thread Ananda Kumar
Why don't you create a new table where id < 2474, rename the original table to "_old" and the new table to the actual table name.

or

You need to write a stored proc to loop through rows and delete, which will be faster. Doing just a simple "delete" statement for deleting huge data will take ages. re
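The copy-and-rename approach can be sketched as follows, again against SQLite so it runs standalone (MySQL would finish with RENAME TABLE, which can swap both names in one atomic statement; SQLite needs two ALTER TABLE ... RENAME steps). Names and the 2474 cutoff follow the thread; the data is made up.

```python
import sqlite3

# Copy the rows worth keeping into a fresh table, then swap names.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metadata (id INTEGER PRIMARY KEY, url TEXT)")
conn.executemany("INSERT INTO metadata (id, url) VALUES (?, 'u')",
                 [(i,) for i in range(1, 5001)])

# 1. new table containing only the survivors
conn.execute("CREATE TABLE metadata_new (id INTEGER PRIMARY KEY, url TEXT)")
conn.execute("INSERT INTO metadata_new SELECT * FROM metadata WHERE id < 2474")
# 2. move the old table aside and promote the new one
conn.execute("ALTER TABLE metadata RENAME TO metadata_old")
conn.execute("ALTER TABLE metadata_new RENAME TO metadata")
conn.commit()

kept = conn.execute("SELECT COUNT(*) FROM metadata").fetchone()[0]
print(kept)  # → 2473 (ids 1..2473)
```

Note the trade-off Adarsh raises in his reply: this needs enough free disk to hold the kept rows twice until `metadata_old` is dropped.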

Deleting Records in Big tables

2011-11-04 Thread Adarsh Sharma
Dear all,

Today I need to delete some records in > 70 GB tables. I have 4 tables in a mysql database. My delete command is:

delete from metadata where id>2474;

but it takes hours to complete. One of my table structures is as follows:

CREATE TABLE `metadata` (
`meta_id` bigint(20) NOT NULL AUTO_IN