Re: max_rows query + SegFaulting at inopportune times

2007-03-22 Thread Brent Baisley
think it issues a segfault. - Original Message - From: "Micah Stevens" <[EMAIL PROTECTED]> To: "JP Hindin" <[EMAIL PROTECTED]> Sent: Thursday, March 22, 2007 5:24 PM Subject: Re: max_rows query + SegFaulting at inopportune times Oh, I didn'

Re: max_rows query + SegFaulting at inopportune times

2007-03-22 Thread Micah Stevens
Oh, I didn't see the first comment. My mistake. It's likely a 32-bit integer size limit of some sort then: 32-bit = 4 GB. -Micah On 03/22/2007 02:08 PM, JP Hindin wrote: Micah; In the first eMail I mentioned that I had excluded filesystem size limits by manually producing a 14GB tar file. If
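Micah's "32-bit = 4 GB" point matches MyISAM's default 4-byte data-file pointer (2^32 bytes per table). A hedged sketch of how one could inspect and widen it on a new enough server (the `myisam_data_pointer_size` variable exists from MySQL 4.1.2; the value shown is illustrative):

```sql
-- Default pointer width of 4 bytes caps each MyISAM data file at 2^32 = 4 GiB.
SHOW VARIABLES LIKE 'myisam_data_pointer_size';

-- Widening the pointer raises the ceiling for tables created afterwards;
-- existing tables still need an ALTER TABLE to be rebuilt with the new width.
SET GLOBAL myisam_data_pointer_size = 6;  -- 2^48 bytes = 256 TiB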

Re: max_rows query + SegFaulting at inopportune times

2007-03-22 Thread JP Hindin
Micah; In the first eMail I mentioned that I had excluded filesystem size limits by manually producing a 14GB tar file. If it was only that simple :) On Thu, 22 Mar 2007, Micah Stevens wrote: > This table size is based on your filesystem limits. This is a limit of > the OS, not MySQL. > > -Micah

Re: max_rows query + SegFaulting at inopportune times

2007-03-22 Thread Micah Stevens
This table size is based on your filesystem limits. This is a limit of the OS, not MySQL. -Micah On 03/22/2007 01:02 PM, JP Hindin wrote: Addendum; On Thu, 22 Mar 2007, JP Hindin wrote: Zero improvement. I used the following CREATE: MAX_ROWS=10; At first I thought I

Re: max_rows query + SegFaulting at inopportune times

2007-03-22 Thread JP Hindin
I have, after further googling, discovered that the 4.2 billion figure that MySQL uses as 'max_rows' is, indeed, max_rows and not a max database size in bytes. In theory I have solved my problem, and wasted however many people's bandwidth by putting all these eMails to the MySQL list. Mea culpa, m

Re: max_rows query + SegFaulting at inopportune times

2007-03-22 Thread JP Hindin
Addendum; On Thu, 22 Mar 2007, JP Hindin wrote: > Zero improvement. I used the following CREATE: > MAX_ROWS=10; At first I thought I had spotted the obvious in the above - the MAX_ROWS I used is smaller than the Max_data_length that resulted, presumably MySQL being smarter than I a
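The Max_data_length JP compares against can be read straight from the server; a sketch, where the table name `history` is hypothetical (JP's real table name is not given in the thread):

```sql
-- The Max_data_length column shows the effective per-table byte ceiling
-- MySQL derived from the MAX_ROWS / AVG_ROW_LENGTH hints at CREATE time.
SHOW TABLE STATUS LIKE 'history';
```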

Re: max_rows query + SegFaulting at inopportune times

2007-03-22 Thread JP Hindin
"Michael Dykman" <[EMAIL PROTECTED]> > Cc: "JP Hindin" <[EMAIL PROTECTED]>; > Sent: Thursday, March 15, 2007 2:09 PM > Subject: Re: max_rows query + SegFaulting at inopportune times > > > > > > On Thu, 15 Mar 2007, Michael Dykman wrote: >

Re: max_rows query + SegFaulting at inopportune times

2007-03-15 Thread Brent Baisley
[EMAIL PROTECTED]> To: "Michael Dykman" <[EMAIL PROTECTED]> Cc: "JP Hindin" <[EMAIL PROTECTED]>; Sent: Thursday, March 15, 2007 2:09 PM Subject: Re: max_rows query + SegFaulting at inopportune times On Thu, 15 Mar 2007, Michael Dykman wrote: What host OS are you

Re: max_rows query + SegFaulting at inopportune times

2007-03-15 Thread JP Hindin
On Thu, 15 Mar 2007, Michael Dykman wrote: > What host OS are you running? And which file system? MySQL is always > limited by the file size that the host file system can handle. "Debian Sarge" is a Linux distribution; the "large file support" I mentioned allows files up to 2 TB in size. > On 3/15/

Re: max_rows query + SegFaulting at inopportune times

2007-03-15 Thread Michael Dykman
What host OS are you running? And which file system? MySQL is always limited by the file size that the host file system can handle. - michael dykman On 3/15/07, JP Hindin <[EMAIL PROTECTED]> wrote: Greetings all; I have a quandary regarding table limits, and clearly I am not understanding ho

max_rows query + SegFaulting at inopportune times

2007-03-15 Thread JP Hindin
Greetings all; I have a quandary regarding table limits, and clearly I am not understanding how this all works together. I have a test database which needs to keep long-term historical data, currently the total dataset in this one table is probably about 5.5GB in size - although since I have a 4G
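For a MyISAM table bumping into the default 4 GB cap that this thread circles around, the usual fix in that era of MySQL was to raise the table's row/size hints so the server allocates a wider data pointer. A hedged sketch (the table name and the numbers are illustrative, not taken from JP's setup):

```sql
-- MAX_ROWS is a row-count hint, not a byte limit; together with
-- AVG_ROW_LENGTH it lets MySQL size the data pointer, lifting the
-- 4 GB default ceiling when the table is rebuilt.
ALTER TABLE history
    MAX_ROWS = 1000000000
    AVG_ROW_LENGTH = 100;
```

Note that the ALTER rewrites the whole table, which on a multi-gigabyte MyISAM file can take a long time and needs free disk space for the copy.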