Eric Bollengier wrote:
>
> It's the database's job... I've never seen a database (mysql, postgres or
> oracle) say something like "Sorry, I'm out of memory". A database takes
> the memory that you give it, never more.
>
Talking about Oracle - I can tell you those errors can happen (generally,
not speaking...
I don't think ORA-04030 is a good example ...
It means that the Oracle process has tried to allocate memory as asked by the
DBA and couldn't, because either the server has no more memory, the process
has hit an administrative (OS) limit, or the OS has done an optimistic memory
allocation...
Does this cause a database/system crash, or just a transaction rollback?
...
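On the crash-vs-rollback question: from the client side, a failed
allocation during a statement normally just surfaces as an error on that
statement, and the application can roll back and carry on. A minimal libpq
sketch, as an illustration only - the connection string is a placeholder
and this is not Bacula's actual catalog code:

#include <stdio.h>
#include <libpq-fe.h>

int main(void)
{
   /* placeholder connection string, for illustration only */
   PGconn *conn = PQconnectdb("dbname=bacula");
   if (PQstatus(conn) != CONNECTION_OK) {
      fprintf(stderr, "connect: %s", PQerrorMessage(conn));
      return 1;
   }

   PQclear(PQexec(conn, "BEGIN"));

   /* the unbounded variant of the query discussed in this thread */
   PGresult *res = PQexec(conn,
      "SELECT JobMedia.JobMediaId,Job.JobId FROM JobMedia "
      "LEFT OUTER JOIN Job ON (JobMedia.JobId=Job.JobId) "
      "WHERE Job.JobId IS NULL");

   if (PQresultStatus(res) != PGRES_TUPLES_OK) {
      /* a failed allocation shows up as a failed statement, e.g.
       * "out of memory"; the session itself normally survives */
      fprintf(stderr, "query: %s", PQerrorMessage(conn));
      PQclear(res);
      PQclear(PQexec(conn, "ROLLBACK"));
   } else {
      printf("%d orphaned JobMedia rows\n", PQntuples(res));
      PQclear(res);
      PQclear(PQexec(conn, "COMMIT"));
   }

   PQfinish(conn);
   return 0;
}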
On Friday 23 March 2007 at 17:03, Oliver Lehmann wrote:
> Eric Bollengier wrote:
> > It's the database's job... I've never seen a database (mysql, postgres or
> > oracle) say something like "Sorry, I'm out of memory". A database takes
> > the memory that you give it, never more.
Hi,
> 1. With the batch insert code turned on there are a number of regression
> tests that fail. They must all pass without errors prior to production
> release. Responsible: Eric
> Deadline: Roughly the end of March
I'm working on it.
> 2. I am very concerned that the new batch insert code will und...
>
> This is exactly what I was seeing with dbcheck.
>
> Why have a dog and then do all the barking yourself?
>
> In this case the dog is the SQL database and the barking is the needless
> extraction and [counting|deleting] of individual NULL JobIds.
>
>
> The comments about SQL crashes are because I...
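To make the dog-and-barking point concrete: instead of selecting the
orphaned JobMediaIds and then deleting them one by one, the whole cleanup
can be pushed into a single set-based statement. A sketch in the same
string-literal style as the queries in this thread (not the actual dbcheck
code):

"DELETE FROM JobMedia "
"WHERE NOT EXISTS (SELECT 1 FROM Job WHERE Job.JobId=JobMedia.JobId)";

One round trip, and the server does all the barking.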
On Wed, 21 Mar 2007, Marc Cousin wrote:
> I think I haven't explained the memory issue correctly:
I realise it's an issue for large selects, but in the case given:
>
> The example Kern gave is:
>
> "SELECT JobMedia.JobMediaId,Job.JobId FROM JobMedia "
>"LEFT OUTER JOIN Job ON (
I think I haven't explained the memory issue correctly:
The example Kern gave is:
"SELECT JobMedia.JobMediaId,Job.JobId FROM JobMedia "
"LEFT OUTER JOIN Job ON (JobMedia.JobId=Job.JobId) "
"WHERE Job.JobId IS NULL LIMIT 30";
and it only fails if I remove the LIMIT.
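For what it's worth, with the MySQL C API that client-side blow-up usually
comes from mysql_store_result(), which buffers the entire result set in the
client; mysql_use_result() streams rows from the server one at a time
instead. A minimal sketch, with placeholder credentials, not Bacula's real
catalog code:

#include <stdio.h>
#include <mysql/mysql.h>

int main(void)
{
   MYSQL *db = mysql_init(NULL);
   /* placeholder credentials, for illustration only */
   if (!mysql_real_connect(db, "localhost", "bacula", "pw", "bacula",
                           0, NULL, 0))
      return 1;

   if (mysql_query(db,
         "SELECT JobMedia.JobMediaId,Job.JobId FROM JobMedia "
         "LEFT OUTER JOIN Job ON (JobMedia.JobId=Job.JobId) "
         "WHERE Job.JobId IS NULL"))
      return 1;

   /* mysql_use_result() fetches one row at a time instead of
    * buffering the whole result set like mysql_store_result() */
   MYSQL_RES *res = mysql_use_result(db);
   MYSQL_ROW row;
   while ((row = mysql_fetch_row(res)) != NULL)
      printf("orphan JobMediaId=%s\n", row[0]);

   mysql_free_result(res);
   mysql_close(db);
   return 0;
}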
On Wed, 21 Mar 2007, David Boyes wrote:
>> - First, have a default limit on the number of records that will be
>> inserted in any one batch request. This should guarantee that an
>> out-of-memory problem will not normally occur.
>
> Can we calculate this based on available memory at execution time?
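One rough way to do that at run time, as a sketch only - it assumes
Linux/glibc, and AVG_RECORD_BYTES, the 25% share and the ceiling are
made-up numbers:

#include <unistd.h>

#define AVG_RECORD_BYTES 512   /* assumed cost per queued record */
#define MAX_BATCH 500000       /* hard ceiling on one batch */

/* Hypothetical helper: cap the batch size by free physical memory. */
static long batch_limit(void)
{
   long pages = sysconf(_SC_AVPHYS_PAGES);   /* free physical pages */
   long psize = sysconf(_SC_PAGESIZE);
   if (pages <= 0 || psize <= 0)
      return 10000;                          /* fallback default */
   long long budget = (long long)pages * psize / 4;  /* use ~25% */
   long limit = (long)(budget / AVG_RECORD_BYTES);
   return limit > MAX_BATCH ? MAX_BATCH : limit;
}

Though, as noted above, with optimistic allocation the number the OS
reports is a hint rather than a guarantee.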
> 1. With the batch insert code turned on there are a number of regression
> tests that fail. They must all pass without errors prior to production
> release.
> Responsible: Eric
> Deadline: Roughly the end of March
Makes sense.
> - First, have a default limit on the number of records that will be
> inserted in any one batch request.