Hi Mark,
Many thanks for that. I will check it tomorrow. It's late here, and I am afraid
that if I look into it now I will get no sleep. ;)
Regards,
Matthias
On 07.06.2011 at 23:46, Mark Talluto wrote:
Hi Matthias,
The code below is based on a locally saved text file that is vertical bar
delimited.
The code counts a particular value in each customer record. The true branch
of the if statement counts across the entire database; the else branch counts
only the customers that are currently being viewed.
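Since Mark's full handler isn't shown in this excerpt, a minimal sketch of counting a value in a bar-delimited file might look like the following. The variable names, the item position, and the value being counted are all assumptions for illustration:

```
-- hypothetical sketch: count records whose third item is "active"
-- tCustomerData holds the text of the locally saved, "|"-delimited file
set the itemDelimiter to "|"
put 0 into tCount
repeat for each line tRecord in tCustomerData
   if item 3 of tRecord is "active" then add 1 to tCount
end repeat
-- tCount now holds the number of matching customer records
```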
Hi Mark,
Thanks for your suggestion. That works so far, but maybe I can speed it up with
Mark Talluto's technique.
Regards,
Matthias
On 07.06.2011 at 15:27, Mark Schonewille wrote:
> Hi Matthias,
>
> Since the data is already in memory, there is no reason to process it in
> steps of 50. Als
Hi Mark,
Could you please explain how you did that? How do you chunk the data into
groups? That is not clear to me.
Regards,
Matthias
On 07.06.2011 at 20:46, Mark Talluto wrote:
> On Jun 7, 2011, at 6:27 AM, Mark Schonewille wrote:
>
>> Since the data is already in memory, there is no reason t
Hi Bob,
In my case, the data contains SQL update commands, one per line.
I want to open the DB connection, send 50 (maybe more if that works) SQL
commands one after another, and then close the DB connection.
Then I want to repeat that with the remaining lines.
I just want to avoid timeouts, if I se
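The batching Matthias describes could be sketched roughly as below. The database type, host, and credentials are placeholders, and the batch size of 50 is simply the value mentioned above; `revOpenDatabase`, `revExecuteSQL`, and `revCloseDatabase` are the standard Revolution database calls:

```
-- sketch: execute the update commands in batches of 50, so each
-- connection stays open only briefly (helps avoid server timeouts)
command runUpdatesInBatches pData
   put empty into tBatch
   put 0 into tCounter
   repeat for each line tCommand in pData
      put tCommand & return after tBatch
      add 1 to tCounter
      if tCounter mod 50 = 0 then
         sendBatch tBatch
         put empty into tBatch
      end if
   end repeat
   -- flush any leftover commands (fewer than 50)
   if tBatch is not empty then sendBatch tBatch
end runUpdatesInBatches

command sendBatch pCommands
   -- connection parameters are placeholders for illustration
   put revOpenDatabase("mysql", "myHost", "myDB", "myUser", "myPass") into tConnID
   repeat for each line tSQL in pCommands
      revExecuteSQL tConnID, tSQL
   end repeat
   revCloseDatabase tConnID
end sendBatch
```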
On Jun 7, 2011, at 6:27 AM, Mark Schonewille wrote:
> Since the data is already in memory, there is no reason to process it in
> steps of 50. Also, using repeat with x = ... is very slow. Use repeat for each
> with a counter instead:
I believe there is a major speed benefit to chunking the data.
If you are not planning to change anything in the data itself, then you can use
the repeat for each line theLine of theData form in your inner loop.
But I do not see the advantage of doing it in blocks of 50, unless there is
something about the data that requires it.
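The difference between the two loop forms Bob and Mark mention can be sketched like this (the processing step is left as a comment, since it depends on the task):

```
-- slow: "line x of theData" rescans the text from the top on every pass
repeat with x = 1 to the number of lines of theData
   put line x of theData into theLine
   -- process theLine here
end repeat

-- fast: "repeat for each" walks through the text once, keeping its place
repeat for each line theLine in theData
   -- process theLine here
end repeat
```

The first form is quadratic in the number of lines; the second is linear, which is why it is so much faster on large data.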
Bob
On Jun 7, 2011, at
Hi Matthias,
Since the data is already in memory, there is no reason to process it in steps
of 50. Also, using repeat with x = ... is very slow. Use repeat for each with a
counter instead:
put 0 into myCounter
repeat for each line myLine in DATA
   add 1 to myCounter
   -- do something with myLine
end repeat