but is it me?

Mac OS 10.12.6, LC 8.1.10

I have a 4-card stack which, when loaded, takes LC's memory to 66 MB.
I then load a 25 MB plain text file into a scrolling field (which takes maybe
20 seconds and shows LC as "not responding" in Activity Monitor). This takes
memory to 282 MB.
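(By "load" I mean nothing more elaborate than reading the file straight into
the field, roughly like this, where pFilePath just stands in for the real path)

   on importTextFile pFilePath
      -- read the whole file in one go and drop it into the scrolling field
      put URL ("file:" & pFilePath) into field "import"
   end importTextFile
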
I then run the following script, where the field "coerckey" is a list of the
21 keywords I need to find in the imported text.

   put empty into field "foundlines"
   put empty into ttemp
   put empty into field "totaliser"
   put 1 into ttcount
   repeat for each line i in field "coerckey"
      put i into field "currentkey"
      put field "import" into ttemp
      filter lines of ttemp with regex pattern i
      put ttemp & return before field "foundlines"
      put "•••  " & i & "   •••" & return before field "foundlines"
      filter field "foundlines" without empty
      put the number of lines of field "foundlines" - ttcount into field "coerfreq"
      put ttcount + 1 into ttcount
   end repeat

It works fine, and at a reasonable lick, but memory balloons to 2.46 GB and is
never released, even after the handler is done. I have several other similar
scripts, and unsurprisingly LC chokes after running two or three of them.
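
One thing I am tempted to try, to narrow it down, is a version of the loop
that stays in variables throughout and only touches the fields once at the
end. If that behaves itself, the culprit is presumably the repeated field
access rather than filter itself. Roughly (tImport, tFound and tTemp are just
illustrative names):

   local tImport, tFound, tTemp
   put field "import" into tImport   -- read the 25 MB field once
   repeat for each line tKey in field "coerckey"
      put tImport into tTemp
      filter lines of tTemp with regex pattern tKey
      -- per-keyword frequency is simply "the number of lines of tTemp" here
      put "•••  " & tKey & "   •••" & return & tTemp & return before tFound
   end repeat
   filter tFound without empty
   put tFound into field "foundlines"   -- update the UI once, at the end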

I read somewhere that there was a memory leak in a previous version related to 
the filter command, although I thought that was something to do with Unicode.

So is what I see down to me, or LC?

Best wishes,

David Glasgow

