Great! I couldn't get it to crash here either, but our machines all have a lot of RAM.

Bruce

On Wed, 7 Jun 2006, Lars M. Rimol wrote:

Hi,
I just ran make_average_subjects with increased swap (from 2 GB to 8 GB) on 178
subjects and now it worked, which would seem to indicate that it was a memory
issue. It did not crash with 3, 10 or 113 subjects on 2 GB of swap.

yours,
LMR

Hmmm, I've run it on >100 subjects without trouble. I'll try now and see if
I can replicate locally. Is there anything different about these volumes?
Are they all 256^3 and unsigned char?
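
For what it's worth, one quick way to check that across all subjects is a short
script along these lines (a rough sketch, assuming nibabel is available; the
subjects.txt list file is a placeholder):

# Sketch: flag any subject whose orig.mgz is not 256^3 unsigned char.
# Assumes nibabel is installed; adjust SUBJECTS_DIR and the subject list.
import os
import nibabel as nib

subjects_dir = os.environ["SUBJECTS_DIR"]
with open("subjects.txt") as f:          # placeholder: one subject ID per line
    subjects = [line.strip() for line in f if line.strip()]

for subj in subjects:
    path = os.path.join(subjects_dir, subj, "mri", "orig.mgz")
    img = nib.load(path)
    if img.shape != (256, 256, 256) or img.get_data_dtype() != "uint8":
        print(subj, img.shape, img.get_data_dtype())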

cheers,
Bruce

Could be a memory leak. Can you send us all the text output? We need to know
specifically which program it is dying on.
doug


Hi,
I am analysing a dataset with 180 subjects. When I run
make_average_subjects it crashes; the error message is usually something like
this:

mghRead: encountered error executing: 'zcat /space/monkeys/1/home/lmr/subjects/VETSA_test/testAVG/tmp/make_average_vol-tmp/orig-19963b.mgz', frame -1
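
Since mghRead is failing while executing zcat on that temporary volume, the
failure could either be a truncated/corrupt temporary file or the zcat child
process failing to start (for example under memory pressure). If the tmp
directory is still around, one rough way to rule out corruption is to stream
each file through Python's gzip module (a sketch; the directory is the path
from the error above):

# Sketch: confirm each temporary volume still decompresses cleanly.
import glob
import gzip

tmp_dir = "/space/monkeys/1/home/lmr/subjects/VETSA_test/testAVG/tmp/make_average_vol-tmp"
for path in sorted(glob.glob(tmp_dir + "/orig-*.mgz")):
    try:
        with gzip.open(path, "rb") as f:
            while f.read(1 << 20):       # read in 1 MB chunks until EOF
                pass
    except (OSError, EOFError) as exc:
        print(path, "failed to decompress:", exc)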

I have tried running it with fewer subjects and it worked with 3, 10 and
73 subjects. Also, it crashes at different points in the processing
stream: once it crashed at subject # 130, which I removed from the list
of subjects, and the next time it crashed at # 30. My first guess was
that it is a memory problem. I tried running the top command in a separate
terminal while running make_average_subject with 2 and 10
subjects, and it seemed to use up to 400 MB of virtual memory and 10%
of physical memory, but I couldn't see any difference in memory demands between
running 2 and 10 subjects. This was on a machine with 4 GB of physical memory
and 2 GB of swap.
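
As a slightly more systematic alternative to eyeballing top, the memory use of
the relevant processes and the swap level can be logged over time with
something like the following (a rough sketch, assuming psutil is available; the
process-name filter is a guess at which binaries matter):

# Sketch: once a second, log RSS of mri_*/make_average processes and swap use.
import time
import psutil

while True:
    swap = psutil.swap_memory()
    for p in psutil.process_iter(["name", "memory_info"]):
        name = p.info["name"] or ""
        if "mri_" in name or "make_average" in name:
            rss_mb = p.info["memory_info"].rss / 1e6
            print("%s %s rss=%.0fMB swap_used=%.1fGB"
                  % (time.strftime("%H:%M:%S"), name, rss_mb, swap.used / 1e9))
    time.sleep(1)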

Is this a known issue? (I couldn't find anything in the mailing list
archive.)

Thank you!
