I am analysing a dataset with 180 subjects. When I run make_average_subject it crashes, usually with an error message like this:
mghRead: encountered error executing: 'zcat /space/monkeys/1/home/lmr/subjects/VETSA_test/testAVG/tmp/make_average_vol-tmp/orig-19963b.mgz',frame -1
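Would it make sense to re-run the failing zcat step by hand after a crash, something like the sketch below? (This assumes the temporary file survives the crash; otherwise I suppose the same check could be run on each subject's orig.mgz.)

    # Decompress the file that mghRead choked on and count the bytes;
    # a nonzero exit status or an unexpectedly small count would suggest
    # a truncated or corrupt file.
    zcat /space/monkeys/1/home/lmr/subjects/VETSA_test/testAVG/tmp/make_average_vol-tmp/orig-19963b.mgz | wc -c
    echo "zcat exit status: ${PIPESTATUS[0]}"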
I have tried running it with fewer subjects, and it worked with 3, 10 and 73 subjects. It also crashes at different points in the processing stream: once it crashed at subject #130, which I then removed from the list of subjects, and the next run crashed at #30.

My first guess was that it is a memory problem. I ran the top command in a separate terminal while make_average_subject was processing 2 and then 10 subjects; it seemed to require up to 400m of virtual memory and about 10% of physical memory, but I couldn't see any difference in memory demands between the 2-subject and 10-subject runs. This was on a machine with 4G of physical memory and 2G of swap.
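If it would help, I could also log memory use over time rather than spot-checking with top, in case a short-lived spike is being missed. A minimal sketch (the one-second interval and the log file name are arbitrary choices on my part):

    # Sample overall memory use once per second while make_average_subject
    # runs, appending a timestamped line to mem.log for later inspection.
    while true; do
        echo "$(date +%T) $(free -m | awk '/^Mem:/{print $3 " MB used"}')" >> mem.log
        sleep 1
    done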
Is this a known issue? (I couldn't find anything in the mailing list archives.)
Thank you!
--
yours,
LMR