You need to post working code that demonstrates your problem, along with the 
error message, if you want to get help. There is no built-in limitation on 
list length, but you might exceed your computer's RAM or a per-process limit. 
Check "ulimit -a".
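As a rough sketch (plain Python 2, as shipped with Sage 6.1; the names
show_limits, in_chunks and chunk_size are just illustrative, and dostuff below
is a stand-in for your real calculation), you can read the same limits that
"ulimit -a" reports from inside the script, and feed the input in fixed-size
chunks so the full result never has to sit in one list:

import resource

def show_limits():
    # soft/hard limits for this process; same numbers "ulimit -a" reports
    for name, rlim in [("address space", resource.RLIMIT_AS),
                       ("stack size", resource.RLIMIT_STACK),
                       ("data segment", resource.RLIMIT_DATA)]:
        soft, hard = resource.getrlimit(rlim)
        print "%-14s soft=%s hard=%s" % (name, soft, hard)

def dostuff(X):
    # stand-in for the real calculation
    return [X]

def in_chunks(records, chunk_size=100000):
    # run dostuff() on slices so only one chunk is in memory at a time
    for start in range(0, len(records), chunk_size):
        yield dostuff(records[start:start + chunk_size])

show_limits()
for partial in in_chunks(range(1, 3500000)):
    pass  # write each partial result to disk instead of keeping them all

If the limits all say "unlimited" and a self-contained script like that still
segfaults on the 3.5M+ inputs, then posting that exact script together with
the error output is what will let someone here reproduce the problem.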



On Tuesday, February 4, 2014 5:07:58 PM UTC, Jeroen wrote:
>
> Hi,
>
> I'm using Sage scripts to run tasks in batch. They look like:
>
> def dostuff(X):
>         result = [X] #plus irrelevant calculations
>         return result
> print dostuff([1,2,3,4,n])
>
> This worked fine for all recent data sets, with input array sizes of 1.0 - 
> 2.4M records. However, Sage crashes with segmentation faults when larger 
> data sets (3.5M - 10.0M) are used. Is the way I'm using Sage terribly wrong, 
> or have I reached some hard boundary that makes the crash expected? Or does 
> this seem to be a bug? Thanks for your help.
>
>
> Cheers,
>
> Jeroen
>
> Ubuntu 12.04 LTS x64 on Intel Xeon E3 with 24GB RAM
> Sage Version 6.1, Release Date: 2014-01-30 (GIT)
>
>
