Hi all, I've got to do some computations involving a really big matrix and need to save each row in a different file (the matrix is too big to fit in memory). Since every row can be computed independently, I have tried to parallelize the computation. First, this is my original computation of the matrix (for a group G):
def computeKillingbyrows(G):
    print("Studying "+str(G))
    print("computing normalizers")
    normalizer_sizes = dict([(g, G.normalizer(g).order()) for g in G])
    l = sorted(normalizer_sizes)
    n = len(l)
    print("computing Killing form")
    for x in range(n):
        row = [normalizer_sizes[l[x]*l[y]] for y in xrange(n)]
        filename = str(G)+"_row_"+str(x)
        save(row, filename)
        print("Row "+str(x)+" saved!")

One can easily check that this works for a small group (say A4). Now, this is what I try as a parallel version (compute and save each row independently):

cores = 2  # just for testing, will adjust later

def computeKillingbyrows(G):
    print("Studying "+str(G))
    print("computing normalizers")
    normalizer_sizes = dict([(g, G.normalizer(g).order()) for g in G])
    l = sorted(normalizer_sizes)
    n = len(l)

    @parallel(cores)
    def compute_row(x):
        row = [normalizer_sizes[l[x]*l[y]] for y in xrange(n)]
        filename = str(G)+"_row_"+str(x)
        save(row, filename)
        print("Row "+str(x)+" saved!")

    print("computing Killing form")
    compute_row(range(n))

The latter gets as far as the "computing Killing form" print, then exits without any error and without saving any files.

Is it possible to do the above with @parallel? I have tried looking for documentation on that decorator, but the examples in the source don't try to do anything similar.

Cheers,
J
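P.S. One guess I had while writing this up (I haven't been able to confirm it from the source): maybe compute_row(range(n)) returns a lazy iterator of ((args, kwargs), result) pairs and nothing runs until I consume it. If that is how @parallel behaves, something like the following sketch is what I had in mind, with the rows still being saved as a side effect inside compute_row:

    print("computing Killing form")
    # consuming the iterator is what (I think) actually triggers the parallel calls
    for (args, kwds), _ in compute_row(range(n)):
        pass

If that is the intended usage, a confirmation would be great.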