I opened trac tickets #931 and #933. Below is the mail I sent to William. I will need all the help I can get! Martin!?
Jaap

-------- Original Message --------
Subject: back to permanents
Date: Fri, 19 Oct 2007 20:03:24 +0200
From: Jaap Spies <[EMAIL PROTECTED]>
To: William Stein <[EMAIL PROTECTED]>

Hi William,

I would like to go back to my old love: permanents of rectangular
matrices. I still think SAGE is the only mathematical software with an
implementation of the permanent function for non-square matrices over
an arbitrary field! But it is frickin' slow, as you would say.

Calculating the permanent of a 13 x 17 matrix with a 'band' of 4 1's
over the main diagonal (the test function f is sketched below the mail).

Over ZZ:

> sage: time f(13,4)
> CPU times: user 3.98 s, sys: 0.07 s, total: 4.05 s
> Wall time: 4.08
> 1596800

Over QQ:

> sage: time f(13,4)
> CPU times: user 8.39 s, sys: 0.09 s, total: 8.48 s
> Wall time: 8.56
> 1596800

My all-C program with ints, based on GMP:

> [EMAIL PROTECTED] perm_gmp]$ time ./ds 13 4
> 1596800
>
> real 0m0.328s
> user 0m0.326s
> sys 0m0.003s
> [EMAIL PROTECTED] perm_gmp]$

In the reference manual it still says that the code is optimized only
for matrices over QQ :-)!

What we need is optimization for integer matrices, followed by more
optimization for (0,1) matrices, and eventually for (-1,0,1) matrices.
Those are the matrices that 'count' in applications.

1) A speed boost can be achieved by replacing 'my' pure Python function
_combinations, to be found in sage.structure.sequence, with a really
fast implementation in C/Cython (the kind of loop involved is sketched
below the mail).

2) For the (0,1) matrices I'm thinking of the following approach (a
Sage sketch of this setup also follows the mail). Let A = (a_{ij}) be
an m x n (m <= n) (0,1)-matrix. We define a matrix X = (x_{ij}) with
independent indeterminates x_{ij}, where x_{ij} = 0 iff a_{ij} = 0.
Now define the following system:

  sum_{i=1}^{m} x_{ij} <= 1  for j = 1, ..., n   (each column is used at most once;
                                                  for m = n this becomes an equality)
  sum_{j=1}^{n} x_{ij} = 1   for i = 1, ..., m   (each row is used exactly once)
  x_{ij}^2 = x_{ij}          for i = 1, ..., m and j = 1, ..., n

For the Groebner computation the column condition can be written as the
polynomial equations x_{ij} x_{kj} = 0 for i < k. It is easy to prove
that the number of solutions of this system is equal to the permanent
of A.

Based on a paper by Bernasconi et al., "Computing Groebner Bases in the
Boolean Setting with Applications to Counting" (1997) (which restricts
itself to square matrices and fewer than 255 polynomials), we can do
the following:

1) calculate a Groebner basis
2) compute the number of solutions (the permanent)

If this can be done fast, it would beat Ryser's algorithm (see the
article above).

This could be my program for the next weeks, including SD6. But I will
need help. My Groebner bases are rusty, and my knowledge of Singular is
rather minimal (I'm studying Decker and Lossen, Computing in Algebraic
Geometry: A Quick Start using SINGULAR).

What do you think?

Jaap
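
The test function f in the timings is not spelled out in the mail; here
is a minimal Sage sketch, assuming the "band of 4 1's over the main
diagonal" means the main diagonal plus the four entries to its right in
every row, so that f(13, 4) works on a 13 x 17 matrix:

    # Sketch only: the exact band construction behind f is an assumption.
    # Builds the m x (m+w) (0,1)-matrix with a[i,j] = 1 iff i <= j <= i+w
    # (0-based indices) and returns its permanent over the given ring.
    def f(m, w, ring=ZZ):
        A = matrix(ring, m, m + w,
                   lambda i, j: 1 if i <= j <= i + w else 0)
        return A.permanent()

    # usage, as in the timings above:
    #   sage: time f(13, 4)           # over ZZ
    #   sage: time f(13, 4, ring=QQ)  # over QQ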
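
On point 1): the actual interface of _combinations in
sage.structure.sequence is not shown here, so the following is only a
sketch of the kind of index-stepping loop that does the work; it is
exactly this loop that would profit from typed C integers in a Cython
rewrite.

    # Hypothetical stand-in, not the real _combinations interface:
    # yields all m-element subsets of range(n), as tuples of column
    # indices, in lexicographic order.
    def index_combinations(n, m):
        if m > n:
            return
        idx = list(range(m))
        while True:
            yield tuple(idx)
            # rightmost position that can still be advanced
            k = m - 1
            while k >= 0 and idx[k] == n - m + k:
                k -= 1
            if k < 0:
                return
            idx[k] += 1
            for t in range(k + 1, m):
                idx[t] = idx[t - 1] + 1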
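
On point 2): a minimal Sage sketch of the counting idea, under two
assumptions that are not in the mail: the "at most one per column"
condition is encoded by vanishing pairwise products x_{ij} x_{kj}, and
the number of solutions is read off as the vector space dimension of
the quotient ring (the ideal contains x^2 - x in every variable, so it
is zero-dimensional and radical, and that dimension equals the number
of common zeros). The Groebner basis work happens inside
vector_space_dimension(), via Singular.

    # Sketch: count the 0/1 solutions of the system in 2); for a
    # (0,1)-matrix A with nrows <= ncols this count is per(A).
    def permanent_by_counting(A):
        m, n = A.nrows(), A.ncols()
        # one indeterminate per nonzero entry of A
        pos = [(i, j) for i in range(m) for j in range(n) if A[i, j] != 0]
        R = PolynomialRing(QQ, ['x_%d_%d' % p for p in pos])
        x = dict(zip(pos, R.gens()))
        eqns = []
        # each row selects exactly one admitted entry
        for i in range(m):
            eqns.append(sum(x[i, j] for j in range(n) if (i, j) in x) - 1)
        # each column is selected at most once
        for j in range(n):
            col = [x[i, j] for i in range(m) if (i, j) in x]
            eqns.extend(col[a] * col[b]
                        for a in range(len(col)) for b in range(a + 1, len(col)))
        # 0/1 constraints (field equations)
        eqns.extend(v**2 - v for v in x.values())
        I = R.ideal(eqns)
        return I.vector_space_dimension()

    # sanity check: should agree with A.permanent() on small (0,1)-matrices,
    # e.g. A = matrix(ZZ, 2, 3, [1,1,0, 0,1,1]) gives 3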