Don't feel like you need to spend too much time defending Sage.  It
speaks for itself.  It's free, open source, and improving daily.  And,
even though I greatly admire "a top research mathematician who happens
to write assembly code to find Hecke operators... now who might that
be," I don't think he has managed a programming project the size and
scope of Sage.  So, although it is nice to get the celebrity
endorsement, I think most people will pick up Sage over Magma simply
because it's free.  In five years, you may find that there are more
person-hours being spent developing Sage than Magma... in the meantime,
I think having well-documented, low-bug code is more important
than beating Magma on every benchmark.

--jason

On 3/19/07, William Stein <[EMAIL PROTECTED]> wrote:
>
> On 3/19/07, a top research mathematician wrote:
> > xxxx put me off sage a bit: he tried it a year or so ago and
> > after a while he realised that he was actually just coding in python
> > rather than sage, and that magma seemed to be much quicker. Maybe this
> > has changed now though. How can sage overtake magma if huge chunks of
> > magma are written in assembly or whatever they do in the core?
>
> You seem to be confused about the relationship between the architectures
> of SAGE and Magma.  The architectures of SAGE and Magma are
> very similar, at least with regard to what you wrote above.
> Both are a combination of a high-level interpreter and
> lower-level compiled code.  The Python interpreter is in some ways
> slightly slower than Magma's interpreter, and in other ways faster.
> Python is a standard mainstream programming language with full support
> for user-defined types, object-oriented programming, multiple inheritance,
> etc.; it is used by millions of people around the world daily for a
> wide range of applications, and is maintained by a large group of
> people.  Magma is not mainstream, does not support user-defined types,
> object-oriented programming, or multiple inheritance, and is used by at
> most a few thousand people daily, and only
> for mathematics.
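>
> To make "user-defined types with multiple inheritance" concrete, here is a
> minimal Python sketch (a hypothetical illustration, not code from the SAGE
> library; the class names are made up):
>
>     class Ring(object):
>         """A user-defined type modelling a ring that knows its zero."""
>         def zero(self):
>             return 0
>
>     class Ordered(object):
>         """A mixin adding comparison-based helpers."""
>         def is_positive(self, x):
>             return x > self.zero()
>
>     class OrderedRing(Ring, Ordered):
>         """Multiple inheritance: one type combining both behaviours."""
>         pass
>
>     R = OrderedRing()
>     print(R.is_positive(5))    # True
>     print(R.is_positive(-3))   # False
>
> This is the kind of language feature that, as noted above, Magma's user
> language does not provide.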
>
> There is also a SAGE compiler (called SageX), which turns most SAGE code
> into native C code, which is then compiled with a C compiler.  About a
> third of the SAGE library is compiled in this way.  But it is also easily
> used by end users, even from the graphical user interface.  Here's a quote
> from Helena from two weeks ago: "Being able to compile functions is such a
> big speed up, it's surprising this is not possible with magma or other
> packages.  It makes a huge difference."  My Ph.D. student, Robert
> Bradshaw, turned out to be secretly very good at compilers, and greatly
> improved SageX (which is a fork of a program called Pyrex) for use in SAGE.
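>
> As a rough illustration of what SageX code looks like, here is a
> hypothetical sketch in Pyrex-style syntax (not a function from the SAGE
> library):
>
>     def sum_of_squares(long n):
>         # The typed declarations compile down to plain C longs, so this
>         # loop runs as compiled C rather than as interpreted Python.
>         cdef long i, total
>         total = 0
>         for i from 0 <= i < n:
>             total = total + i*i
>         return total
>
> The same function left as ordinary interpreted Python returns the same
> answer, just far more slowly for large n; that gap is the speed-up Helena
> is describing.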
>
> For much basic arithmetic (floating point and integer/rational
> arithmetic, numerical linear algebra), the two systems rely on *exactly*
> the same C libraries, namely GMP, MPFR, ATLAS, etc.  This is probably
> where most of the assembly code in the Magma core is located -- in GMP --
> which is also used by SAGE.
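>
> For instance, in a SAGE session the default integers are GMP integers
> (a tiny illustrative example, not a benchmark):
>
>     sage: a = 2^10000          # a SAGE Integer, stored as a GMP mpz
>     sage: b = 3^10000
>     sage: a * b == 6^10000     # the multiplication is carried out by GMP's C code
>     True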
>
> To take another example, the exact linear algebra in Magma is (presumably)
> written in C code.  The analogous functionality for SAGE is provided by:
>    (1) basic infrastructure: compiled SageX code that Robert Bradshaw
>        and I wrote from scratch;
>    (2) fast echelon form, charpoly, system solving, etc.: provided by
>        Linbox (http://www.linalg.org/), IML
>        (http://www.cs.uwaterloo.ca/~z4chen/iml.html), soon m4ri by
>        Gregor Bard for matrices over F_2, and both NTL and PARI in some
>        cases (see the sketch below).
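>
> For example, from the SAGE prompt (an illustrative session, not a
> benchmark):
>
>     sage: A = random_matrix(ZZ, 200, 200)   # 200 x 200 random integer matrix
>     sage: E = A.echelon_form()              # handled by the compiled backends
>     sage: E.nrows(), E.ncols()
>     (200, 200)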
>
> Linbox is a powerful C++ library that has been under development since 1999 by
> a group of symbolic algebra researchers (starting with Erich
> Kaltofen).  It has a very clear theoretical basis, and many of the
> algorithms are connected with interesting papers.  SAGE is the first
> system to seriously use Linbox (and we only started recently), so it's
> getting a lot of stress testing from our use.  As an example of how
> Linbox "works", Clement Pernet, one of the main Linbox developers,
> co-authored a paper last year on a new algorithm for fast charpoly
> computation over ZZ. That algorithm is implemented in the newest
> version of Linbox, and when I compared timings on my laptop, it was
> twice as fast as Magma's charpoly over ZZ, and gaining as n-->oo.
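>
> If you want to try such a comparison yourself, the SAGE side looks
> roughly like this (a sketch; cputime() reports CPU seconds):
>
>     sage: A = random_matrix(ZZ, 300, 300)
>     sage: t = cputime()
>     sage: f = A.charpoly()    # dispatched to the compiled Linbox code
>     sage: cputime(t)          # the CPU time the computation took, in seconds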
>
> When writing snippets of code, there are some issues that can make
> SAGE seem much slower than Magma if one doesn't know what one is
> doing and hasn't read much of the documentation.  But asking at
> sage-support usually clears things like this up.
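>
> One typical issue of that kind (a hypothetical illustration, not an
> example taken from this thread): constructing a full SAGE Integer on
> every pass through a tight interpreted loop, when plain Python ints -- or
> a SageX-compiled loop like the one sketched above -- avoid the
> per-element overhead:
>
>     sage: # wraps every i in a SAGE Integer; typically slower in a tight loop
>     sage: s1 = sum(ZZ(i)*ZZ(i) for i in range(10^5))
>     sage: # keeps the loop arithmetic in plain Python ints
>     sage: s2 = sum(i*i for i in range(10^5))
>     sage: s1 == s2
>     True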
>
> Research Mathematician said:
> > I'm sure I am but this is probably because I don't know anything about
> > python. I thought python was being 100% interpreted and magma was
> > being 100% compiled, for example.
>
> Magma is an interpreter.  Part of Magma is written in this interpreted
> language (e.g., 99% of my modular forms code).  Python is an interpreter
> that is itself written in C.  Part of SAGE is written in this interpreted
> language, but most of SAGE is written in various compiled languages.  In
> both cases one gets easy access to compiled code via an interpreter.  PARI
> is the same, except that the PARI system is implemented entirely in C
> (nothing is interpreted).
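>
> To see "compiled code behind an interpreter" from the SAGE prompt (a
> small illustrative session):
>
>     sage: n = 2^64 + 1
>     sage: factor(n)    # the factorization itself runs in compiled library code
>     274177 * 67280421310721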
>
>
>  -- William
>
>
