[sage-devel] Problematic symbolic link in sage directory

2008-04-29 Thread Jason Martin

The symbolic link

/home/was/www/sage/pre

points to a non-existent referent, which causes rsync on my mirror to
skip file deletion (I'm using the "safe" mode for link following in
rsync)... which is making my mirror at least grow a little faster than
it should.

Can this sym link be removed?

--jason




[sage-devel] Re: fast vs viable (offline post)

2008-05-01 Thread Jason Martin

I think Sage is awesome.  I've started using it in my research.  I've
been using it to teach classes, and I've even forced my students to
learn Python (and Linux) in the process.

Yes, Sage has lots of bugs, but at least I get to see the code, and if
I'm motivated then I can even fix it myself.  I used to hit a bug a
week with Magma, but I can't see that code, and I'm no longer
motivated to report the problems (because now I'm at an institution
that can't afford a Magma license so I have to use other people's
Magma installations).

We should all try to write good code, good documentation, and good
tests.  When time and money limit what is possible, each developer
just does the best she can, and the rest of us should just say,
"Thanks for the hard work!"

(By the way, thanks for the hard work everyone!  I really appreciate it!!)

As for the "formal methods" and "proof of correctness" approaches:  I
just don't think that they are viable given our current resource
limitations.  I've worked on military projects requiring that level of
verification, and even with multi-million dollar annual budgets it was
painful to achieve high-assurance on projects much smaller than Sage.
So, even though it would be nice to be able to believe the results of
every computation, I think we just have to let the users know, "Sage
makes a good attempt to give you the right answer, but bugs happen, so
double check your computations if you need to rely on them."

Keep up the good work!!

(Sorry, I guess I didn't add anything to the conversation, but it was
more fun than grading finals.)

--jason




[sage-devel] Mirrors question

2008-07-11 Thread Jason Martin

I noticed that on http://www.sagemath.org/mirrors.html most of the
mirrors are listed as "unknown", although they appeared to work just fine
when I checked them out and they are mirroring the sage.math
stuff.  Should we mirror operators be pulling from a different place?

Also, rsync is complaining:
   symlink has no referent: "/home/was/www/sage/dist/src/announce"

Thanks,
jason

Jason Worth Martin
Asst. Professor of Mathematics
http://www.math.jmu.edu/~martin




[sage-devel] Can a 32-bit Python call a 64-bit library via the Python/C API?

2007-02-06 Thread Jason Martin

Hi All,

I'm trying to build a 64-bit Sage for Mac OS X (Tiger).

However, I can't get a 64-bit Python to build because not enough of
the fundamental Apple libraries are 64-bit... and I can't get a 32-bit
Python build to accurately use the 64-bit libraries I can get built.
The problem appears to be that the Python/C API expects function calls
to act like whatever mode it was built in (I don't actually *know*
this; I'm just guessing based on what I'm seeing).

So, it looks like I'll have to wait until Leopard comes out (which is
supposed to have full 64-bit frameworks).  That's rather
disappointing.  I'm hoping that one of you Python experts can tell me
I'm just being silly and show me how to do this?  Anyone?

--jason

--
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Re: Determining the number of CPUs

2007-02-07 Thread Jason Martin

On Linux, grep /proc/cpuinfo for lines beginning with "processor".
Take the line with the biggest number and add one to it.  Kinda
kludgey, but that's the first thing that comes to mind.

On OS X, the command "sysctl hw.ncpu" will give it to you.

On Solaris, I believe that "procinfo" will give it to you, but I
haven't worked on a Solaris box in a while, so that may be old.

On Windows, I have no clue.

I think that on some x86 platforms, you can actually get this
information with the "cpuid" assembly instruction.  I'll look that up
and send you some C code to do it if it's possible.
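
[For illustration, and not the cpuid code promised above: a minimal C
sketch of the /proc/cpuinfo approach, which simply counts the lines that
begin with "processor" (equivalent to "biggest number plus one").  The
helper name ncpus_linux() is made up; on OS X one would run
"sysctl hw.ncpu" instead.]

    #include <stdio.h>
    #include <string.h>

    /* Count the "processor" entries in /proc/cpuinfo (Linux only). */
    int ncpus_linux(void)
    {
        FILE *f = fopen("/proc/cpuinfo", "r");
        char line[512];
        int n = 0;

        if (f == NULL)
            return -1;                  /* no /proc/cpuinfo: not Linux */
        while (fgets(line, sizeof line, f) != NULL)
            if (strncmp(line, "processor", 9) == 0)
                n++;
        fclose(f);
        return n;
    }

    int main(void)
    {
        printf("detected %d CPU(s)\n", ncpus_linux());
        return 0;
    }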

--jason

On 2/7/07, Joshua Kantor <[EMAIL PROTECTED]> wrote:
>
> I was wondering if anybody had any ideas on good (platform
> independent) ways to figure out the number of CPUs a computer running
> sage has available, (short of asking the user).
>
>
>
>
> Josh
>
>
> >
>


-- 
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Identifying number of cores, building 64-bit on OS X, and Linbox

2007-02-07 Thread Jason Martin

1.  Many different web sources seem to indicate that the preferred way
of finding out the number of cores in Linux is just to grep
/proc/cpuinfo as Fernando's code does.  The use of the CPUID
instruction for this purpose is discouraged since it isn't consistent
between AMD and Intel chips, so ignore my previous blurb.

"sysctl hw.ncpu" works in OS X.

I still have no idea how to do it on Windows... googling this isn't
helping me because I don't know the right keywords to search for.  Any
of you Windows programmers out there know how to do it?


2.  Building 64-bit Sage on OS X appears to have hit a wall.  Numerous
sources have verified that 32-bit and 64-bit libraries shall not meet
one another.  Guess we'll just wait for Leopard.


3.  I agree that Linbox rocks!!  I've been perusing the source code,
and they've done a fantastic job making this thing really modular (in
the programming sense) and flexible at each level.  I believe that the
linear algebra library I was proposing can be built within Linbox
(rather than on top of it).  It just needs some C++ methods for
pickling up their data structures for MPI passing, and the basic
routines can be naively MPI-parallelized with minimal effort.  A
second round could replace the naive versions with more
parallel-specific routines; I'm very optimistic.  This could really
speed up the timetable.

--jason

On 2/8/07, William Stein <[EMAIL PROTECTED]> wrote:
>
> Hi,
>
> If anybody wants to help out by building sage-2.1.alpha4, please download
> it from here:
>
>   http://sage.math.washington.edu/home/was/tmp/sage-2.1.alpha4.tar
>
> extract, then do make followed by "make test".  Let me know if it works.
>
> This has both linbox and quaddouble in it, plus a number of small bugfixes
> and new code people have sent me.
>
> NOTE: I am aware that linbox spews info to the screen when
> it is called -- fixing that is the only obstruction I know of
> right now to releasing sage-2.1.   One possibility is to
> hack the linbox package somehow.  (The official way to stop
> output is buggy -- because linbox has been almost entirely
> a research system, and isn't really used in a production way by
> other software yet, so extra debugging output isn't something
> the developers would notice.)   By the way, in many ways
> linbox freakin' rocks.
>
> William
>
> >
>


-- 
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] 64-bit guest OS in Mac OS X VMs

2007-02-12 Thread Jason Martin

Since we were discussing how to provide 64-bit sage on Macs...

Looks like VMware is entering the competition with Parallels to
provide virtual machines for OSX.  The VMware beta is free (for now)
and claims to support 64-bit guest OSes.  I'll try it out later this
week and see if I can get Sage to build and run in a 64-bit Linux
Guest OS.

http://www.vmware.com/products/beta/fusion/

--jason

-- 
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Anyone else try VMware Fusion (VMware for Macs)?

2007-02-17 Thread Jason Martin

Hi All,

I was able to get 64-bit Ubuntu installed and running in a VMware
virtual machine.  Sage appears to build on it, too (that's as far as
I've gotten in testing it).  One problem I'm having is that I'd like
to be able to log in to the virtual machine remotely.  However, I can't
seem to get the networking setup to accept incoming connections (this
is a VMware setup issue, not an OS issue).  Has anyone else played
with this?  I'm able to do it easily with Parallels Desktop.  I'd like
to get it resolved so that any SD3 developers who want to experiment
with it can do so.

--jason

-- 
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Re: ianal but...

2007-02-18 Thread Jason Martin

I just learned what IANAL means, and indeed I am not a Lawyer, but
point 3c of the Microsoft Permissive License appears to directly
conflict with the GPL.

--jason

On 2/18/07, William Stein <[EMAIL PROTECTED]> wrote:
> Hello,
>
> Certain kind researchers at Microsoft Rsearch have code they would like to
> contribute to SAGE.  They are only allowed release it under the Microsoft
> Permissive License, which is described here (and linked to):
>
>
> http://www.microsoft.com/resources/sharedsource/licensingbasics/sharedsourcelicenses.mspx
>
> (1) Is this license GPL compatible?  It seems like it might be.  It is very
> free (much more so than the GPL).  What do you think?  I couldn't find
> anything definitive online...  One thing -- this license is very clear about
> patent issues, which is comforting.
>
> (2) General thoughts?
>
> --
> William Stein
> Associate Professor of Mathematics
> University of Washington
>  >
>


-- 
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Re: ianal but...

2007-02-18 Thread Jason Martin

Oops, strike this comment.  Clearly I am neither a lawyer nor a
careful reader (commas are so tricky).  Having re-read the license
several times, I now cannot see any conflict with the GPL.  However,
several webpages (including Wikipedia's software license list) say
that it does conflict.

I cannot see how the Microsoft Permissive License would cause any
problem in a portion of software included with Sage.

--jason

On 2/18/07, Jason Martin <[EMAIL PROTECTED]> wrote:
> I just learned what IANAL means, and indeed I am not a Lawyer, but
> point 3c of the Microsoft Permissive License appears to directly
> conflict with the GPL.
>
> --jason
>
> On 2/18/07, William Stein <[EMAIL PROTECTED]> wrote:
> > Hello,
> >
> > Certain kind researchers at Microsoft Rsearch have code they would like to
> > contribute to SAGE.  They are only allowed release it under the Microsoft
> > Permissive License, which is described here (and linked to):
> >
> >
> > http://www.microsoft.com/resources/sharedsource/licensingbasics/sharedsourcelicenses.mspx
> >
> > (1) Is this license GPL compatible?  It seems like it might be.  It is very
> > free (much more so than the GPL).  What do you think?  I couldn't find
> > anything definitive online...  One thing -- this license is very clear about
> > patent issues, which is comforting.
> >
> > (2) General thoughts?
> >
> > --
> > William Stein
> > Associate Professor of Mathematics
> > University of Washington
> >  > >
> >
>
>
> --
> Jason Worth Martin
> Asst. Prof. of Mathematics
> James Madison University
> http://www.math.jmu.edu/~martin
> phone: (+1) 540-568-5101
> fax: (+1) 540-568-6857
>
> "Ever my heart rises as we draw near the mountains.
> There is good rock here." -- Gimli, son of Gloin
>


-- 
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] need some beta-testers for a sage gmp patch

2007-02-18 Thread Jason Martin

Hi All,

I've been tweaking the gmp used in sage.  Right now this will only
benefit Linux users who have a core2 processor.  However, I'd like to
make sure that my changes don't break the sage build on other
platforms.  So, I'd appreciate any and all feedback!  Instructions and
the beta gmp spkg are at

http://www.math.jmu.edu/~martin/sage-beta-gmp

let me know what breaks.

Thanks,
jason

p.s.  By the way, I'd love to make this work on a 64-bit Vista/cygwin
platform, but I don't have access to any of those.  So, if you've got
one and you want to try this out, please let me know.

-- 
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Sage mirror scripts

2007-02-19 Thread Jason Martin

If someone who is already mirroring sage could send me scripts for
syncing the mirror and a brief description of the setup, I'd greatly
appreciate it.  I'm also willing to run a sage notebook if it's easy
to set up.  (However, the dinky Pentium 4 that I've set up for this task
might not be powerful enough to handle much computation.)

thanks,
jason

-- 
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] New sage mirror... let me know if you see any errors

2007-02-22 Thread Jason Martin

Just finished getting a new sage mirror set up.  It took nearly 28
hours for the initial sync!!  (Average connection speed was only
around 100 kB/s.)

Anyway, when you have time to kill, go to

   http://modular.math.jmu.edu

and let me know if I've misconfigured anything.

Thanks,
jason

-- 
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Problem building sage-2.2 on OS X (error with MPFI build picking up external MPFR lib)

2007-03-02 Thread Jason Martin

The problem is that the linker is picking up my 64-bit version of
libmpfr.a located in /usr/local/lib instead of the 32-bit libmpfr.a
located in sage-2.2/local/lib.  I fixed it by adding a -Z flag (which
tells the linker not to search the system directories).  There should
be a cleaner fix, but I'm not familiar with the MPFI source tree.

This is probably something we should tell the list about since we want
to make sure that Sage builds completely from the components it gets
distributed with and doesn't try to pick up outside libraries.

--jason

On 3/2/07, Jason Martin <[EMAIL PROTECTED]> wrote:
> I'm seeing the same error.  I'll start investigating it.
>
> On 2/28/07, Clement Pernet <[EMAIL PROTECTED]> wrote:
> > Hello Jason,
> >
> > I am trying to install SAGE on your mac OS X machine, but meet some link
> > errors when compiling mpfi: the mpfr library can not be linked to.
> > Do you have met such a problem ?

-- 
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Re: Problem building sage-2.2 on OS X (error with MPFI build picking up external MPFR lib)

2007-03-05 Thread Jason Martin

1.  Sorry for not replying sooner, but I was pleasantly removed from
technology for the last several days :-)

2.  It's the MPFI code that isn't building, not the MPFR code.

3.  I manually executed the compile line where the Makefile was
failing and added the "-Z" option.  It isn't something that you can
add to the global CFLAGS for the package because you probably will
need to link against system libraries at some point in time (e.g. for
printf, etc.).  I was hoping that someone who knew the code better
could suggest where to tweak the make options so that local libraries
(e.g. sage/local/lib) get searched before system libraries.  The
Makefile is built with automake/autoconf so I don't know exactly where
to tweak it... I guess it's time to learn automake.

4.  I'll take a closer look at it tonight and see if I can find a clean fix.

--jason

On 3/2/07, William Stein <[EMAIL PROTECTED]> wrote:
>
> On Friday 02 March 2007 8:30 am, Jason Martin wrote:
> > The problem is that the linker is picking up my 64-bit version of
> > libmpfr.a located in /usr/local/lib instead of the 32-bit libmpfr.a
> > located in sage-2.2/local/lib. I fixed it by adding a -Z flag (which
> > tells the linker not to search the system directories). There should
> > be a cleaner fix, but I'm not familiar with the MPFI source tree.
>
> Thanks!
>
> Where did you add the -Z flag?  I would like to modify the mpfr
> build script so this doesn't happen to anybody else.
>
>  -- William
>
> >
>


-- 
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Re: some ramblings about the relationship between sage and MAGMA

2007-03-19 Thread Jason Martin

Don't feel like you need to spend too much time defending Sage.  It
speaks for itself.  It's free, open source, and improving daily.  And,
even though I greatly admire "a top research mathematician who happens
to write assembly code to find Hecke operators... now who might that
be," I don't think he has managed a programming project the size and
scope of Sage.  So, although it is nice to get the celebrity
endorsement, I think most people will pick up Sage over Magma simply
because it's free.  In five years, you may find that there are more
person-hours being spent developing Sage than Magma... in the
meantime, I think having well-documented, low-bug code is more important
than beating Magma on every benchmark.

--jason

On 3/19/07, William Stein <[EMAIL PROTECTED]> wrote:
>
> On 3/19/07, a top research mathematician wrote:
> >  put me off sage a bit: he tried it a year or so ago and
> > after a while he realised that he was actually just coding in python
> > rather than sage, and that magma seemed to be much quicker. Maybe this
> > has changed now though. How can sage overtake magma if huge chunks of
> > magma are written in assembly or whatever they do in the core?
>
> You seem to be confused about the relationship between the architectures
> of SAGE and Magma.  The architecture of SAGE and Magma are
> very similar, at least in regards to what you wrote above.
> Both are a combination of a high-level interpreter and
> lower-level compiled code.   The Python interpreter is in some ways
> slightly slower than Magma's interpreter, and in other ways faster.
> Python is a standard mainstream programming language with full support
> for user-defined types, object oriented programming, multiple inheritence,
> etc., and Python is used by millions of people around the world daily for a
> wide range of applications, and is maintained by large group of
> people.  Magma is not mainstream, does not support user-defined types,
> object oriented programming, multiple inheritance, and is used by at
> most a few thousand people daily, and only
> for mathematics.
>
> There is also a SAGE compiler (called SageX), which turns most SAGE code
> into native C code, which is then compiled with a C compiler.  About a third
> of the SAGE library is compiled in this way.  But it is also easily used by 
> end
> users, even from the graphical user interface.  Here's a quote from Helena
> from two weeks ago: "Being able to compile functions is such a big speed up,
> it's surprising this is not possible with magma or other packages.  It
> makes a huge difference."   My Ph.D. student, Robert Bradshaw, turned
> out to secretly be very good at compilers, and greatly improved SageX
> (which is a fork of a program called Pyrex) for use in SAGE.
>
> For much basic arithmetic (floating point and integer/rational
> arithmetic, numerical linear algebra),  rely on *exactly* the same C
> libraries, namely GMP, MPFR, ATLAS, etc.  This is where probably most
> of the assembly code in the MAGMA core is located -- in GMP -- which
> is also used by SAGE.
>
> To take another example, (presumably) the exact linear algebra in Magma is
> written in C code.  The analogous functionality for SAGE is provided by:
>(1) basic infrastructure: compiled SageX code that Robert Bradshaw
> and I wrote from scratch
>(2) fast echelon form, charpoly, system solving, etc.: provided by
> Linbox (http://www.linalg.org/), IML
> (http://www.cs.uwaterloo.ca/~z4chen/iml.html), and soon over F_2,
> m4ri, by Gregor Bard, and both NTL and PARI in some cases.
>
> Linbox is a powerful C++ library that has been under development since 1999 by
> a group of symbolic algebra researchers (starting with Erich
> Kaltofen).  It has a very clear theoretical basis, and many of the
> algorithms are connected with interesting papers.  SAGE is the first
> system to seriously use Linbox (and we only started recently), so it's
> getting a lot of stress testing from our use.   As an example of how
> Linbox "works", Clement Pernet, one of the main Linbox developers,
> co-authored a paper last year on a new algorithm for fast charpoly
> computation over ZZ. That algorithm is implemented in the newest
> version of Linbox, and when I compared timings on my laptop, it was
> twice as fast as Magma's charpoly over ZZ, and gaining as n-->oo.
>
> When writing snippets of code, there are some issues that can make
> SAGE seem much slower than MAGMA, if one doesn't know what one is
> doing and hasn't read much of the documentation.  But usually asking
> at sage-support clears things like this up.
>
> Research Mathematician said:
> > I'm sure I am but this is probably because I don't know anything about
> > python. I thought python was being 100% interpreted and magma was
> > being 100% compiled, for example.
>
> Magma is an interpreter.  Part of Magma is written in this interpreter
> language (e.g., 99% of my modular forms code).  Python is an interpreter
> that is itself written in C.  Par

[sage-devel] Sage rep. at ECCAD?

2007-03-27 Thread Jason Martin

Hi All,

Are any Sage evangelists planning to attend the East Coast Computer Algebra Day?

http://eccad07.washcoll.edu/

--jason

-- 
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Re: Is LiDIA valuable for free software and number theory?

2007-04-23 Thread Jason Martin

I looked at LiDIA a long time ago (like 1999 or 2000) and I remember
being very impressed both with its scope and with its modular (i.e.
recursive) data types and programming dogma.  However, I decided
against using it simply because of its license.  I suspect that other
developers may have made the same decisions and that is why LiDIA
isn't widely used.  If it were GPL'd, then I would certainly find it
extremely useful.

Although Bill is correct in that LiDIA will never be as fast as
from-scratch implementations like FLINT, the advantage of recursive
data types is that you get a lot of code re-use with minimal effort.
Since it's done with a nice object oriented interface, you can change
the implementation at almost any level without having to worry about
propagating changes manually through the code.

LiDIA in GPL'd form would also be very useful to me personally because
LinBox already has hooks built in to use LiDIA... so any data
structure (e.g. number fields...) added to LiDIA gets propagated into
LinBox with minimal effort.

However, before I spend any time writing code for LiDIA, I want to see
it released under GPL.

--jason

On 4/23/07, Bill Hart <[EMAIL PROTECTED]> wrote:
>
> It would be very valuable if LiDIA were GPL'd! I re-skimread the
> entire 700+ page manual last night.
>
> I'm not certain how many people actually use LiDIA, but my guess is
> not that many. And I really don't know why this is. It is very well
> documented (if not a little too verbosely, and without sufficiently
> many examples), has a wide coverage and is actually structured very
> well. It is a particular shame that it is not widely used given the
> amount of man years that has gone into the project.
>
> My feeling is that it lies between two paradigms. It is neither a
> minimalist superfast C library, nor is it a computer algebra system
> with a nice front end (or at least I don't know of one). So it won't
> get used like the Pari/GP package because of the lack of front end,
> and it wouldn't be my choice if I wanted the greatest possible speed
> if I were writing a C program. For one thing it is not necessarily
> easy to tell which parts of the library are going to be superfast and
> which are going to use generic routines.
>
> However, as far as I can tell, all LiDIA types are recursive on
> account of the generic programming used and everything but p-adics
> seems to be there. It has memory management, error handling and signal
> handling.
>
> It is even possible to do some things quite fast. I realised last
> night that one could use the polynomials over prime fields to
> implement fast polynomial multiplication if one so desired. Victor
> Shoup seems to have been involved too, so perhaps the specialised
> polynomials over bigints allow for asymptotically fast polynomial
> multiplication, which I imagine is on par with NTL (though I haven't
> checked this). My impression had been that only the naive polynomial
> multiplication routine had been implemented, but I think I was looking
> at the generic routine not the specialised polynomials over bigints
> routine.
>
> There are various factoring routines available, including (if you look
> in the right place) p-1, p+1, ECM, MPQS and others. However, I
> understand that some of the Pari routines are souped up versions of
> these (e.g. the quadratic sieve in Pari came from LiDIA), so I don't
> expect them to be dazzlingly fast.
>
> The algebraic number theory package seems to contain quite a bit of
> stuff. There's binary quadratic forms, quadratic number fields,
> general number fields, even factorization into ideals and routines for
> working with ideals and elements of number fields. It doesn't have as
> much as Pari, and it is all restricted to absolute extensions of Q as
> far as I can tell, but it doesn't look too bad.
>
> The elliptic curve package has quite a lot of stuff in it, even down
> to implementation of computing Siksek bounds I think. It also has
> stuff for crypto, point counting, generating curves using complex
> multiplication, etc. I'm not sure I'd go so far as to say it contains
> the best implementations of elliptic curve stuff anywhere, but it
> certainly does contain quite a bit of good stuff. I see Nigel Smart,
> John Cremona and others have contributed code.
>
> It doesn't have any p-adic routines obviously, and there aren't
> routines for converting from cubics of genus 1 to Weierstrass elliptic
> curves, I don't think.
>
> There doesn't seem to be a way of resurrecting LiDIA by replacing all
> the basic arithmetic code in it with faster stuff. One can replace the
> kernel with a custom kernel, as far as I can see, but this just
> wouldn't cut it. One would need to replace the entire bigint package,
> and all the other basic packages it sits on (doubles, complex
> arithmetic, reals, rationals, etc). But once one did that, everything
> else on top would run faster automatically because of the way it's
> written. But I'm not personally enamoured of

[sage-devel] Re: SEP: Valgrind & Sage integration: Hunting memory leaks

2007-08-21 Thread Jason Martin

Hi Bill,

What you may be seeing is Valgrind not being able to detect the memory
aliasing you're using in that complex loop condition ((i <
test_mpn_poly->limbs) && (result == 1)).  If you can re-create this
exact error using a similar snippet of code in a tiny example program,
it would be worth sending to the Valgrind folks.

Or, it could be that Valgrind is detecting when your
test_mpn_poly->limbs hasn't been set :-)
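
[A tiny, hypothetical reproducer of the sort suggested above (nothing
here is FLINT code): the branch reads heap memory that was never
written, which Valgrind flags as "Conditional jump or move depends on
uninitialised value(s)".]

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        int *flag = malloc(sizeof *flag);   /* allocated, never written */

        if (*flag > 0)                      /* branch on uninitialised data */
            printf("positive\n");

        free(flag);
        return 0;
    }

Compiling with -g and running under valgrind should make the report
point at the "if" line.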

--jason

On 8/21/07, Bill Hart <[EMAIL PROTECTED]> wrote:
>
> Sure, that's precisely what I think the error means. I'm just saying I
> have code where I don't see anything like this, but valgrind still
> reports the error.
>
> The code is quite complex I'm probably just missing something
> somewhere. I mean here is essentially the part which causes the error:
>
> result = 1;
>
> for (unsigned long i = 0; (i < test_mpn_poly->limbs) && (result == 1);
> i++)
> {
>  result = (coeff1[i] == coeff2[i]);
> }
>
> This is run after setting all the entries in coeff1 to equal those in
> coeff2. Essentially that code looks as follows:
>
>FLINT_ASSERT(poly->limbs >= size);
>copy_limbs(poly->coeffs+n*(poly->limbs+1)+1, x, size); // size
> limbs
>poly->coeffs[n*(poly->limbs+1)] = sign; // first limb
>if (poly->limbs > size)
>clear_limbs(poly->coeffs+n*(poly->limbs+1)+size+1, poly->limbs-
> size); // remaining limbs
>
> In short I don't see how there is scope for the error valgrind
> reports. Both i and result are initialised and every entry of coeff1
> is set to something and test_mpn_poly->limbs is set to something when
> the polynomial is initialised. Nevertheless, in context at least, this
> code gives the error mentioned.
>
> Bill.
>
> On 20 Aug, 20:07, mabshoff <[EMAIL PROTECTED]
> dortmund.de> wrote:
> > On Aug 20, 8:10 pm, Bill Hart <[EMAIL PROTECTED]> wrote:
> >
> > > Getting rid of memory leaks also speeds up code dramatically as I
> > > found out recently. When new memory is allocated by the kernel, it
> > > isn't quite ready to be used. As you begin writing to it, pages of
> > > roughly 4kb at a time initiate an interrupt which the kernel has to
> > > deal with, called a minor page fault. These take quite some time to
> > > deal with. So using more and more memory results in more and more
> > > minor page faults. So there is definite benefit in killing memory
> > > leaks, even less serious ones.
> >
> > Hey Bill,
> >
> > > However, there is one "error" which valgrind reports on my own code
> > > from time to time which I have been unable to determine the source of.
> > > It says something like "conditional jump depends on uninitialised
> > > data". I have stared at code for hours trying to determine where these
> > > errors come from. I still have code for which I have been unable to
> > > eliminate such errors.
> >
> > That usually happens in the following circumstance:
> >
> > int i; // this is initialized to zero on any sane system, i.e.
> > anywhere but Windows :)
> >
> > if (i>0)
> >do something;
> >
> > Now valgrind assumes that "conditional jump depends on uninitialised
> > data", i.e. "i". Well, but it is zero anyway would one say. And you
> > would be correct in 99% of all cases, but I fixed a bug very similar
> > to the above in LinBox about 4 weeks ago that caused a crash on Debian
> > unstable's gcc but not with the other 10 compilers I tried. Lesson
> > lerned. The assigment to zero puts i into another segment, so many
> > people avoid it.
> >
> > > I understand the meaning of the error as such, but couldn't determine
> > > why valgrind thought that part of my code contained such an error.
> > > Perhaps valgrind is not infallible, or perhaps I've been missing
> > > something.
> >
> > > Bill.
> >
> > Cheers,
> >
> > Michael
> >
> > 
>
>
> >
>




[sage-devel] Re: gmp, mpfr, ecm

2007-09-12 Thread Jason Martin

I'm currently testing several of the major GMP patches with gmp-4.2.2.
I should have some results by this weekend.

On 9/12/07, mabshoff <[EMAIL PROTECTED]> wrote:
>
>
>
> On Sep 12, 10:37 pm, "William Stein" <[EMAIL PROTECTED]> wrote:
> > On 9/12/07, mabshoff <[EMAIL PROTECTED]> wrote:
> >
> >
> >
> >
> >
> > > Kate Minola wrote:
> > > > FYI - the following have recently
> > > > been released:
> >
> > > >   gmp-4.2.2
> > > >   mpfr-2.3.0
> > > >   ecm-6.1.3
> >
> > > > Kate
> >
> > > Hello Kate,
> >
> > > #541, #542 and #642 in sagetrac respectively. I have to admit that  I
> > > added #642 only two hours ago. I expect that I will spend some time on
> > > those tickets during Bug Day 3 (probably the 20th of September)
> >
> > Yep.  However, one thing I want to add is that my understanding
> > is that the GPL-only patches that make xgcd/gcd *vastly* faster
> > for large numbers might only work with gmp-4.2.1, at least
> > without a lot more work to port it, and if so we should wait
> > for that patch to be updated (especially if gmp-4.2.2 is merely
> > a build improvement over gmp-4.2.1).  This sort of thing is
> > just really cool:
> >
> > BEFORE the patch:
> > sage: k=200; n = QQ.random_element(2^k); m=QQ.random_element(2^k)
> > sage: time a=n+m
> > CPU times: user 5.69 s, sys: 0.24 s, total: 5.93 s
> >
> > AFTER the patch:
> > sage: k=200; n = QQ.random_element(2^k); m=QQ.random_element(2^k)
> > sage: sage: time a=n+m
> > CPU times: user 0.99 s, sys: 0.00 s, total: 0.99 s
> > sage: len(str(n))
> > 1204120
> >
> > Having fast arithmetic with million digits rational numbers can be really
> > important...
> >
>
> :)
>
> I consider the gmp update the tricky one. ecm should be straight
> forward. I was concerned about mpfr but according to the SPKG.txt in
> the current mpfr.spkg it is the "vanilla" distribution. I dropped in
> mpfr-2.3.0 in a couple sage installations for fun last week and "-
> testall" with the new release passed on all of them. So we might take
> care of ecm and mpfr soon and sort out the gmp issues later.
>
> > Presumably mpfr and ecm would both work with gmp-4.2.1 if we have
> > to stay with it for some amount of time.
> >
> >  -- William
>
> Cheersm
>
> Michael
>
>
> >
>


-- 
--09-F9-11-02-9D-74-E3-5B-D8-41-56-C5-63-56-88-C0--
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Re: Burlington, VT on May 30-31, 2008

2007-09-20 Thread Jason Martin

I'm interested.  I am currently teaching a discrete math class where I
am using Sage as an integral component of the course.

The first project, which was due today, had the students finding
primes in consecutive digits of "e"... like the Google competition,
but with a catch: they had to prompt the user for a base, b, and a
number of digits, N, and then search the base-b digits of e for
N-digit primes.
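
[A rough C sketch, not the students' code, of the kind of search
described above: compute e with MPFR, peel off its base-b fractional
digits, and test each N-digit window with GMP's probable-prime test.
The base, digit counts, and names are all made up for illustration.]

    #include <stdio.h>
    #include <gmp.h>
    #include <mpfr.h>

    int main(void)
    {
        enum { M = 500 };                       /* how many digits of e to search */
        unsigned long b = 10, N = 10;           /* base and prime length */
        char digit[M];
        unsigned long i, j;
        mpfr_t e, frac;
        mpz_t window;

        mpfr_init2(e, 8 * M);                   /* generous working precision */
        mpfr_init2(frac, 8 * M);
        mpfr_set_ui(e, 1, MPFR_RNDN);
        mpfr_exp(e, e, MPFR_RNDN);              /* e = exp(1) */
        mpfr_frac(frac, e, MPFR_RNDN);          /* keep the fractional part */

        for (i = 0; i < M; i++) {               /* peel off base-b digits */
            mpfr_mul_ui(frac, frac, b, MPFR_RNDN);
            digit[i] = (char) mpfr_get_ui(frac, MPFR_RNDZ);  /* truncate = floor */
            mpfr_sub_ui(frac, frac, digit[i], MPFR_RNDN);
        }

        mpz_init(window);
        for (i = 0; i + N <= M; i++) {          /* slide an N-digit window */
            mpz_set_ui(window, 0);
            for (j = 0; j < N; j++) {
                mpz_mul_ui(window, window, b);
                mpz_add_ui(window, window, digit[i + j]);
            }
            if (mpz_probab_prime_p(window, 25)) /* probable prime */
                gmp_printf("digits %lu..%lu: %Zd\n", i + 1, i + N, window);
        }

        mpz_clear(window);
        mpfr_clear(e);
        mpfr_clear(frac);
        return 0;
    }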

--jason


On 9/20/07, Hamptonio <[EMAIL PROTECTED]> wrote:
>
> I'm interested!
>
> My first thought is I could speak on using sage in bioinformatics/
> computational biology courses, although if the organizers had a
> particular focus some of my other interests might be more relevant
> (polytopes, teaching ODEs and calculus with sage, celestial
> mechanics).
>
> Marshall
>
> On Sep 20, 7:59 am, "William Stein" <[EMAIL PROTECTED]> wrote:
> > Hello,
> >
> > If there is a Sage user/developer who is interested in representing
> > Sage -- especially for purposes of teaching -- at an MAA, etc., meeting in
> >
> >   ** Burlington, VT on May 30-31, 2008, **
> >
> > please send me an email.Certainly people on the organizing committee
> > for that meeting _may_ be interesting in putting together a Sage-related
> > component, if there were a speaker or speakers available.
> >
> >   -- William




[sage-devel] Re: gmp 4.2.2 LGPL V3 issues and other minor tidbits

2007-09-23 Thread Jason Martin

Some thoughts:

1.  I've been doing some performance comparisons on GMP 4.2.2 with the
patches that Sage uses, and I haven't seen any remarkable differences
between 4.2.2 and 4.2.1.  Granted, I have only tested Linux on
AMD64/Intel64 and OS X on Intel64.  Perhaps some other platforms have
a greater difference.  (By the way, the GPL patches for gcd/xgcd seem
to work just fine.)

It is nice that it will compile under OS X now without patching... and
it even builds dynamic libraries.  It is somewhat slow without
patches, though.


2.  I suspect that the GMP developers were very deliberate in their
license choice and will not release it under "LGPLv2 or greater".
Much of the v2/3 license debate hinges on DRM issues: public key
crypto is a big part of most DRM systems, and GMP is a natural choice
for implementing public key crypto... but it can't hurt to ask.


3.  How much of Sage is under "v2 Only"?  That's the only portion that
should cause problems, isn't it?

--jason

On 9/23/07, William Stein <[EMAIL PROTECTED]> wrote:
>
> On 9/23/07, Mike Hansen <[EMAIL PROTECTED]> wrote:
> > It seems odd that closed source software could use GMP under the
> > LGPLv3, but that a GPLv2 project could not.  How tightly integrated is
> > the GMP stuff?  Aren't we pretty much just using it as a library?
>
> We are just using it as a library.  The problem isn't LGPLv3,
> but GPLv2 itself! But please see
> http://gplv3.fsf.org/dd3-faq
> where it is made crystal clear that in fact a GPLv2 project can't
> even use an LGPLv3 library in library-only mode.
>
> There is a discussion here:
>   http://lwn.net/Articles/241065/
>
> In short, Magma and Maple can use GMP under LGPLv3, but
> Sage can't, because Sage is GPLv2, and the GPLv2 specifically
> disallows linking against libraries that are more restrictive
> (except things like the C library).
>
>  -- William
>




[sage-devel] gmp 4.2.2 vs. 4.2.1 performance

2007-09-23 Thread Jason Martin

On 9/23/07, William Stein <[EMAIL PROTECTED]> wrote:
>
> On 9/23/07, Jason Martin <[EMAIL PROTECTED]> wrote:
> > Some thoughts:
> >
> > 1.  I've been doing some performance comparisons on GMP 4.2.2 with the
> > patches that Sage uses, and I haven't seen any remarkable differences
> > between 4.2.2 and 4.2.1.  Granted, I have only tested Linux on
> > AMD64/Intel64 and OS X on Intel64.  Perhaps some other platforms have
> > a greater difference.  (By the way, the GPL patches for gcd/xgcd seem
> > to work just fine.)
>
> Is GMP-4.2.1 with the gcd/xgcd patch vastly faster than GMP-4.2.2 at
> what the gcd patch is for (i.e., gcd's of million digit numbers)?

I didn't test that.  My test was that I built gmp-4.2.2 with the gcd
patches, and then I built Pari with it and all the tests passed.  I
didn't do timing tests with the gcd patches because I didn't expect any
differences compared to gcd calculations done with a patched 4.2.1 (see below).
The gcd routines all rely on the underlying mpn routines (which
implement the very basic add, sub, mul, div functions).  Since the gcd
patches are the same, they won't change in speed if the underlying mpn
routines don't change.  The only issue I was concerned with was whether
the gcd patches would compile with the new build scripts in 4.2.2.
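
[For reference, a minimal sketch, not from the thread, of the sort of
timing test William is asking about: mpz_gcd on two random integers of
roughly a million decimal digits.  The bit size and names are
arbitrary.]

    #include <stdio.h>
    #include <time.h>
    #include <gmp.h>

    int main(void)
    {
        mpz_t a, b, g;
        gmp_randstate_t rs;
        clock_t t0, t1;

        mpz_init(a); mpz_init(b); mpz_init(g);
        gmp_randinit_default(rs);

        mpz_urandomb(a, rs, 3400000);   /* ~10^6 decimal digits each */
        mpz_urandomb(b, rs, 3400000);

        t0 = clock();
        mpz_gcd(g, a, b);
        t1 = clock();

        printf("gcd of two %lu-digit numbers took %.2f s\n",
               (unsigned long) mpz_sizeinbase(a, 10),
               (double) (t1 - t0) / CLOCKS_PER_SEC);

        mpz_clear(a); mpz_clear(b); mpz_clear(g);
        gmp_randclear(rs);
        return 0;
    }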

> > It is nice that it will compile under OS X now without patching... and
> > it even builds dynamic libraries.  It is somewhat slow without
> > patches, though.
>
> Do you mean that GMP-4.2.2 is somewhat slow without patches?
> Or that GMP-4.2.1 is?  Or?

Sorry, I should have been more specific:

On OS X (Intel, Core2)
--
What I meant was that 4.2.2 will compile on OS X without any patching,
and it will build dynamic libraries.  However, the default built of
4.2.2 under OS X is very slow.  Patching it with my patches (in 64bit
mode) gives the same performance as the patched 4.2.1 code (tested
with gmpbench), and you can build dynamic libraries.

On Linux
-
On AMD64 machines, there is no change.  Using Gaudry's patches gives
you a huge speed up just like with 4.2.1.  On Core2 machines, 4.2.2 is
marginally faster than 4.2.1 due to a new mpn parameters header file
being included in the code.  If you use my patches, then 4.2.2 is just
as fast as 4.2.1 with my patches (a 30-40% speed up over unpatched
versions).

A quick analysis of the GMP-4.2.2 code shows that almost all of the
updates have been in the build scripts.  There is a new mpn parameters
header file which is just the result of "make tune" being run on a
Core2, but that isn't very significant.

Bottom line:  Performance-wise, the *patched* version of 4.2.2 will be
roughly equivalent to the *patched* version of 4.2.1.  The patches are
compatible with both 4.2.1 and 4.2.2.




[sage-devel] Re: gmp 4.2.2 LGPL V3 issues and other minor tidbits

2007-09-23 Thread Jason Martin

My vote would be to change the sage license to "GPLv2 or later" and
try to get the Singular developers to do likewise.  Mainly because
that is less work.

Does changing Sage to "v2 or later" require Sage to adopt future GPL
changes?  My interpretation is that it simply gives users the option
to redistribute it according to later versions of the GPL.  It
doesn't obligate Sage to adopt those future changes, does it?

--jason

> The *only* options I can think of right now are:
>(1) Stick with GPLv2 and *fork* every FSF project Sage depends on:
>  * GMP
>  * GSL
>  * GNUtls (openssl replacement)
> Anything else?
>
>(2) Change the Sage license to GPLv2 or later, and get clarification
>  about the same issue from the Singular developers.
>
> More details:
>
>(1) Make a stand and stick with GPLv2.  This will mean in the long
> run that we will have to FORK, never ever again ship updated
> versions of, or remove dependence on every FSF-owned project.
> This is definitely possible, since the projects are currently very
> mature:
>  * GSL -- they just released GSL v1.10 under GPLv3 only (sage
> currently includes GSL v1.9),
>  * GMP -- they just released GMP v4.2.2 under GPLv3 only (sage
> currently includes on GMP v4.2.1),
> It appears that a huge number of FSF/GNU projects are having
> new releases under GPLv3 *only* right now (not GPLv2 or later).
> I.e., FSF is very aggressively pushing their license in a technical sense.
>
>   (2) We change the Sage license to GPLv2 or later, and change or
> eliminate all components of Sage that are GPL v2 only.
> As far as I can tell Singular is the only 3rd party component of
> Sage that is in fact clearly GPL v2 only.   Please correct me if I'm
> wrong about that.
>
> 
>
> I think both options are viable, since I suspect that the only projects
> Sage uses that will switch to GPLv3 only are the FSF projects -- most
> projects will just stick with "GPLv2 or greater".  Option (1) means more
> work for us, though GSL is pretty much *done* -- it hasn't changed
> much in years, and likewise GMP hasn't had anything interesting
> happen release-wise in nearly 2 years. (The most interesting GMP work has
> been outside the GMP project.)
>
> If you can think of a genuinely viable third option, or have strong
> feelings about which of 1 or 2 is better, now is the time to speak
> up.  I have put off this GPLv3 license discussion repeatedly during
> the last year when it came up.  But now it can't be ignored anymore.
> I greatly value everybody's feedback.
>
>  -- William




[sage-devel] Re: gmp 4.2.2 LGPL V3 issues and other minor tidbits

2007-09-25 Thread Jason Martin

> Actually GMP is far from stale.  Anyway, I put the chances of a
> viable GMP fork in the next year at 1% (see below).
>
> It would be very useful to figure out what the situation is with Singular's
> licensing plans.  Do they have a mailing list or something?
>
>  -- William
>
> Why I think GMP won't be  fork: Torbjörn is
> really the organizing force behind GMP, as far as I can tell, and he
> seems completely OK with LGPLv3 as a license for GMP.I don't know
> of
> any serious players who have the necessary resources and who are
> interested in forking GMP, and the license change from LGPLv2 to LGPLv3
> likely has no impact on Maple and almost none on Magma.Basically
> before any of this LGPLv3 stuff, various people have made noise about
> forking GMP for more serious reasons, and nothing happened, so I doubt it
> would happen now.

I also have serious doubts that GMP will be successfully forked.  I
looked into doing this myself, and the simple, painful truth is that
the build scripts in GMP are more complicated than the mathematics.
(And, several times a year, someone on the GMP mailing lists suggests
forking the project, to which Torbjörn always responds, "go ahead,"
but I can't find any successful GMP forks on the web.)  Maintaining
build scripts for assembly level code for hundreds of platforms has
got to be a painfully tedious task that no one else wants to take on.
Torbjörn may be somewhat tough to work with, but you have to give the
guy credit: he's probably one of the best all around programmers on
the planet.

So, I think that there is only one realistic option: Sage must go "v2 or later".

--jason




[sage-devel] Re: SAGE on Leopard

2007-10-29 Thread Jason Martin

Has anyone tried a 64-bit build on Leopard?  I just picked up Leopard
today, so I'll hopefully have a running Leopard machine tomorrow on
which to start playing.  Just wondering if, for example, python can
build in 64-bit mode on Leopard.

--jason

On 10/29/07, William Stein <[EMAIL PROTECTED]> wrote:
>
> On 10/29/07, Justin Walker <[EMAIL PROTECTED]> wrote:
> >
> > I figured I'd give this a try.  The first to go is 'flint', with this
> > error:
> >
> > g++ -single_module -fPIC -dynamiclib -o libflint.dylib mpn_extras.o
> > Z.o memory-\
> > manager.o Z_mpn.o ZmodF.o ZmodF_mul.o ZmodF_mul-tuning.o fmpz.o
> > fmpz_poly.o mpz\
> > _poly-tuning.o mpz_poly.o ZmodF_poly.o -lgmp
> > ld: duplicate symbol ___gmpz_abs in Z.o and mpn_extras.o
> >
> > Any thoughts?  I'm doing the old 'touch and go' builds to see what
> > falls apart where.
>
> I was able to do a 100% build of Sage on Leopard yesterday.
> See
>
>http://trac.sagemath.org/sage_trac/ticket/1005
>
> for unpleasant work-arounds for all problems.  I also
> posted a binary built on my laptop under Leopard here:
>http://sagemath.org/SAGEbin/apple_osx/intel/leopard/
>
>
> --
> William Stein
> Associate Professor of Mathematics
> University of Washington
> http://wstein.org
>
> >
>




[sage-devel] Sage mirror modular.math.jmu.edu down through this weekend

2007-11-06 Thread Jason Martin

The sage mirror I run, modular.math.jmu.edu, currently has a full disk,
so it will be down/out-of-date through this weekend, when I'll
swap out the disk for a newer one.

--jason




[sage-devel] Improved GMP code for Core 2 machines (Including Core 2 Macs)

2006-12-02 Thread Jason Martin

Hi All,

I've released a new patch for GMP on Core 2 machines.  The patch works
for Linux and Mac OS X (and probably other Unix type systems that run
on Core 2 machines, but I haven't tested them).  The patch includes an
installation script which detects the processor type and will only
install if it finds a Core 2 processor (so hopefully it will be easy
to include in build scripts).

The patch is available on my web page:

   http://www.math.jmu.edu/~martin

I've gotten GMPbench scores of 8263 with it on my 2.66GHz Mac Pro
(Xeon/Woodcrest cores), which means that with this patch the new Intel
chips are actually comparable to AMD CPUs for computational number
theory.

If you use it, let me know how it works for you.  If you need stuff
added to make it useful, please let me know.

Thanks,
jason

---
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Re: google and "sage dot math"

2006-12-04 Thread Jason Martin

Another alternative for manipulating Google is to get everyone you
know to include links to (a single) Sage webpage.  Also, putting even
a short entry in Wikipedia (with a link to the same Sage page) seems
to help tremendously.  Try to get folks to link to sage from their
"official" pages (for example the pages that are listed on math
department home pages), as Google values those "authoritative" links
more.

--jason

On 12/4/06, William Stein <[EMAIL PROTECTED]> wrote:
>
> Hi,
>
> Searching for "sage" on google today dropped from page 2 all the way to
> page 4! Yuck.
> However, seaching for "sage math" or "sage.math" brings up the SAGE
> webpage as
> the top hit.  Thus I hope everyone here will tell people who they meet to
> search
> for "sage.math" when they want to find SAGE.  Perhaps, we should go so far
> as
> to officially name the software "sage.math" (pronounced "sage dot math"),
> though
> of course it will often be abbreviated as "sage".  Thoughts?!!
>
> William
>
> >
>


-- 
---
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Re: SAGE development machines

2006-12-13 Thread Jason Martin

Another consideration, since you've already decided on a Mac Pro, is
to get Parallels (a commercial virtualization package for OS X).  It
will allow you to run all the Linux and Windows distributions you want
on the Mac Pro.  Plus, if you wait until late Jan. to purchase the
machine, it will probably have two quad-core Xeons in it... that's 8
cores, which is very well suited to virtualization.  Load that thing up
with (third party!) RAM and you'll have all the hardware you need for
Intel Core 2 testing.


On 12/13/06, Fernando Perez <[EMAIL PROTECTED]> wrote:
>
> On 12/13/06, William Stein <[EMAIL PROTECTED]> wrote:
>
> > No.  But timing and performance are primarily hardware issues
> > not flavor-of-linux issues.  Vmware would *only* be used to
> > provide access to a range of Linux disributions for build testing
> > (and this could already be set up on sage.math).  For performance
> > testing VMware wouldn't be used.
>
> I've had limited, but extremely positive experiences with VMWare (keep
> in mind it's free-as-in-beer now).  It's easy to set up each image
> with a dedicated IP on a private subnet and have a user-accessible
> startup script that does something like:
>
> start_vmware_$OSFLAVOR
> sleep($BOOT_TIME)
> ssh $VIRTUAL_IP sage_test
> ssh $VIRTUAL_IP sudo shutdown
>
> This way, using a big machine with fast disks, 4 or more cores and
> gobs of RAM will give you an always-on build/test farm that  can do
> multi-OS tests far more conveniently than anything else out there.
>
> I can't speak for Xen (never used it, I'm sure it's good too), but I'm
> a MAJOR fan of virtualization technologies these days.  They really
> cut a lot of pain out of many common problems, and they beat dual
> booting by a million miles (unless you need direct OpenGL or other
> such hardware access, which is not the case here).
>
> Cheers,
>
> f
>
> >
>


-- 
---
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin




[sage-devel] Re: sage-2.0 !

2007-01-04 Thread Jason Martin


Hi All,

Just to chime in a little bit here.  I prefer (a) (the "integrate it
into PARI" option) because I'm currently working on C/OpenMPI code to
perform a bunch of the PARI linear algebra stuff on parallel systems.
I should have flaky, halfway-working code by mid-February, but my target
is to make many of the routines drop-in replacements for the
corresponding PARI routines.  So, hopefully, code that uses those
routines will be easily parallelizable.

By the way, if anyone is at the AMS-MAA joint meetings right now,
leave me a message at the message board (email is kind of spotty
here).

--jason

---
Jason Worth Martin
Asst. Prof. of Mathematics
James Madison University
http://www.math.jmu.edu/~martin
phone: (+1) 540-568-5101
fax: (+1) 540-568-6857

"Ever my heart rises as we draw near the mountains.
There is good rock here." -- Gimli, son of Gloin

On 1/4/07, David Harvey <[EMAIL PROTECTED]> wrote:



> On Jan 4, 2007, at 6:57 PM, William Stein wrote:
>
>> On Thu, 04 Jan 2007 15:43:53 -0800, David Harvey
>> <[EMAIL PROTECTED]> wrote:
>>> (a). This is my own personal bias, because I can see step-by-step
>>> how it could be done; it is straightforward. PARI integration
>>> sounds much harder. For a start it's not even clear to me how we
>>> would be representing data. PARI already has its own data
>>> structure for representing polynomials.
>>
>> I'm not sure what Bill Hart had in mind, but the following (below)
>> is what I have in mind when I imagine PARI/Flint integration.
>
> []
>
> OK, so basically you're saying, find the part of PARI that gets
> called whenever polynomials need to be multiplied, add a line like
> "if (size > 42) ..." which converts to FLINT format, runs the
> multiplication, and converts back to PARI format. So the integration
> you have in mind is not all that tight. I guess this would be doable,
> and I agree it shouldn't be too difficult.
>
> David



