On Jan 22, 2008, at 19:17, mabshoff wrote:
>
> Hi Justin,
>
> On Jan 23, 3:55 am, "Justin C. Walker" <[EMAIL PROTECTED]> wrote:
>> On Jan 22, 2008, at 6:15 PM, mabshoff wrote:
>>
>> What are the problems that using "-arch" causes? I believe that the
>> endian macros and other related mechanisms will work correctly, as
>> long as you have the correct pieces installed.
>>
>> I have not tried to build "fat" on anything particularly complex, but
>> in principle this will work, because Apple has done it for all of
>> their software (except maybe the kernel; it is delivered 'fat', but I
>> don't know whether it is built this way or the way you suggest below).
>
> I am fairly certain they build universal with -arch flags. The lipo
> approach seems like a rather low-tech workaround for Apple to use,
> especially since the whole -arch flag business comes from Apple
> themselves and isn't even in mainline gcc 4.2 yet.

If I understand this correctly, the use of 'lipo' and the '-arch' flag
are orthogonal. The '-arch' flag tells the driver "gcc" which pieces to
invoke (the details are actually a bit more complicated). For example,
"-arch ppc -arch i386" will run the compiler twice and produce an
"a.out" (or ".o" or ".dylib") with both architectures represented. I
haven't checked the source to see whether this is done essentially by
re-implementing the 'lipo' functionality, or by invoking 'lipo'
directly.

>>> What can be done is the following: Build x86 and ppc independently
>>> on native systems. Then use lipo to combine all libraries and
>>> executables into universal binaries.
>>
>> This will certainly work. The only difference between this approach
>> and doing it on one system is that you have to have the right SDKs
>> installed, and you have to modify the build procedure to build
>> "cross". This involves, in essence, the judicious use of the
>> "-isysroot" and "-syslibroot" flags in the build flags.
>>
>> It may be simpler to build separately and lipo everything, even if
>> it's done on one system.
>>
>> Xcode documentation has a pretty good description of the process.
>
> Got any pointers? Too lazy to dig around ;)

You have two (or maybe more :-}) choices: run Xcode and search the
documentation for 'cross-development', or go to the Apple Developer
website (<http://developer.apple.com>) and search for
'cross-development'. You get more or less the same information, I think
(depending on what Xcode and doc versions you have installed).

>> - are there build differences between Mac OS X on Intel
>>   and on PowerPC (as encoded in spkg-install scripts)?
>
> As far as I can think of, no. I am currently modifying spkgs to do 64
> bit builds of Sage on OSX, and the normal fix is to pass proper [C|
> CXX|CPP|LD]FLAGS, so passing -arch x86 x86-64 ppc ppc64 might be an
> option. Note that I tried that with python 2.5.1 and it doesn't work
> there, and while there are various workarounds posted to get this to
> work, they all fail. But Apple ships a universal python, so we might
> either default to using that on OSX, or somebody needs to go off and
> figure out what they did. In case anybody does: the posix-element.c
> workaround we use in our python spkg breaks on 64 bit OSX on 10.5,
> but the fix isn't too hard. Other issues include distutils for
> Python; i.e., numpy currently fails to build because it seems to be
> missing #ifdefs at a crucial point.

Something I don't know is the extent of 64-bit PowerPC support in 10.4.
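
For concreteness, here is a rough sketch of the two routes discussed
above, as I understand them (untested as written; "hello.c" is just a
stand-in, and the compiler is assumed to be Apple's gcc, which
understands -arch):

    # (a) One "fat" compile: the driver runs the compiler once per -arch
    #     and merges the results into a single Mach-O file.
    gcc -arch ppc -arch i386 -o hello hello.c

    # (b) Separate per-architecture builds, glued together with lipo:
    gcc -arch ppc  -o hello.ppc  hello.c
    gcc -arch i386 -o hello.i386 hello.c
    lipo -create hello.ppc hello.i386 -output hello

    # Either way, check that both architectures actually made it in:
    lipo -info hello
    file hello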
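
And the cross-build side, again only a sketch: the SDK path, the file
names, and the configure triple below are placeholders (not anything
taken from a real spkg), and I haven't verified this against a Sage
build:

    # Assumed location of the 10.4 universal SDK; adjust to whatever
    # Xcode actually installed.
    SDK=/Developer/SDKs/MacOSX10.4u.sdk

    # Compile against the SDK's headers instead of the running system's:
    gcc -isysroot $SDK -arch ppc -c foo.c -o foo.o

    # At link time the linker wants the matching library root:
    gcc -isysroot $SDK -arch ppc -Wl,-syslibroot,$SDK -o foo foo.o

    # For an autoconf-style package, the usual knobs are CC/CFLAGS/LDFLAGS
    # plus --host, e.g. to aim at i386 from a PowerPC machine:
    #   ./configure --host=i386-apple-darwin8 \
    #       CC="gcc -arch i386 -isysroot $SDK" \
    #       LDFLAGS="-Wl,-syslibroot,$SDK"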
Also, building 'universal' (as opposed to separate builds + lipo) is a
bit tricky when you want to build universals that support, say, 10.4
*and* 10.5. For us it may not be that big a deal, since in general we
will build most of the needed libraries ourselves, and there isn't a
big change in Unixy APIs. The GUI apps have a bigger problem, but that
doesn't affect us (ignoring "Cocoa Sage").

>> - can the 'autoconf' scheme work in the cross-development
>>   environment provided by Xcode?
>
> I don't know if that is relevant, since we don't use it and you can
> make Xcode execute some custom build script. Am I misunderstanding
> you somehow?

Possibly :-} I meant that I don't know whether 'configure' scripts, or
the 'autoconf' "script builder", can be made to do cross development,
and in particular be made to detect "i386-apple-darwin" when running
on "powerpc-apple-darwin".

> But in the end it boils down to this: Do people want universal
> binaries, considering the size tradeoff? I do think so, but I hope
> this isn't like the live-CD, where everybody says that it is a good
> idea but when push comes to shove few people step up and use it.

This is a very good point: it would be useful to know the percentage of
a full "bdist" (say) of sage taken up by "executable bits" ("a.out"s,
".a"s, and ".dylib"s). As a SWAG:

    sage 2.10 "bdist":           992284 1k-blocks
    sage 2.10 "executable bits": 381944 1k-blocks

(roughly 38%), and I do mean "SWAG". The latter number is the number of
blocks in use in the "local/bin" and "local/lib" directories, excluding
"local/lib/python" and "local/lib/R", adding back in those numbers for
the executable bits I found in "local/lib/R" (but nothing added in
from "local/lib/python").

Justin

--
Justin C. Walker, Curmudgeon-At-Large
Director
Institute for the Enhancement of the Director's Income
--------
"Weaseling out of things is what separates us from the animals.
 Well, except the weasel."
    - Homer J Simpson
--------