On Sun, Mar 10, 2013 at 09:00:02PM -0400, Ilya Kaliman wrote:
> Hello, hackers!
>
> I have a strange problem with a big executable. I have a piece of scientific
> software with a C++ module in it containing a LOT of template code. When
> compiled, this module produces a 450 MB archive file (w/o debugging symbols).
> Now, when I compile the program without this module everything works
> perfectly. With this module turned on, the linker produces an executable
> (around 180 MB in size) without any errors or warnings. But when I try to
> start the final program, zsh just says: abort. ldd on this exe says:
> ./a.out: signal 6.
>
> I watched the memory consumption during linking and it doesn't seem to
> exhaust all available memory (the linker seems to stop allocating at around
> 2 GB). I've also tried to enable --no-keep-memory for ld, with no luck -
> linking still produces no errors, but the resulting executable is unusable.
>
> I've tried it on 9.1 and 10-CURRENT with both gcc/g++/ld from the base
> system and from ports (gcc 4.7.3, binutils 2.23.1), and with clang.
>
> I've tried to build some of the bigger ports like chromium (just in case):
> all works fine.
>
> Everything works on linux, though (with the same gcc/ld). With debugging
> symbols the exe is around 1 GB, without them it's around 200 MB. It works
> fine in every case, with different optimization levels.
>
> Any ideas how to solve this?
For a start, it would be nice to provide some useful information along with, or even instead of, the long story. What is the architecture? Show at least the output of size(1) on the final binary. Show the exact shell message from the attempt to run the binary. Show the ktrace/kdump of the start attempt.

As a guess, look at the sysctl kern.maxtsiz and compare it to the text size reported by size(1). Do the same for the initialized data segment and kern.maxdsiz.
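For example, something along these lines would gather all of that (a sketch only; ./a.out is the name from the report, and the default limits differ per release and architecture):

    # sizes of the text, data, and bss segments of the executable
    size ./a.out

    # kernel limits on the text segment and the initialized data segment
    sysctl kern.maxtsiz kern.maxdsiz

    # trace the failing start; kdump shows where the exec/startup aborts
    ktrace ./a.out
    kdump | tail -n 40

If size(1) reports a text segment larger than kern.maxtsiz, the kernel will refuse to activate the image, which would be consistent with the abort seen here; the limit can be raised via the corresponding sysctl (or loader tunable, depending on the release).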