On Jul 30, 2012, at 12:48 PM, Paweł Jaromin wrote:
> make all
> Building file: ../src/snd_0.1.c
> Invoking: GCC C Compiler
> mpicc -I/usr/include/mpi -O0 -g3 -Wall -c -fmessage-length=0 -MMD -MP
> -MF"src/snd_0.1.d" -MT"src/snd_0.1.d" -o "src/snd_0.1.o"
> "../src/snd_0.1.c"
> ../src/snd_0.1.c:24:
normal MPI compiling:
Build of configuration Debug for project snd_0.1
make all
Building file: ../src/snd_0.1.c
Invoking: GCC C Compiler
mpicc -I/usr/include/mpi -O0 -g3 -Wall -c -fmessage-length=0 -MMD -MP
-MF"src/snd_0.1.d" -MT"src/snd_0.1.d" -o "src/snd_0.1.o"
"../src/snd_0.1.c"
../s
I built from a tarball, not svn. In the VERSION file I have
svn_r=r26429
Is that the information you asked for?
Daniel
users-boun...@open-mpi.org wrote on 07/30/2012 04:15:45 PM:
>
> Do you know what r# of 1.6 you were trying to compile? Is this via
> the tarball or svn?
>
> thanks,
>
> --
FWIW: the rmcast framework shouldn't be in 1.6. Jeff and I are testing removal
and should have it out of there soon.
In the meantime, the best workaround is to configure with "--enable-mca-no-build rmcast".
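(That is, re-run configure with that option added, e.g. "./configure --enable-mca-no-build=rmcast <your other options>"; the "=" form is an assumption here, so check "./configure --help" if it is rejected, then rebuild and reinstall.)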
On Jul 30, 2012, at 7:15 AM, TERRY DONTJE wrote:
> Do you know what r# of 1.6 you were trying to compile? Is thi
Ralph actually suggests that we just remove rmcast from 1.6.1.
On Jul 30, 2012, at 10:15 AM, TERRY DONTJE wrote:
> Do you know what r# of 1.6 you were trying to compile? Is this via the
> tarball or svn?
>
> thanks,
>
> --td
>
> On 7/30/2012 9:41 AM, Daniel Junglas wrote:
>> Hi,
>>
>> I co
Do you know what r# of 1.6 you were trying to compile? Is this via the
tarball or svn?
thanks,
--td
On 7/30/2012 9:41 AM, Daniel Junglas wrote:
Hi,
I compiled OpenMPI 1.6 on a 64bit Solaris ultrasparc machine.
Compilation and installation worked without a problem. However,
when trying to ru
Hi,
I compiled OpenMPI 1.6 on a 64bit Solaris ultrasparc machine.
Compilation and installation worked without a problem. However,
when trying to run an application with mpirun I always faced
this error:
[hostname:14798] [[50433,0],0] rmcast:init: setsockopt() failed on
MULTICAST_IF
for m
Please show me how you are compiling the program under gcc and mpicc.
Plus do a "mpicc --showme".
--td
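(For reference: with Open MPI, "mpicc --showme" prints the full underlying compiler command line, including include paths and link libraries, which makes it easy to compare against the plain gcc build.)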
On 7/30/2012 8:33 AM, Paweł Jaromin wrote:
This situation is also strange for me; I spent 2 days trying to find the bug :(.
Unfortunately I am not a professional C/C++ programmer, but I have
to ma
This situation is also strange for me; I spent 2 days trying to find the bug :(.
Unfortunately I am not a professional C/C++ programmer, but I have
to write this program. Please have a look at the picture at the link below;
maybe it will make things clearer.
http://vipjg.nazwa.pl/sndfile_error.png
2012/7/
On 7/30/2012 6:11 AM, Paweł Jaromin wrote:
Hello
Thanks for the fast answer, but the problem looks a little different.
Of course, I use this code only on the master node (rank 0), because only
this node has access to the file.
As you can see, I use an "if" clause to check sndFile for NULL:
if (sndFile ==
Hi,
I'm new to MPI programming. I have a task with around 3 nodes running roughly
50 processes in total, and I need to write a parallel quicksort for
2, 4, 8, 16, 32, 42, and 50 processes. I need some ideas and materials to
develop this code, so kindly help me with your valuable suggestions.
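Not a complete answer, but as a starting point here is a minimal sketch of the
usual baseline: MPI_Scatter the array, qsort each local chunk, MPI_Gather the
pieces back, and let rank 0 finish the job. It is not a true parallel quicksort
(there is no pivot exchange between ranks), and the file name psort.c and the
toy problem size are made up for illustration:

/* psort.c - baseline "sort local chunks, then combine" sketch; not a true
 * parallel quicksort, illustration only. Assumes N is divisible by the
 * number of ranks. Build with: mpicc psort.c -o psort */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

int main(int argc, char **argv)
{
    const int N = 48;                  /* toy total element count */
    int rank, size;
    int *data = NULL;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int local_n = N / size;            /* assumes size divides N evenly */
    int *chunk = malloc(local_n * sizeof(int));

    if (rank == 0) {                   /* root creates the unsorted data */
        data = malloc(N * sizeof(int));
        for (int i = 0; i < N; i++)
            data[i] = rand() % 1000;
    }

    /* give every rank an equal slice */
    MPI_Scatter(data, local_n, MPI_INT, chunk, local_n, MPI_INT,
                0, MPI_COMM_WORLD);

    /* every rank sorts its own slice */
    qsort(chunk, local_n, sizeof(int), cmp_int);

    /* collect the sorted slices; root does one final sort
     * (a k-way merge would be the smarter last step) */
    MPI_Gather(chunk, local_n, MPI_INT, data, local_n, MPI_INT,
               0, MPI_COMM_WORLD);

    if (rank == 0) {
        qsort(data, N, sizeof(int), cmp_int);
        printf("first=%d last=%d\n", data[0], data[N - 1]);
        free(data);
    }

    free(chunk);
    MPI_Finalize();
    return 0;
}

From there the usual next steps are a proper merge of the gathered runs and,
if you need real parallel quicksort, pivot exchange between the ranks.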
Hello
Thanks for the fast answer, but the problem looks a little different.
Of course, I use this code only on the master node (rank 0), because only
this node has access to the file.
As you can see, I use an "if" clause to check sndFile for NULL:
if (sndFile == NULL)
and it returns a non-NULL value, so the
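For what it's worth, here is a minimal sketch of the pattern being discussed:
rank 0 opens the file with sf_open and, if NULL comes back, prints libsndfile's
own explanation via sf_strerror(NULL) before aborting. The file name and
overall structure are placeholders, not the original program:

/* open_on_master.c - rank 0 opens a sound file and reports sf_open failures.
 * Build with something like: mpicc open_on_master.c -o open_on_master -lsndfile */
#include <mpi.h>
#include <sndfile.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {                          /* only the master has the file */
        SF_INFO info = { 0 };                 /* zero it before sf_open */
        SNDFILE *sndFile = sf_open("test.wav", SFM_READ, &info); /* placeholder path */

        if (sndFile == NULL) {
            /* sf_strerror(NULL) reports why the last open failed */
            fprintf(stderr, "sf_open failed: %s\n", sf_strerror(NULL));
            MPI_Abort(MPI_COMM_WORLD, 1);
        }

        printf("frames=%lld channels=%d rate=%d\n",
               (long long)info.frames, info.channels, info.samplerate);
        sf_close(sndFile);
    }

    MPI_Finalize();
    return 0;
}

If sf_open only returns NULL in the mpicc-built binary, printing sf_strerror(NULL)
there and comparing the "mpicc --showme" command line against the plain gcc one
is usually the quickest way to see what differs.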
I am not sure I am understanding the problem correctly, so let me
describe it back to you with a couple of clarifications.
So your program using sf_open compiles successfully with both gcc and
mpicc. However, when you run the executable compiled with mpicc,
sndFile is NULL?
If the above is right