[OMPI users] mpirun error on MAC OSX 10.6.8

2015-02-17 Thread Tarandeep Kalra
Hello friends,

I am using MPI for the first time on my Mac (OS X 10.6.8). The MPI that I
installed is Open MPI, installed through Homebrew. It does appear to be
installed, since I can autocomplete the mpi--- commands from my terminal.



When I run mpirun --version, it returns:
mpirun (Open MPI) 1.8.4

However, when I use the mpirun command to run a basic program that prints
hello on 2 cores, it does not show any error message and nothing is
displayed after the command runs. To simplify the problem I tried "mpirun
uname" to check whether even 1 core responds. This also does not respond.

Thank you for helping me on this.
Taran


Re: [OMPI users] mpirun error on MAC OSX 10.6.8

2015-02-17 Thread Ralph Castain
OSX 10.6.8?? Are you sure? That is incredibly old - I haven’t seen such a 
system in quite some time.


> On Feb 17, 2015, at 8:04 AM, Tarandeep Kalra  wrote:
> 
> Hello friends, 
> 
> I am using mpi for the first time on my MAC OSX (10.6.8). The MPI that I 
> installed is Open MPI. I have installed it through homebrew. It is installed 
> as I can autocomplete mpi--- commands from my terminal. 
> 
> When i run mpirun --version
> It returns to 
> mpirun (Open MPI) 1.8.4
> 
> However, when I am using the mpirun command to run a basic code like to print 
> hello with 2 cores. It does not show any error message and nothing is 
> displayed after the command is run. To simplify the problem I tried "mpirun 
> uname" to check if even 1 core responds. This also does not respond.
> 
> Thank you for helping me on this.
> Taran



Re: [OMPI users] mpirun error on MAC OSX 10.6.8

2015-02-17 Thread Tarandeep Kalra
It is a 2011 MacBook Pro. Depends on what you think is old.

Taran

On Tue, Feb 17, 2015 at 11:21 AM, Ralph Castain  wrote:

> OSX 10.6.8?? Are you sure? That is incredibly old - I haven't seen such a
> system in quite some time.
>
>
> On Feb 17, 2015, at 8:04 AM, Tarandeep Kalra  wrote:
>
> Hello friends,
>
> I am using mpi for the first time on my MAC OSX (10.6.8). The MPI that I
> installed is Open MPI. I have installed it through homebrew. It is
> installed as I can autocomplete mpi--- commands from my terminal.
>
>
>
> When i run mpirun --version, it returns:
> mpirun (Open MPI) 1.8.4
>
> However, when I am using the mpirun command to run a basic code like to
> print hello with 2 cores. It does not show any error message and nothing is
> displayed after the command is run. To simplify the problem I tried "mpirun
> uname" to check if even 1 core responds. This also does not respond.
>
> Thank you for helping me on this.
> Taran


Re: [OMPI users] mpirun error on MAC OSX 10.6.8

2015-02-17 Thread Jeff Squyres (jsquyres)
On Feb 17, 2015, at 11:36 AM, Tarandeep Kalra  wrote:
> 
> It is a 2011 Macbook pro. Depends on what you think is old.

:-)

I can't say we've tested Open MPI 1.8.x on OS X 10.6.x -- there may well be 
some kind of weirdness there.

Can you try Open MPI 1.6.5?  You'll likely need to download the source tarball 
from www.open-mpi.org and build/install it yourself (e.g., under your $HOME).

Open MPI 1.6.x is also old, but it matches the timeframe of OS X 10.6.x.

-- 
Jeff Squyres
jsquy...@cisco.com
For corporate legal information go to: 
http://www.cisco.com/web/about/doing_business/legal/cri/



Re: [OMPI users] mpirun error on MAC OSX 10.6.8

2015-02-17 Thread Ralph Castain
I believe 10.6 goes back well before 2011, but that's beside the point - I just 
haven't heard of someone running that old a version of OSX in a while. I doubt that 1.8.4 
supports it, or that homebrew built it and tested it on something that old (I 
know we don't).

Only thing I can suggest is perhaps getting your own tarball and trying to 
build it yourself to see if that works. Ensure you have --enable-debug on the 
configure line so you can turn on diagnostics if you hit a problem.


> On Feb 17, 2015, at 8:36 AM, Tarandeep Kalra  wrote:
> 
> It is a 2011 Macbook pro. Depends on what you think is old.
> 
> Taran
> 
> On Tue, Feb 17, 2015 at 11:21 AM, Ralph Castain wrote:
> OSX 10.6.8?? Are you sure? That is incredibly old - I haven’t seen such a 
> system in quite some time.
> 
> 
>> On Feb 17, 2015, at 8:04 AM, Tarandeep Kalra wrote:
>> 
>> Hello friends, 
>> 
>> I am using mpi for the first time on my MAC OSX (10.6.8). The MPI that I 
>> installed is Open MPI. I have installed it through homebrew. It is installed 
>> as I can autocomplete mpi--- commands from my terminal. 
>> 
>> When i run mpirun --version
>> It returns to 
>> mpirun (Open MPI) 1.8.4
>> 
>> However, when I am using the mpirun command to run a basic code like to 
>> print hello with 2 cores. It does not show any error message and nothing is 
>> displayed after the command is run. To simplify the problem I tried "mpirun 
>> uname" to check if even 1 core responds. This also does not respond.
>> 
>> Thank you for helping me on this.
>> Taran



Re: [OMPI users] prob in running two mpi merged program (UNCLASSIFIED)

2015-02-17 Thread Burns, Andrew J CTR (US)
Classification: UNCLASSIFIED
Caveats: NONE

Sounds like there is an issue in transferring the value to the core that is 
printing out your timestep data.

I doubt that the problem is MPI_COMM_WORLD; that controls the communicator 
description and shouldn't affect the values transferred.

You will want to look at your MPI_Reduce and MPI_Send calls to make sure that 
everything is ending up on the correct cores. Keep in mind that with MPI each 
process has its own local copy of a value, so changes on one core don't affect 
any others.
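
For example, collecting a per-rank value onto the rank that prints is roughly
the following (a sketch, assuming a double value and that rank 0 does the
printing - adjust it to your own variable names):

#include <mpi.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    int rank;
    double local_value, total = 0.0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    local_value = rank + 1.0;   /* each rank has its own local copy */

    /* sum every rank's copy onto rank 0; other ranks' "total" is not
       filled in, so only rank 0 should print it */
    MPI_Reduce(&local_value, &total, 1, MPI_DOUBLE, MPI_SUM, 0,
               MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum across all ranks = %f\n", total);

    MPI_Finalize();
    return 0;
}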

-Andrew

-Original Message-
From: users [mailto:users-boun...@open-mpi.org] On Behalf Of Muhammad Ashfaqur 
Rahman
Sent: Saturday, February 14, 2015 6:10 AM
To: Open MPI Users
Subject: Re: [OMPI users] prob in running two mpi merged program (UNCLASSIFIED)

Dear Andrew Burns,

Many thanks for providing the steps to check my programs. The combined 
program is now running in parallel, but the values from one of the programs 
are appearing as NaN. The possible reason may be MPI_COMM_WORLD; I am still 
trying to sort it out. I have attached the related files and outputs for your 
kind suggestions:


Regards

Ashfaq


On Fri, Feb 6, 2015 at 6:35 PM, Burns, Andrew J CTR (US) 
 wrote:


Classification: UNCLASSIFIED
Caveats: NONE

The placing of clminitialize and clmstop feels a little awkward, but it 
doesn't look like they would break the program. If you were calling MPI_Init 
more than once it would throw an error, and if Finalize were called early in 
clmstop the only serial section would be the deallocation.



One other thought is to ensure that you are properly launching the 
program as multicore.

The command should be similar to the following (where NPROCS is the number 
of cores being used):

mpirun -n NPROCS ./program

If you were to launch the program with simply "./program" it would run 
as serial. It would also run as serial if you were to call 
"mpirun ./program", since no number of processes is specified.




If the program is properly launched in parallel and then converts to 
serial, you should be able to track down the location where this happens 
by inserting some core polling similar to the following pseudocode:

for (i = 0; i < numprocs; ++i)
{
  if (i == coreid)
  {
print("core ", id, " out of ", numprocs)
  }
  MPI_Barrier()
}
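
Fleshed out into real C, that polling loop would look roughly like this (a
sketch; it assumes MPI_COMM_WORLD and that MPI is already initialized when it
runs):

#include <mpi.h>
#include <stdio.h>

/* call this from inside the main loop to see which ranks are still alive */
void poll_cores(void)
{
    int coreid, numprocs, i;

    MPI_Comm_rank(MPI_COMM_WORLD, &coreid);
    MPI_Comm_size(MPI_COMM_WORLD, &numprocs);

    for (i = 0; i < numprocs; ++i)
    {
        if (i == coreid)
            printf("core %d out of %d\n", coreid, numprocs);
        MPI_Barrier(MPI_COMM_WORLD);   /* one rank prints per iteration */
    }
}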



You will want to check all of the calls inside the main loop to ensure 
that none of them call MPI_Finalize.

-Andrew Burns

-Original Message-
From: users [mailto:users-boun...@open-mpi.org] On Behalf Of Muhammad 
Ashfaqur Rahman
Sent: Friday, February 06, 2015 9:50 AM
To: Open MPI Users

Subject: Re: [OMPI users] prob in running two mpi merged program 
(UNCLASSIFIED)

Dear Andrew Burns,
Many thanks for your correct understanding and descriptive suggestion.
I have now changed the FLAGS of one program so that it does not take any MPI 
tags, i.e., switched off MPI, and kept the other set open for MPI options.
After that I call MPI_Init at the beginning of the main program 
(aadmn.F here) and call MPI_Finalize in the containing program 
(clmstop.F90 here) at the end.
After compilation it is still serial.
I have attached here the FILES containing the MPI calls and the Makefile for 
your kind consideration.


Regards
Ashfaq

On Thu, Feb 5, 2015 at 8:25 PM, Burns, Andrew J CTR (US) 
 wrote:


Classification: UNCLASSIFIED
Caveats: NONE

Okay, I think I may get what's going on. I think you're calling 
one MPI-capable program from within another MPI program. What you 
have to do is assume that the program that is being called 
already had MPI_Init called and that MPI_Finalize will be called 
after the program returns.

Example (pseudocode for brevity):

int main()
{
  MPI_Init();

  int x;

  int p2result = Program2(x, comm);

  MPI_Bcast(p2result, comm);

  MPI_Finalize();
}

int Program2(int x, MPI_Comm comm)
{
  int returnval;
  MPI_AllReduce(&returnval, x, comm);
  return returnval;
}



If the second program were to be:

int Program2(int x, MPI_Comm comm)
{
  MPI_Init();
  int returnval;
  MPI_AllReduce(&returnval, x, comm);
  return returnval;
  MPI_Finalize()
}

The program would return to serial when MPI_Finalize is first 
called, potentially