>>>>> On Tue, 28 Nov 2006 13:52:30 -0500 (EST), 
>>>>>      Alexander Shirokov <[EMAIL PROTECTED]> said:

> I would like to embed guile interpreter into my application - a
> parallel program using MPI (message passing interface) and operating
> massive data and computations. I would like that program to be able to
> process standard input in order to be able to have a live interactive
> session with my application. Below I describe the problem I
> encountered.

...

> With guile however, I am limited to using

>     scm_shell(argc, argv);

> which is supposed to do the stdin processing itself; I hoped it would
> do so even in a parallel environment. I inserted

>     MPI_Init(&argc,&argv);
>     MPI_Finalize();

> into the tortoise.c program of the guile tutorial (the complete copy of
> the program is attached) and compiled it with 'mpicc', but I do not get
> the expected behavior; for example, when I run on 4 processes:

> mpirun -np 4 ./tortoise2
> guile> (tortoise-move 100)

> the next guile prompt does not appear after the entered command has
> completed.

> I searched the Guile archives for "MPI" and found
> that another person had the same problem a year ago.
> That user received a very informative reply:
> http://lists.gnu.org/archive/html/guile-user/2005-02/msg00018.html
> but unfortunately, the thread stops there.
> I did some follow-up and found good documentation on setting custom
> ports for stdin at
> http://www.gnu.org/software/guile/docs/docs-1.8/guile-ref/Port-Types.html#Port-Types
> but my expertise in Scheme and custom ports was exhausted there.

> There are many people using MPI; I think a solution would be greatly
> appreciated by a sizable community of MPI users.

One issue in wrapping MPI for Guile is calling `MPI_Init()' before
entering Guile. The code you sent already does this. With that code
you can use MPI non-interactively (I guess). For instance, try
writing a small script and then running it under MPI:

    $ mpirun -np xx tortoise -s myscript.scm

That should work. (I use `scm_boot_guile' instead of `gh_enter'; I
think the `gh_...' stuff is deprecated, but I don't know whether that
matters for this discussion.) Note that with a small effort you
have something that is not completely useless: you can use it

* interactively in sequential mode, and 
* in parallel (but not interactively)
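To make the ordering concrete, here is a minimal sketch of initializing MPI before handing control to Guile, in the `scm_boot_guile'/`scm_shell' style. This is my own illustration, not code from the tortoise tutorial; it assumes Guile 1.8's `libguile.h' and an MPI implementation, and is compiled with `mpicc'.

```c
/* Sketch: initialize MPI before entering Guile.
 * Assumes libguile (Guile 1.8) and an MPI implementation;
 * compile with mpicc and link against libguile. */
#include <libguile.h>
#include <mpi.h>
#include <stdlib.h>

static void
finalize_mpi (void)
{
  MPI_Finalize ();
}

static void
inner_main (void *data, int argc, char **argv)
{
  /* Register any Guile primitives here, then start the REPL. */
  scm_shell (argc, argv);
}

int
main (int argc, char **argv)
{
  MPI_Init (&argc, &argv);
  /* scm_shell () exits the process, so scm_boot_guile never
   * returns; register MPI_Finalize with atexit () instead of
   * calling it after the boot call. */
  atexit (finalize_mpi);
  scm_boot_guile (argc, argv, inner_main, NULL);
  return 0; /* not reached */
}
```

The `atexit' registration matters because `scm_shell' terminates the process itself; a plain `MPI_Finalize()' placed after `scm_boot_guile' would never run.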
 
I have made some experiments along these lines, wrapping the simplest
MPI functions (mpi-send, mpi-recv, mpi-bcast, ...) and some basic
stuff from PETSc.
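For what such a wrapper looks like, here is a sketch of exposing one MPI call to Scheme with `scm_c_define_gsubr'. The primitive name `mpi-comm-rank' and the zero-argument signature are my own choices for illustration, not an established binding.

```c
/* Sketch: wrapping an MPI call as a Guile primitive.
 * Assumes libguile (Guile 1.8) and MPI_Init () has already run. */
#include <libguile.h>
#include <mpi.h>

/* Scheme-visible (mpi-comm-rank): return this process's rank
 * in MPI_COMM_WORLD as a Scheme integer. */
static SCM
guile_mpi_comm_rank (void)
{
  int rank;
  MPI_Comm_rank (MPI_COMM_WORLD, &rank);
  return scm_from_int (rank);
}

/* Call this from inner_main, before scm_shell (). */
static void
register_mpi_primitives (void)
{
  scm_c_define_gsubr ("mpi-comm-rank", 0, 0, 0, guile_mpi_comm_rank);
}
```

After registration, a script run under `mpirun' can branch on `(mpi-comm-rank)' just as C code branches on the rank; `mpi-send' and `mpi-recv' follow the same pattern with marshalling of the Scheme data added.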

Now, if you want to use it both in parallel and interactively, then I
think the solution is to replace the REPL evaluator, so that each
time it finds a sexp (or whatever is to be evaluated), it sends the
expression to the nodes with `MPI_Bcast'. I know of something being
done along these lines with Python, and I think it's much the same idea:

http://www.cimec.org.ar/python/
http://sourceforge.net/projects/mpi4py/

I think that this broadcasting of the input from the master to the
nodes is something you cannot avoid whenever you want to wrap MPI
for any scripting language.
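The broadcast idea above can be sketched as a bare-bones loop: rank 0 reads a line, broadcasts it, and every rank evaluates the same form. This is only an illustration of the shape of the thing, under my own assumptions (one expression per line, a fixed buffer, no error handling or real REPL features); the prompt name `pguile>' is invented.

```c
/* Sketch: a minimal broadcast-based parallel read-eval loop.
 * Assumes libguile (Guile 1.8), MPI_Init () already called, and
 * that each input line is one complete Scheme expression. */
#include <libguile.h>
#include <mpi.h>
#include <stdio.h>
#include <string.h>

static void
parallel_repl (void)
{
  int rank;
  char buf[1024];

  MPI_Comm_rank (MPI_COMM_WORLD, &rank);
  for (;;)
    {
      int len = 0;
      if (rank == 0)
        {
          printf ("pguile> ");
          fflush (stdout);
          if (fgets (buf, sizeof buf, stdin) == NULL)
            len = -1;                 /* EOF: tell all ranks to stop */
          else
            len = (int) strlen (buf);
        }
      /* First broadcast the length so every rank knows how much
       * data follows (or that the session is over). */
      MPI_Bcast (&len, 1, MPI_INT, 0, MPI_COMM_WORLD);
      if (len < 0)
        break;
      MPI_Bcast (buf, len + 1, MPI_CHAR, 0, MPI_COMM_WORLD);
      /* Every rank evaluates the same expression. */
      scm_c_eval_string (buf);
    }
}
```

A real version would collect multi-line expressions on rank 0 before broadcasting, and decide how to merge or suppress the output of the non-zero ranks, but the master-reads/everyone-evaluates structure stays the same.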

Mario


_______________________________________________
Guile-user mailing list
Guile-user@gnu.org
http://lists.gnu.org/mailman/listinfo/guile-user
