I totally get your point, Jeff, and thanks for pointing it out... this is an 
aspect I hadn't considered yet.

Power should not be an issue, since the devices are plugged in. Now I need to 
evaluate exactly how much power I can pull while a device is connected to the 
computer, compared to the power needed to run a process at 100% CPU load. 
Running on batteries is out of the question, I think; my calculations need to 
run for at least a couple of hours, so I doubt a small device on battery power 
could get through a full job.
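
Just to make the comparison concrete, this is the kind of back-of-the-envelope 
check I have in mind; the USB figure comes from the USB 2.0 spec (500 mA at 
5 V), while the device draw and run time are placeholders I still have to 
measure:

/* Rough power-budget check: can the tether cover a device at 100% CPU load?
 * Values marked PLACEHOLDER are guesses that still have to be measured. */
#include <stdio.h>

int main(void)
{
    const double usb2_supply_w = 5.0 * 0.5;  /* USB 2.0 port: 5 V at 500 mA */
    const double device_load_w = 3.0;        /* PLACEHOLDER: draw at full CPU load */
    const double run_hours     = 2.0;        /* jobs run for a couple of hours */

    double deficit_w = device_load_w - usb2_supply_w;
    printf("supply %.2f W, load %.2f W\n", usb2_supply_w, device_load_w);
    if (deficit_w > 0.0)
        printf("battery would have to cover %.2f Wh over %.1f h\n",
               deficit_w * run_hours, run_hours);
    else
        printf("the port alone covers the load\n");
    return 0;
}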

Is there any info about how I/O on the iPad and iPhone works? That would give 
me an idea of what I can push over that cable and for how long. As you pointed 
out, the main issue will be syncing the processes... wifi may be feasible but 
would be slower, I guess (without specs it is hard to even make assumptions).
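
Once I manage to build OMPI for the devices, I would rather measure the link 
than guess; a plain two-rank ping-pong like the sketch below (standard MPI 
calls only, host names are placeholders) should show what the cable or the 
wifi can actually sustain:

/* Minimal two-rank ping-pong to estimate what the link actually sustains.
 * Run with something like: mpirun -np 2 --host <mac>,<device> ./pingpong
 * (host names are placeholders; nothing here is iOS-specific). */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    const int iters  = 100;
    const int nbytes = 1 << 20;              /* 1 MiB per message */
    char *buf = malloc(nbytes);
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    double t0 = MPI_Wtime();
    for (int i = 0; i < iters; i++) {
        if (rank == 0) {
            MPI_Send(buf, nbytes, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
            MPI_Recv(buf, nbytes, MPI_CHAR, 1, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        } else if (rank == 1) {
            MPI_Recv(buf, nbytes, MPI_CHAR, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            MPI_Send(buf, nbytes, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
        }
    }
    double elapsed = MPI_Wtime() - t0;

    if (rank == 0)                           /* 2*iters messages crossed the link */
        printf("approx. link bandwidth: %.1f MB/s\n",
               (2.0 * iters * nbytes) / elapsed / 1.0e6);

    free(buf);
    MPI_Finalize();
    return 0;
}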

No worries, you are talking about things that have to be evaluated; I am just 
exploring an alternate use for old hardware, which may turn out not to be 
worthwhile at all in the end. So any comment helps :)

I will focus on calculating what you suggested. In theory the dual-core Apple 
processors should be powerful enough to give some sort of performance boost, 
but I am new to ARM and don't know much about their architecture and pipeline, 
so I may be totally wrong.
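
For the comparison you suggested, this is roughly what I plan to compute; the 
core counts, clocks, and FLOPs-per-cycle below are my own guesses and still 
need to be checked against the real spec sheets:

/* Back-of-the-envelope peak-FLOPS comparison, along the lines Jeff suggested.
 * Every figure below is an assumption to be checked against real spec sheets. */
#include <stdio.h>

static double peak_gflops(int cores, double ghz, double flops_per_cycle)
{
    return cores * ghz * flops_per_cycle;
}

int main(void)
{
    /* ASSUMED: dual-core Apple ARM part, ~1 GHz, 2 SP FLOPs/cycle via NEON */
    double device = peak_gflops(2, 1.0, 2.0);

    /* ASSUMED: 16-core Sandy Bridge box, ~2.6 GHz, 8 DP FLOPs/cycle via AVX */
    double server = peak_gflops(16, 2.6, 8.0);

    printf("one device: ~%.1f GFLOPS peak\n", device);
    printf("one server: ~%.1f GFLOPS peak\n", server);
    printf("devices needed to match one server: ~%.0f\n", server / device);
    return 0;
}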

-lou


On Nov 30, 2012, at 7:35 AM, Jeff Squyres wrote:

> Not to throw cold water on this, but I think the canonical problem cited with 
> doing distributed computations on mobile devices is the power requirement.  
> Meaning: if the devices are running on battery, you're really not going to 
> get much computation out of them.  
> 
> And if you have them plugged in, you have a potential IO issue (i.e., how to 
> get the input onto the device and the output out of the device).  You 
> probably only have 802.11g (maybe 802.11n?) wifi available, and you might 
> have to deal with a LOT of I/O.  Meaning: you might need to restrict this 
> work to applications that are compute-heavy but IO-light.  But then again, 
> you're dealing with small, "slow" processors, so compute-heavy problems on 
> such processors might not do so well.  Or, more precisely, you might get much 
> more compute efficiency with traditional "big" HPC servers.
> 
> Don't get me wrong; I'm not trying to say this is a bad idea.  I'm just 
> saying that it's worth doing some back-of-the-envelope calculations before 
> you spend a lot of effort on porting code to mobile platforms.
> 
> For example, here are some interesting data points that would be good to 
> calculate:
> 
> 1. How many (pick your favorite mobile device; say -- iPhone 5) would it take 
> to equal the power of one cheap Intel Sandy Bridge-based server with 16 
> cores?  Compare things like max sustained FLOPS and IOPS (integer ops, not IO 
> ops), RAM sizes, etc.
> 
> 2. What's the procurement cost differential between 1 Intel Sandy 
> Bridge-based server and N iPhone 5s?  What's the operational cost 
> differential?
> 
> 
> 
> On Nov 30, 2012, at 10:25 AM, Ralph Castain wrote:
> 
>> Just an FYI: xgrid is no longer being distributed or supported.
>> 
>> I'd start by first building OMPI against the iOS simulator in Xcode. You may 
>> run into some issues with the atomics that will need addressing, and there 
>> may be other issues with syntax and header file locations. Best to resolve 
>> those first.
>> 
>> Once you get that to build, you can test running several procs on a single 
>> iPad. If you have older iPads, I'm not sure that will work as they don't 
>> multi-task. But might be worth a try.
>> 
>> You'll then need to find a way to launch the processes across iPads. I don't 
>> know if ssh will work, so you may have to devise a new plm module. I can 
>> advise as you go.
>> 
>> FWIW: I have an iPad 1 and iOS development kit, so I can potentially help 
>> with problems.
>> 
>> 
>> On Nov 29, 2012, at 10:16 PM, shiny knight <theshinykni...@me.com> wrote:
>> 
>>> Thanks for all your replies.
>>> 
>>> Right now I have access to 3 iOS devices and 1 Android device, so if 
>>> possible I would lean toward pursuing the iOS route.
>>> 
>>> So it seems that there is not yet a simple way to do this on these devices 
>>> (thanks for the paper you posted, Dominik); I will have to look deeper into 
>>> the project you mentioned and wait for some official release (at least on 
>>> the Android side).
>>> 
>>> I may install a Linux distro in a virtual machine; I mostly work on OSX, so 
>>> it should not be that bad (OSX lets me work with both Android and iOS 
>>> hassle free; that's why I had the thought of using my devices for MPI).
>>> 
>>> Beatty: My idea is to use the devices only when they are plugged in; I was 
>>> reading a paper about how to use MPI and dynamically change the number of 
>>> attached nodes while crunching data for a process. So it would be possible 
>>> to add and remove nodes on the fly, and I was trying to apply that to a 
>>> portable device (http://www.cs.rpi.edu/~szymansk/papers/ppam05.pdf) before 
>>> realizing that there is no MPI implementation for them.
>>> 
>>> I would never envision a system where a user has a device in his pocket 
>>> that is actually doing "something" behind his back... my case is simply 
>>> that I have devices sitting on my desk, which I use to test my apps, and I 
>>> could put them to more productive use while they are tethered to my main 
>>> machine (which is the main server where my MPI development is done).
>>> 
>>> Would you mind elaborating on the approach you mentioned? I have never used 
>>> Xgrid, so I am not sure how your solution would work.
>>> 
>>> Thanks!
>>> 
>>> Lou
>>> 
>>> 
>>> On Nov 29, 2012, at 4:14 PM, Beatty, Daniel D CIV NAVAIR, 474300D wrote:
>>> 
>>>> Greetings ladies and gentlemen,
>>>> There is one alternative approach, and that is a pseudo-cloud-based MPI.  
>>>> The idea is that the MPI node list is adjusted via the cloud, similar to 
>>>> the way Bonjour used to do it for Xgrid.
>>>> 
>>>> In this case, it is applying an MPI notion to the OpenCL codelets.  There
>>>> are obvious issues with security, battery life, etc.  There is considerable
>>>> room for discussion as far as expectations go.  Do jobs run freely if the
>>>> device is plugged in?  If the device is in a pocket, can the user switch to
>>>> power conservation / cooler pockets?  What constitutes fairness?  Do owners
>>>> have a right to be biased in their judgement?  These are tough questions
>>>> that I think I will have to provide fair assurances for.  After all,
>>>> everyone likes to think they are in control of what they put in their
>>>> pocket.
>>>> 
>>>> V/R,
>>>> Dan
>>>> 
>>>> 
>>>> On 11/28/12 3:06 PM, "Dominik Goeddeke"
>>>> <dominik.goedd...@math.tu-dortmund.de> wrote:
>>>> 
>>>>> shameless plug: 
>>>>> http://www.mathematik.tu-dortmund.de/~goeddeke/pubs/pdf/Goeddeke_2012_EEV.pdf
>>>>> 
>>>>> In the MontBlanc project (www.montblanc-project.eu), a lot of folks from
>>>>> all around Europe are looking into exactly this. Together with a few
>>>>> colleagues, we have been honoured to get access to an early prototype
>>>>> system. The runs for the paper above (accepted in JCP as of last week)
>>>>> were carried out with MPICH2 back in June, but Open MPI also worked
>>>>> flawlessly except for some issues with SLURM integration at the time we
>>>>> did those tests.
>>>>> 
>>>>> The bottom line is: the prototype machine (128 Tegra 2s) ran standard
>>>>> Ubuntu, and since Android is essentially Linux, it should not be tooooo
>>>>> hard to get the system you envision up and running, Shiny Knight.
>>>>> 
>>>>> Cheers,
>>>>> 
>>>>> Dominik
>>>>> 
>>>>> 
>>>>> On 11/29/2012 12:00 AM, Vincent Diepeveen wrote:
>>>>>> You might want to post to the Beowulf mailing list (see cc), and you will
>>>>>> want to install Linux, of course.
>>>>>> 
>>>>>> OpenFabrics releases Open MPI, yet it only works on a limited number of
>>>>>> distributions - most important is having the correct kernel (usually an
>>>>>> old kernel).
>>>>>> 
>>>>>> I'm going to try to get it to work on Debian soon.
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> On Nov 28, 2012, at 11:50 PM, shiny knight wrote:
>>>>>> 
>>>>>>> I was looking for some info about MPI port on iOS or Android devices.
>>>>>>> 
>>>>>>> I have some old devices that may prove useful, if I can include them in
>>>>>>> my computation scheme.
>>>>>>> 
>>>>>>> OpenCL runs on iOS and Android, so I was wondering if there is any way
>>>>>>> to have an old iPhone/phone or iPad/tablet run MPI.
>>>>>>> 
>>>>>>> I tried looking everywhere, but I didn't find anything that says it is
>>>>>>> possible, nor have I found any practical example.
>>>>>>> 
>>>>>>> Thanks!
>>>>>> 
>>>>> 
>>> 
>> 
>> 
> 
> 
> -- 
> Jeff Squyres
> jsquy...@cisco.com
> For corporate legal information go to: 
> http://www.cisco.com/web/about/doing_business/legal/cri/
> 
> 
