Aurélien,
I did not know which parallelism issue you wanted to address. Based on previous
comments in the thread, I assumed it was not a simple edit situation. The FFT
and other fractal analysis you are considering are very GPU/CPU intensive. Glad
it is you and not me :) I have actually done parallel processing across servers
using Java. It definitely helped performance, but the software headaches were
very extensive. We needed very specific use cases to justify the development
costs, as well as the data-transfer costs of moving the processing. Let me know
if you want the high-level design of how it worked. We had two methods: one was
ask based, the other offer based.
Tim
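Tim's "ask based" method reads like the familiar pull (work-stealing) pattern, as opposed to "offer based" push dispatch from a coordinator. A minimal ask-based sketch in Python, with hypothetical names, since his actual design isn't shown here:

```python
import queue
import threading

# "Ask based": idle workers pull tasks from a shared queue when they are ready.
# "Offer based" would instead have a coordinator push tasks to chosen workers.
# Illustrative sketch only, not Tim's actual cross-server design.

tasks = queue.Queue()
results = queue.Queue()

def worker():
    while True:
        item = tasks.get()          # ask for work when idle
        if item is None:            # sentinel: no more work
            tasks.task_done()
            return
        results.put(item * item)    # stand-in for a heavy filter
        tasks.task_done()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for n in range(10):
    tasks.put(n)
for _ in threads:                   # one sentinel per worker
    tasks.put(None)
for t in threads:
    t.join()

out = sorted(results.queue)
print(out)
```

The pull model needs no knowledge of which worker is free; the queue itself balances the load, which is why it tends to be the simpler of the two to get right.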
------ Original message ------
From: Aurélien PIERRE
Date: Thu, Nov 2, 2017 8:23 PM
To:
Cc: darktable
Subject: Re: [darktable-dev] Darktable + Cloud-computing
Tim,
I think you missed the point. Doing parallel computing in your
garage is possible, but it won't help if your I/O is slow, which
is the case if you plug desktop PCs into a farm. You just have to
open darktable -d perf -d opencl to see that most
of the time spent during an image export is lost in GPU/CPU data
transfers, copying the RAM to the CPU/GPU registers and back.
What you need to do is parallelize on "close" cores, to
reduce the memory copies and work in registers as much as
possible.
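As a rough illustration of that transfer cost, here is a back-of-the-envelope model. All numbers (PCIe bandwidth, kernel time, module count) are assumptions for the sake of the example, not darktable measurements:

```python
# Back-of-the-envelope model of why per-module GPU round trips hurt.
# All figures below are illustrative assumptions, not measured values.

MPX = 50e6                  # ~50 Mpx image
BYTES_PER_PX = 4 * 4        # 4 float32 channels
PCIE_GBPS = 8e9             # ~8 GB/s effective PCIe 3.0 x16 bandwidth
KERNEL_S = 0.010            # assumed 10 ms of actual GPU work per module
N_MODULES = 20              # modules in the pixel pipeline

transfer_s = MPX * BYTES_PER_PX / PCIE_GBPS   # one-way copy time

# Naive pipeline: copy the image in and out around every module.
naive_s = N_MODULES * (2 * transfer_s + KERNEL_S)

# Fused pipeline: one upload, all kernels on-device, one download.
fused_s = 2 * transfer_s + N_MODULES * KERNEL_S

print(f"one-way transfer: {transfer_s * 1e3:.0f} ms")
print(f"naive: {naive_s:.2f} s   fused: {fused_s:.2f} s")
```

With these assumed figures, copying around every module costs about ten times more than keeping the data on the device, even though the compute work is identical — which is exactly the "work on close cores" argument.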
The context of my question is that I'm currently working on
implementing blind deconvolution for DT, and this process is not
a filter but an equation solver that minimizes both the blur and
the noise to find the minimal-energy image. This is something
your CUDA GPU is not ready for: it needs several hours for large
blurs on large images. Similarly, I'm investigating the
possibility of refocusing images. We are talking about FFTs,
convolution products, gradient computations, and so on, in
iterative processes, and I'm still not sure whether these
functionalities fit into a general-use software like DT. And if
they do, you may not like the computing time.
Aurélien PIERRE
aurelienpierre.com
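The kind of solver Aurélien describes can be sketched in miniature. Below is a minimal non-blind variant with a known kernel, using plain NumPy FFTs and Landweber iterations (gradient descent on the data-fit term); the blind case additionally alternates between estimating the kernel and the image, which is what makes it so expensive. Everything here is illustrative, not darktable code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth image and a small blur kernel (circular convolution
# for simplicity, so everything lives in the Fourier domain).
x_true = rng.random((32, 32))
psf = np.zeros((32, 32))
psf[:3, :3] = 1.0 / 9.0          # normalized 3x3 box blur

K = np.fft.fft2(psf)             # kernel in the Fourier domain
Y = K * np.fft.fft2(x_true)      # blurred observation (noise-free here)
y = np.real(np.fft.ifft2(Y))

# Landweber iterations: gradient descent on ||k * x - y||^2,
#   x <- x - step * k^T (k * x - y),
# computed entirely in the Fourier domain.
X = np.fft.fft2(y)               # start from the blurry image
step = 1.0                       # stable, since |K| <= sum(psf) = 1
residuals = []
for _ in range(200):
    R = K * X - Y
    residuals.append(np.linalg.norm(R))
    X = X - step * np.conj(K) * R

x_hat = np.real(np.fft.ifft2(X))
print("first residual:", residuals[0], "last:", residuals[-1])
```

Each iteration is a couple of 52-Mpx FFTs and element-wise products on a real image; hundreds of iterations, times two unknowns in the blind case, is where the hours go.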
On 2017-11-02 at 20:05, steve wrote:
Worth a look at this first: https://www.youtube.com/watch?v=xuuiUhMr-lQ
On 11/02/2017 08:01 PM, [email protected]
wrote:
Instead, think of how many people have multiple
computers at home. I have a desktop and a laptop, my wife has a
laptop, the kids...
Most homes have more than one computer. For
some of the more complex stuff it would be interesting to
potentially offload to a local spare machine.
I know audio systems do this for virtual
instruments (my brother is a hobby audiophile).
Tim
------ Original message------
From: Steven Adler
Date: Thu, Nov 2, 2017 7:08 PM
To: Aurélien PIERRE;
Cc: darktable;
Subject: Re: [darktable-dev] Darktable
+ Cloud-computing
My PC is more than capable of processing
photos, and from what I've seen of v2.3, the new GPU support
in Darktable makes processing much, much faster. Cloud-based
GPU support would be great for advanced AI algorithms for
things like automatic background replacement. But GPUs are
still advancing rapidly, and I doubt we will need cloud GPU
augmentation for some years.
On Nov 2, 2017 6:17 PM, "Aurélien
PIERRE" <[email protected]>
wrote:
Hi,
as top-notch image processing algorithms become
more and more demanding in computing power, though
often highly parallelizable, and picture
resolutions double almost every 5 years (now 50 Mpx
for the Canon 5DS R, 45 Mpx for the Nikon D850),
most computers are barely enough just to open the
pictures, let alone apply complex filters to them…
Serious amateurs and pros may want to buy expensive
workstations, but… cloud-computing solutions like Amazon EC2
give you remote access to Linux instances with
Nvidia GPUs for 0.76 US $/hour (g2.2xlarge
instances, 8 vCPUs). The instances scale
automatically, and several Linux distros are
provided (Ubuntu, Debian, Red Hat, CentOS). At that
price, the cost of your killer PC ($5,000) buys
more than 6,500 hours of use. That's almost 4 years of
working time (assuming 48 weeks/year, 35 h/week,
because I'm French).
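The break-even arithmetic can be checked directly, using the email's own assumed figures:

```python
# Break-even between a workstation and hourly cloud GPU rental,
# using the figures quoted in the email (assumptions, not current prices).
workstation_usd = 5000.0
ec2_usd_per_hour = 0.76             # quoted g2.2xlarge rate
hours = workstation_usd / ec2_usd_per_hour
hours_per_year = 48 * 35            # 48 weeks/year, 35 h/week
years = hours / hours_per_year
print(f"{hours:.0f} hours, about {years:.1f} years of working time")
```

With these figures the break-even comes out just under 4 years of full working time at the GPU.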
So… what do you think of having the heavy filters
in Darktable processed on the servers of Amazon,
or anyone else, instead of having to break the bank
for a new (almost disposable) computer? Possible, or
science fiction? How many of you don't have a 1 MB/s
or faster internet connection? How difficult would
it be to code?
Full disclosure: I have no previous experience in
cloud computing, and no Amazon shares.
--
Aurélien PIERRE
aurelienpierre.com
___________________________________________________________________________
darktable developer mailing list
to unsubscribe send a mail to [email protected]