Aurélien PIERRE wrote:
> So… what do you think of having the heavy filters processed in
> darktable through the servers of Amazon or anybody else instead of
> having to break the bank for a new (almost) disposable computer?
> Possible or science fiction? How many of you don't have a 1 MB/s or
> faster internet connection? How difficult would it be to code?

Possible? Sure. Practical? Not so much. Interesting thought, though.
For starters, you'd have to send the image there and back. A
losslessly compressed, 14-bit NEF from my Nikon D750 runs 29 MB
(megabytes) and, to make the math easy, let's say you have a 29 Mb/s
(megabits per second) connection to the Internet. Sending that image
takes eight seconds each way (29 MB × 8 bits/byte ÷ 29 Mb/s), so the
round trip costs 16 seconds before any processing happens; even if your
slow computer takes, say, 15 seconds to run a filter locally, you're
already behind the curve on transfer time alone. And the quantity of
data darktable would actually need to move is much larger, since it
doesn't hold the image in a compressed format internally: the pixelpipe
works on 4-channel 32-bit float buffers, so a 24-megapixel image comes
to roughly 24 M × 4 channels × 4 bytes ≈ 384 MB.
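If you want to check the numbers, here's a quick back-of-the-envelope
sketch in C; the 24 MP and 4-channel-float figures are my assumptions
for illustration, not values measured from darktable's code:

    #include <stdio.h>

    int main(void)
    {
        /* figures from the example above */
        const double nef_mb    = 29.0;  /* compressed 14-bit NEF, in megabytes  */
        const double link_mbps = 29.0;  /* assumed connection speed, megabits/s */

        /* 1 byte = 8 bits, so seconds = megabytes * 8 / (megabits per second) */
        const double one_way_s = nef_mb * 8.0 / link_mbps;
        printf("one way: %.0f s, round trip: %.0f s\n", one_way_s, 2.0 * one_way_s);

        /* decoded working buffer: assuming 4 channels x 32-bit float per pixel */
        const double megapixels = 24.0;
        printf("decoded buffer: ~%.0f MB\n", megapixels * 4.0 * 4.0);

        return 0;
    }
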
On top of that, you don't just buy CPU from Amazon: they also charge you
for using their pipes to get data in and out (more for out, because they
want you to also pay them to store your data on their other services).
Faster compute is available, but it comes at a premium.
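
A similarly rough look at just the transfer bill, assuming an
illustrative egress price of $0.09/GB with free ingress; these are
placeholder figures for the sketch, not current AWS rates:

    #include <stdio.h>

    int main(void)
    {
        /* placeholder pricing for illustration only; check current AWS rates */
        const double egress_usd_per_gb = 0.09;   /* data leaving AWS          */
        const double result_gb         = 0.384;  /* uncompressed result image */

        printf("egress per image: ~$%.3f\n", result_gb * egress_usd_per_gb);
        printf("per 1000 images:  ~$%.0f\n", 1000.0 * result_gb * egress_usd_per_gb);
        return 0;
    }
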
Where Amazon shines for this sort of thing is in large parallel jobs
where you can spin up a bunch of machines that live long enough to do
the work and then shut them off. For darktable to be practical, you'd
have to keep an instance running and ready to do the work, or spin one
up each time you start darktable and tear it down when you exit.
You can get a lot of compute on your desk for not much money if you shop
carefully, and being able to offload work to your GPU(s) through
darktable's OpenCL support helps, too.
--Mar
___________________________________________________________________________
darktable developer mailing list
to unsubscribe send a mail to darktable-dev+unsubscr...@lists.darktable.org