Some related bits:
http://search-hadoop.com/m/LgpTk2DLMvc

On Sun, Mar 8, 2015 at 2:29 PM, Nasir Khan <nasirkhan.onl...@gmail.com>
wrote:

> Hi, I am going to submit a proposal to my university to set up a standalone
> Spark cluster. What hardware should I include in my proposal?
>
> I will be working on classification (Spark MLlib) of data streams (Spark
> Streaming).
>
> If somebody can fill in these answers, that would be great! Thanks
>
> *Cores* = (example 64 nodes, 1024 cores, your figures) ____________?
>
> *Performance* = (example ~5.12 TFLOPS, ~2 TFLOPS, your figures) ___________?
>
> *GPU*= YES/NO ___________?
>
> *Fat Node* = YES/NO ___________?
>
> *CPU Hrs/Yr* = (example 2000, 8000, your figures) ___________?
>
> *RAM/CPU* = (example 256GB, your figures) ___________?
>
> *Storage Processing* = (example 200TB, your figures) ___________?
>
> *Storage Output* = (example 5TB, 4TB HDD/SSD, your figures) ___________?
>
> *Most processors today carry out 4 FLOPs per cycle; thus a single-core 2.5
> GHz processor has a theoretical peak of 10 billion FLOPS = 10 GFLOPS.
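
That back-of-the-envelope arithmetic can be sketched as below. This is a rough
estimate only: the 4-FLOPs-per-cycle figure is the assumption from the note
above, and real per-core throughput depends on SIMD width and FMA support.

```python
# Theoretical peak FLOPS = nodes * cores/node * clock (GHz) * FLOPs/cycle.
# flops_per_cycle=4 follows the note above; actual values vary by CPU.
def peak_gflops(nodes, cores_per_node, ghz, flops_per_cycle=4):
    return nodes * cores_per_node * ghz * flops_per_cycle

# Single 2.5 GHz core, 4 FLOPs/cycle: 2.5e9 cycles/s * 4 = 10 GFLOPS
print(peak_gflops(1, 1, 2.5))                      # 10.0

# 64 nodes x 16 cores at 2.5 GHz, 4 FLOPs/cycle: 10240 GFLOPS = ~10.24 TFLOPS
print(peak_gflops(64, 16, 2.5))                    # 10240.0

# The same 1024 cores at only 2 FLOPs/cycle give the ~5.12 TFLOPS
# figure used as an example earlier in this message.
print(peak_gflops(64, 16, 2.5, flops_per_cycle=2)) # 5120.0
```

The point of the sketch: peak FLOPS scales linearly in each factor, so the
per-cycle assumption changes the headline number as much as doubling the core
count does.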
>
> Note: I need a *general purpose* cluster, neither very high-end nor very
> low-spec. It will not be dedicated to just one project, I guess. You people
> already have experience in setting up clusters; that's the reason I posted
> it here :)
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/General-Purpose-Spark-Cluster-Hardware-Requirements-tp21963.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>