Check the configuration guide for a description of the unit format (
http://spark.apache.org/docs/latest/configuration.html#spark-properties).
In your case, 5GB would be specified as 5g.
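
Roughly something like this from application code (just a sketch, not tested;
the app name is a placeholder, and if I remember correctly
spark.memory.offHeap.enabled also has to be true for the size to take effect):

  import org.apache.spark.{SparkConf, SparkContext}

  // Enable off-heap memory and give it 5g.
  val conf = new SparkConf()
    .setAppName("offheap-example")  // placeholder name
    .set("spark.memory.offHeap.enabled", "true")
    .set("spark.memory.offHeap.size", "5g")

  val sc = new SparkContext(conf)

You can also pass it on the command line with
--conf spark.memory.offHeap.size=5g when submitting the job.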

On 6 January 2016 at 10:29, unk1102 <umesh.ka...@gmail.com> wrote:

> Hi, as part of the Spark 1.6 release, what is the ideal value or unit for
> spark.memory.offHeap.size? I have set it to 5000, and I assume that means
> 5GB. Is that correct? Please guide.
