Guys,
Do you have any thoughts on this ?
Thanks,
Robert

On Sunday, April 12, 2015 5:35 PM, Grandl Robert wrote:
Hi guys,

I was trying to find counters in Spark that report the amount of CPU
or memory used (in some metric) by a task/stage/job, but I could not find
any.

Is there any such counter available ?

Thank you,
Robert
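There is no ready-made per-task/stage/job CPU or memory counter in the 1.x line as far as I can tell; the closest thing is the per-task TaskMetrics, which a SparkListener can print or aggregate. Below is a minimal sketch against the Spark 1.2-era developer API (the listener class name is mine; these types are @DeveloperApi and may change between releases, and executorRunTime is wall-clock time rather than true CPU time):

import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Prints a few per-task resource metrics as each task finishes.
class TaskMetricsListener extends SparkListener {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    val m = taskEnd.taskMetrics
    if (m != null) { // metrics can be absent for failed tasks
      println(s"stage=${taskEnd.stageId} task=${taskEnd.taskInfo.taskId} " +
        s"runTimeMs=${m.executorRunTime} gcTimeMs=${m.jvmGCTime} " +
        s"spilledBytes=${m.memoryBytesSpilled}")
    }
  }
}

// Register on the driver before submitting any jobs:
// sc.addSparkListener(new TaskMetricsListener)

Rolling these numbers up per stage or per job is then a matter of grouping on taskEnd.stageId inside the listener.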
The relevant sections of the programming guide are:
http://spark.apache.org/docs/1.2.0/programming-guide.html#transformations
http://spark.apache.org/docs/1.2.0/programming-guide.html#actions
Cheers,
Sean
On Feb 13, 2015, at 9:50 AM, nitinkak001 <nitinkak...@gmail.com> wrote:
I am trying to implement counters in Spark, and I guess Accumulators are the
way to do it. My motive is to update a counter in a map function and
access/reset it in the driver code. However, the println statement at the
end still yields value 0 (it should be 9). Am I doing something wrong?
def main(args: Array[String]) {
  val conf = new SparkConf().setAppName("...rhoodMatching")
  val sc = new SparkContext(conf)
  var counter = sc.accumulable(0, "Counter")
  var inputFilePath = args(0)
  val inputRDD = sc.textFile(inputFilePath)

  inputRDD.map { x =>
    counter += 1
  }
  println(counter.value)
}
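For what it's worth, here is a minimal self-contained sketch of the fix the linked pages suggest (the object name, app name, and println wording are mine, not from the thread): map is a lazy transformation, so its closure never executes and the accumulator stays at 0; an action such as foreach forces Spark to actually run it.

import org.apache.spark.{SparkConf, SparkContext}

object AccumulatorExample {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("AccumulatorExample")
    val sc = new SparkContext(conf)
    val counter = sc.accumulable(0, "Counter")

    // foreach is an action, so the closure runs once per input line;
    // a bare map would be deferred and leave the counter at 0.
    sc.textFile(args(0)).foreach { _ => counter += 1 }

    println("Counter value: " + counter.value) // = number of lines read
  }
}

Equivalently, you can keep the map and follow it with any action (count, collect, saveAsTextFile, ...); the accumulator is updated as a side effect of the job the action triggers. Note also that Spark may re-execute tasks, so accumulator updates made inside transformations are not guaranteed to be applied exactly once.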