understanding right?
Manohar
From: Steve Loughran [mailto:ste...@hortonworks.com]
Sent: Thursday, January 5, 2017 11:05 PM
To: Manohar Reddy
Cc: user@spark.apache.org
Subject: Re: Spark Read from Google store and save in AWS s3
On 5 Jan 2017, at 09:58, Manohar753 [mailto:manohar.re...] wrote:
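Since the subject of this thread is reading from Google Cloud Storage and saving to S3, a minimal sketch of that copy in Spark follows (not from the original emails). It assumes the GCS connector and hadoop-aws JARs are on the classpath, that credentials for both stores are already configured, and that the bucket paths are placeholders.

import org.apache.spark.sql.SparkSession

object GcsToS3 {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("gcs-to-s3")
      .getOrCreate()

    // Read text files from a Google Cloud Storage bucket (placeholder path).
    val df = spark.read.text("gs://source-bucket/input/")

    // Write them back out to S3 through the s3a connector (placeholder path).
    df.write.text("s3a://target-bucket/output/")

    spark.stop()
  }
}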
Yeah, got it.
Thanks Akhil.
From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Tuesday, July 28, 2015 2:47 PM
To: Manohar Reddy
Cc: user@spark.apache.org
Subject: Re: java.lang.ArrayIndexOutOfBoundsException: 0 on Yarn Client
That happens when your batch duration is less than your processing time.
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
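To make the batch-duration point concrete, here is a minimal Spark Streaming sketch (not part of the original reply); the socket source, host/port, and the 10-second interval are assumptions. The interval passed to StreamingContext should stay at or above the time each batch actually takes to process, otherwise batches queue up.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object BatchDurationExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("batch-duration-example")
    // 10-second batch interval; keep this >= the observed per-batch processing time.
    val ssc = new StreamingContext(conf, Seconds(10))

    // Assumed source: lines arriving on a local socket.
    val lines = ssc.socketTextStream("localhost", 9999)
    lines.count().print()

    ssc.start()
    ssc.awaitTermination()
  }
}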
From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Tuesday, July 28, 2015 2:30 PM
To: Manohar Reddy
Cc: user@spark.apache.org
Subject: Re: java.lang.ArrayIndexOutOfBoundsException: 0 on Yarn Client
From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Tuesday, July 28, 2015 1:52 PM
To: Manohar Reddy
Cc: user@spark.apache.org
Subject: Re: java.lang.ArrayIndexOutOfBoundsException: 0 on Yarn Client
Put a try/catch inside your code and, inside the catch, print out the length or the list itself that causes the ArrayIndexOutOfBoundsException.
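A minimal sketch of that suggestion (not the original poster's code), assuming the job splits comma-separated lines read from a placeholder input path; the path and the field index are hypothetical.

import org.apache.spark.{SparkConf, SparkContext}

object DebugIndexError {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("debug-index-error"))
    val lines = sc.textFile("hdfs:///path/to/input")   // placeholder input

    val secondField = lines.map { line =>
      val fields = line.split(",")
      try {
        fields(1)   // throws ArrayIndexOutOfBoundsException when a record has fewer than two fields
      } catch {
        case e: ArrayIndexOutOfBoundsException =>
          // Print the length and the record itself so the bad line shows up in the executor logs.
          println(s"Bad record (length=${fields.length}): '$line'")
          ""
      }
    }

    secondField.take(10).foreach(println)
    sc.stop()
  }
}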