Seems like the executor memory is not enough for your job and it is writing
objects to disk.
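For reference, a minimal sketch of the Spark 1.5.x settings that usually govern this; the values, app name, and launch mode below are assumptions for illustration, not something this thread confirms:

    // Sketch only: illustrative values. spark.executor.memory sets the executor heap;
    // spark.storage.memoryFraction (Spark 1.5.x legacy memory management) sets how much
    // of that heap the MemoryStore may use for cached and broadcast blocks before
    // blocks are dropped to disk or evicted.
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("kafka-to-cassandra")            // hypothetical app name
      .set("spark.executor.memory", "8g")          // assumed value; default is 1g
      .set("spark.storage.memoryFraction", "0.6")  // 1.5.x default; tune if blocks are evicted
    // Equivalent at launch time: spark-submit --executor-memory 8g ...
    val sc = new SparkContext(conf)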
On Jun 17, 2016 2:25 AM, "Cassa L" wrote:
>
> On Thu, Jun 16, 2016 at 5:27 AM, Deepak Goel wrote:
>
>> What is the hardware configuration you are running Spark on?
>
> It is 24core, 128GB RAM
Hi,
>
> What do you see under Executors and Details for Stage (for the
> affected stages)? Anything weird memory-related?
>
Under the Executors tab, the logs show these warnings:
16/06/16 20:45:40 INFO TorrentBroadcast: Reading broadcast variable 422145 took 1 ms
16/06/16 20:45:40 WARN MemoryStore: Faile
What is the hardware configuration you are running Spark on?
Hey
Namaskara~Nalama~Guten Tag~Bonjour
--
Keigu
Deepak
73500 12833
www.simtree.net, dee...@simtree.net
deic...@gmail.com
LinkedIn: www.linkedin.com/in/deicool
Skype: thumsupdeicool
Google talk: deicool
Blog: http://lo
Hi,
What do you see under Executors and Details for Stage (for the
affected stages)? Anything weird memory-related?
What does your "I am reading data from Kafka into Spark and writing it
into Cassandra after processing it" pipeline look like?
Regards,
Jacek Laskowski
https://medium.com/@
Hi,
Have you checked the storage memory statistics, or anything along those lines?
// maropu
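If it helps, one way to look at storage memory from code rather than the web UI is SparkContext.getExecutorMemoryStatus, which reports per executor the maximum and remaining memory available for caching. A small sketch, assuming an existing SparkContext named sc:

    // For each executor: total memory available to the MemoryStore for caching,
    // and how much of it is still free. Both values are reported in bytes.
    sc.getExecutorMemoryStatus.foreach { case (executor, (maxMem, remainingMem)) =>
      println(f"$executor%-25s max=${maxMem / 1024 / 1024}%5d MB free=${remainingMem / 1024 / 1024}%5d MB")
    }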
On Thu, Jun 16, 2016 at 1:37 PM, Cassa L wrote:
> Hi,
> I did set --driver-memory 4G. I still run into this issue after 1 hour
> of data load.
>
> I also tried version 1.6 in a test environment. I hit this issue much faster than in the 1.5.1 setup.
Hi,
I did set --driver-memory 4G. I still run into this issue after 1 hour of
data load.
I also tried version 1.6 in a test environment. I hit this issue much faster
than in the 1.5.1 setup.
LCassa
On Tue, Jun 14, 2016 at 3:57 PM, Gaurav Bhatnagar wrote:
> try setting the option --driver-memory 4G
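One quick check here, sketched under the assumption that the job runs in client mode: confirm on the driver whether the 4G actually took effect, since spark.driver.memory only applies when set before the driver JVM starts (via spark-submit or spark-defaults.conf, not via SparkConf inside an already-running driver). `sc` below is the existing SparkContext:

    // Heap the driver JVM actually got; with --driver-memory 4G this should be
    // close to 4096 MB (minus survivor-space overhead).
    println(s"driver max heap     = ${Runtime.getRuntime.maxMemory / 1024 / 1024} MB")
    println(s"spark.driver.memory = ${sc.getConf.get("spark.driver.memory", "<not set>")}")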
Hi,
I would appreciate any clue on this. It has become a bottleneck for our
Spark job.
On Mon, Jun 13, 2016 at 2:56 PM, Cassa L wrote:
> Hi,
>
> I'm using Spark version 1.5.1. I am reading data from Kafka into Spark and
> writing it into Cassandra after processing it. The Spark job starts fine and
Hi,
I'm using Spark version 1.5.1. I am reading data from Kafka into Spark
and writing it into Cassandra after processing it. The Spark job starts
fine and runs well for some time until I start getting the errors below.
Once these errors appear, the job starts to lag behind and I see that it
has scheduling delays.
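For context, a minimal sketch of what a Kafka-to-Spark-Streaming-to-Cassandra job of this shape typically looks like on Spark 1.5, using the direct Kafka stream and the DataStax spark-cassandra-connector. The topic, keyspace, table, columns, batch interval, and the "processing" step are all made up for illustration; the thread does not show the actual code:

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils
    import com.datastax.spark.connector._
    import com.datastax.spark.connector.streaming._

    object KafkaToCassandra {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("kafka-to-cassandra")                       // hypothetical app name
          .set("spark.cassandra.connection.host", "127.0.0.1")    // assumed Cassandra host
        val ssc = new StreamingContext(conf, Seconds(10))         // assumed batch interval

        val kafkaParams = Map("metadata.broker.list" -> "localhost:9092")  // assumed broker
        val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
          ssc, kafkaParams, Set("events"))                        // hypothetical topic

        stream
          .map { case (_, value) => value }
          .map(line => (line.hashCode, line))                     // placeholder "processing" step
          .saveToCassandra("my_keyspace", "events", SomeColumns("id", "payload"))

        ssc.start()
        ssc.awaitTermination()
      }
    }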