Re: StandardScaler failing with OOM errors in PySpark

2015-05-17 Thread Xiangrui Meng
...buffer memory errors. There should be plenty of memory around -- 10 executors with 2 cores each and 8 GB per core. I'm giving the executors 9g of memory and have...
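
A minimal sketch of the executor setup described above, assuming a Spark 1.3-era YARN deployment; the values mirror the quoted message, and the app name is hypothetical:

# Executor setup described in the thread (assumption: Spark 1.3-era
# YARN deployment; the app name is hypothetical).
from pyspark import SparkConf, SparkContext

conf = (SparkConf()
        .setAppName("scaler-oom-repro")                      # hypothetical name
        .set("spark.executor.instances", "10")               # 10 executors
        .set("spark.executor.cores", "2")                    # 2 cores each
        .set("spark.executor.memory", "9g")                  # 9g heap per executor
        .set("spark.yarn.executor.memoryOverhead", "3072"))  # ~3g overhead, in MB
sc = SparkContext(conf=conf)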

Re: StandardScaler failing with OOM errors in PySpark

2015-04-28 Thread Rok Roskar
...per core. I'm giving the executors 9g of memory and have also tried lots of overhead (3g), thinking it might be the array creation in the aggregators that's causing issues...

Re: StandardScaler failing with OOM errors in PySpark

2015-04-27 Thread Xiangrui Meng
...executors 9g of memory and have also tried lots of overhead (3g), thinking it might be the array creation in the aggregators that's causing issues. The bizarre thing is that this isn't...
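
The "array creation in the aggregators" points at the per-partition summary arrays that StandardScaler.fit builds, one entry per feature, so very wide vectors inflate per-task memory. A standalone sketch of that same column-statistics aggregation, assuming dense vectors and the RDD-based pyspark.mllib API:

# Column-statistics aggregation like the one StandardScaler.fit performs
# internally (assumption: dense input, RDD-based pyspark.mllib API).
from pyspark import SparkContext
from pyspark.mllib.linalg import Vectors
from pyspark.mllib.stat import Statistics

sc = SparkContext("local", "colstats-sketch")
vectors = sc.parallelize([Vectors.dense([1.0, 10.0]),
                          Vectors.dense([2.0, 20.0]),
                          Vectors.dense([3.0, 30.0])])
summary = Statistics.colStats(vectors)  # per-column mean/variance arrays
print(summary.mean(), summary.variance())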

Re: StandardScaler failing with OOM errors in PySpark

2015-04-23 Thread Rok Roskar
...The bizarre thing is that this isn't always reproducible -- sometimes it actually works without problems. Should I be setting up executors differently?...

Re: StandardScaler failing with OOM errors in PySpark

2015-04-22 Thread Rok Roskar
...The bizarre thing is that this isn't always reproducible -- sometimes it actually works without problems. Should I be setting up executors differently? Thanks, Rok...

Re: StandardScaler failing with OOM errors in PySpark

2015-04-22 Thread Xiangrui Meng

StandardScaler failing with OOM errors in PySpark

2015-04-21 Thread rok
...works without problems. Should I be setting up executors differently? Thanks, Rok -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/StandardScaler-failing-with-OOM-errors-in-PySpark-tp22593.html Sent from the Apache Spark User List mailing list archive at Nabble.com.
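
For reference, a minimal sketch of the StandardScaler call the thread is about, assuming the Spark 1.3-era RDD-based MLlib API and dense vectors:

# Minimal StandardScaler usage (assumption: Spark 1.3-era RDD-based
# MLlib API with dense vectors).
from pyspark import SparkContext
from pyspark.mllib.feature import StandardScaler
from pyspark.mllib.linalg import Vectors

sc = SparkContext("local", "standard-scaler-sketch")
data = sc.parallelize([Vectors.dense([1.0, 10.0]),
                       Vectors.dense([2.0, 20.0]),
                       Vectors.dense([3.0, 30.0])])

scaler = StandardScaler(withMean=True, withStd=True).fit(data)  # aggregates column stats
print(scaler.transform(data).collect())                         # applies mean/std scaling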