Re: map - reduce only with disk

2015-06-02 Thread Matei Zaharia
…each key to all fit in memory at once. In this case, if you're going to reduce right after, you should use reduceByKey, which will be more efficient.

Matei

> On Jun 1, 2015, at 2:21 PM, octa…
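A minimal Scala sketch of the advice above, under assumptions: `sc` is an existing SparkContext, and the input file and word-count logic are hypothetical stand-ins, not from the original thread.

```scala
// Hypothetical pair RDD; the real RDD in the thread is not shown.
val pairs = sc.textFile("input.txt")
  .flatMap(line => line.split(" ").map(word => (word, 1)))

// groupByKey materializes the full list of values for each key before
// the reduction, so every key's values must fit in memory at once:
// val counts = pairs.groupByKey().mapValues(_.sum)

// reduceByKey combines values map-side before the shuffle, so only one
// running sum per key is kept, which is what makes it more efficient here:
val counts = pairs.reduceByKey(_ + _)
```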

Re: map - reduce only with disk

2015-06-02 Thread Matei Zaharia
…Key, which will be more efficient.

Matei

> On Jun 1, 2015, at 2:21 PM, octavian.ganea <octavian.ga...@inf.ethz.ch> wrote:
>
> Dear all,
>
> Does anyone know how I can force Spark to use only the disk when doing a…

Re: map - reduce only with disk

2015-06-01 Thread Matei Zaharia
….groupByKey.reduce(_ + _)? Thank you!
>
> --
> View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/map-reduce-only-with-disk-tp23102.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.

map - reduce only with disk

2015-06-01 Thread octavian.ganea
Dear all,

Does anyone know how I can force Spark to use only the disk when doing a simple flatMap(..).groupByKey.reduce(_ + _)?

Thank you!
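One way to keep the cached data off the executors' heap is Spark's disk-only persistence level. A sketch under assumptions: `sc` is an existing SparkContext and the RDD names are hypothetical; note that `persist` controls where *cached* partitions live, while spilling during the shuffle itself is governed by Spark's own shuffle configuration, not by this call.

```scala
import org.apache.spark.storage.StorageLevel

// Hypothetical stand-in for whatever the flatMap in the question produces.
val pairs = sc.textFile("data.txt")
  .flatMap(line => line.split(" ").map(word => (word, 1)))

// DISK_ONLY stores persisted partitions on local disk instead of in
// executor storage memory:
val onDisk = pairs.persist(StorageLevel.DISK_ONLY)

val grouped = onDisk.groupByKey()
```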