Hi guys,
I built a Spark package but couldn't publish it with the sbt-spark-package
plugin. Any idea why these staging attempts are failing?
http://spark-packages.org/staging?id=1179
http://spark-packages.org/staging?id=1168
Repo: https://github.com/spotify/spark-bigquery
Jars are published to Maven: https://repo1
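For reference, here's roughly what the publish settings look like (a
minimal sketch of the sbt-spark-package keys as I understand them; the
version numbers and description below are illustrative placeholders, not
our exact build):

    // build.sbt -- sketch only; values are placeholders
    spName := "spotify/spark-bigquery"   // must match the GitHub repo slug
    sparkVersion := "1.6.1"              // Spark version to build against
    sparkComponents += "sql"             // adds spark-sql as "provided"
    spShortDescription := "Google BigQuery support for Spark SQL"
    licenses += "Apache-2.0" -> url("http://opensource.org/licenses/Apache-2.0")
    // `sbt spPublish` then pushes the release to spark-packages.org,
    // authenticating with a GitHub personal access token.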
Does cache eviction affect the disk storage level too? I tried cranking up
replication, but I'm still seeing this.
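For context, this is roughly how I'm persisting (a minimal sketch, not the
exact job; `inputRdd` is a placeholder):

    import org.apache.spark.storage.StorageLevel

    // Persist to memory and disk, replicating each partition on 2 nodes.
    // The question: if the in-memory copy is evicted, does the on-disk
    // copy (and its replica) survive, or does it get dropped too?
    val cached = inputRdd.persist(StorageLevel.MEMORY_AND_DISK_2)
    cached.count()  // materialize the cache before iterating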
On Wednesday, June 11, 2014, Shuo Xiang wrote:
> Daniel,
> Thanks for the explanation.
>
> On Wed, Jun 11, 2014 at 8:57 AM, Daniel Darabos
> <daniel.dara...@lynxanalytics.com> wrote:
We are seeing this issue as well.
We run on YARN and see logs about lost executors. It looks like some stages
had to be re-run to recompute RDD partitions that were lost with the
executors.
We were able to complete 20 iterations with a 20% full matrix, but not
beyond that (total > 100GB).
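One mitigation we're trying (a sketch only; `initialMatrix` and `step` are
placeholders for our actual iteration) is periodic checkpointing, so a lost
executor doesn't force recomputation all the way back through the lineage:

    import org.apache.spark.rdd.RDD

    sc.setCheckpointDir("hdfs:///tmp/spark-checkpoints")  // hypothetical dir

    var matrix: RDD[(Long, Array[Double])] = initialMatrix  // placeholder
    for (i <- 1 to 20) {
      matrix = step(matrix).persist()  // `step` = one iteration of the job
      if (i % 5 == 0) {
        matrix.checkpoint()  // truncate the lineage every few iterations
        matrix.count()       // action to force the checkpoint to happen
      }
    }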
On Tue, Jun 10, 2014 at 8:32