Ah, never mind: the fix is to get rid of "return" from my method. There's
probably still a bug to file about the REPL handling bad input more
gracefully, but this isn't the end of the world once you figure out what the
issue is.
Thanks for the time,
Andrew
On Mon, Jun 2, 2014 at 11:35 PM, Andrew Ash wrote:
Scala devs,
I was observing an unusual NPE in my code recently, and came up with the
below minimal test case:
// observed in Spark 1.0
class Super extends Serializable {
  lazy val superVal: String = null
}
class Sub extends Super {
  lazy val subVal: String = {
    try {
      "l
Hey All,
I wanted to announce the Spark 1.1 release window:
June 1 - Merge window opens
July 25 - Cut-off for new pull requests
August 1 - Merge window closes (code freeze), QA period starts
August 15+ - RC's and voting
This is consistent with the "3 month" release cycle we are targeting.
I'd
Quite often I notice that a shuffle file is missing and thus a
FileNotFoundException is thrown.
Any idea why a shuffle file would be missing? Am I running low on memory?
(I am using latest code from master branch on yarn-hadoop-2.2)
--
java.io.FileNotFoundException:
/var/storage/sda3/nm-local/usercache/npan
Yeah - check out sparkPreviousArtifact in the build:
https://github.com/apache/spark/blob/master/project/SparkBuild.scala#L325
- Patrick
On Mon, Jun 2, 2014 at 5:30 PM, Xiangrui Meng wrote:
> Is there a way to specify the target version? -Xiangrui
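For readers who don't want to chase the link, here is a hedged sketch of what
that helper looks like (the signature, defaults, and object name below are
assumptions; the linked SparkBuild.scala has the real code). The idea is that
it builds the ModuleID that MiMa diffs the current build against, so the
version argument is where the target version is specified:

import sbt._

object MimaSettings {
  // Hypothetical sketch: returns the previously released artifact that MiMa
  // should check binary compatibility against.
  def sparkPreviousArtifact(id: String,
                            organization: String = "org.apache.spark",
                            version: String = "1.0.0"): Option[ModuleID] = {
    Some(organization % (id + "_2.10") % version)  // "_2.10" mimics %% cross-versioning
  }
}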
Madhu, can you send me your Wiki username? (Sending it just to me is fine.) I
can add you to the list to edit it.
Matei
On Jun 2, 2014, at 6:27 PM, Reynold Xin wrote:
> I tried but didn't find where I could add you. You probably need Matei to
> help out with this.
>
> On Mon, Jun 2, 20
I tried but didn't find where I could add you. You probably need Matei to
help out with this.
On Mon, Jun 2, 2014 at 7:43 AM, Madhu wrote:
> I was able to set up Spark in Eclipse using the Spark IDE plugin.
> I also got unit tests running with Scala Test, which makes development
> quick and easy.
Is there a way to specify the target version? -Xiangrui
On Mon, Jun 2, 2014 at 6:05 PM, Marcelo Vanzin wrote:
> You mentioned something in your shading argument that kinda reminded
> me of something. Spark currently depends on slf4j implementations and
> log4j with "compile" scope. I'd argue that's the wrong approach if
> we're talking about Spark bein
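To make the scope distinction concrete, here is a hypothetical sbt fragment
illustrating the alternative being argued for (coordinates and versions are
illustrative, not Spark's actual build): compile only against the slf4j API,
and push the concrete binding to runtime scope so applications embedding the
library can choose their own logging backend:

libraryDependencies ++= Seq(
  "org.slf4j" % "slf4j-api"     % "1.7.5",              // API only: compile scope
  "org.slf4j" % "slf4j-log4j12" % "1.7.5" % "runtime",  // binding: runtime scope
  "log4j"     % "log4j"         % "1.2.17" % "runtime"  // backend: runtime scope
)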
Hi Patrick,
Thanks for all the explanations, that makes sense. @DeveloperApi
worries me a little bit especially because of the things Colin
mentions - it's sort of hard to make people move off of APIs, or
support different versions of the same API. But maybe if expectations
(or lack thereof) are s
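For context, @DeveloperApi is a real annotation in Spark
(org.apache.spark.annotation.DeveloperApi) marking public but unstable APIs;
a minimal hypothetical usage (the class below is invented for illustration):

import org.apache.spark.annotation.DeveloperApi

// Annotated classes are public but carry no cross-release stability
// guarantee: they may change or disappear in minor releases.
@DeveloperApi
class ShuffleMetricsProbe {
  def bytesSpilled: Long = 0L  // hypothetical member
}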
I was able to set up Spark in Eclipse using the Spark IDE plugin.
I also got unit tests running with Scala Test, which makes development quick
and easy.
I wanted to document the setup steps in this wiki page:
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSp
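For a taste of that workflow, here is a minimal hypothetical ScalaTest suite
(the suite name and test body are invented; FunSuite matches the style
Spark's own tests used at the time), runnable from Eclipse or sbt:

import org.scalatest.FunSuite

class WordCountSuite extends FunSuite {
  test("split counts words") {
    val counts = "a b a".split(" ").groupBy(identity).mapValues(_.length)
    assert(counts("a") === 2)
  }
}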