Sorry, not persist...I meant adding a user parameter k which does a checkpoint
after every k iterations, out of the N ALS iterations...We have HDFS installed
so it's not a big deal...Is there an issue with adding this user parameter in
ALS.scala? If there is, I can add it to our internal branch...
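Roughly, the idea in Scala would be something like this (a sketch only, not the
actual ALS.scala code; the parameter and function names are made up, and it
assumes sc.setCheckpointDir was pointed at an HDFS path):

import org.apache.spark.rdd.RDD

// Sketch: checkpoint the in-progress factors every k iterations so the
// lineage never grows beyond k steps.
def runWithCheckpoints[T](init: RDD[T], iterations: Int, checkpointInterval: Int)
                         (update: RDD[T] => RDD[T]): RDD[T] = {
  var current = init
  for (iter <- 1 to iterations) {
    current = update(current)
    if (iter % checkpointInterval == 0) {
      current.checkpoint() // mark this RDD for checkpointing
      current.count()      // run an action so the checkpoint is written and the lineage is cut
    }
  }
  current
}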
Hi Xiangrui,
With 4 ALS iterations it runs fine...If I run 10 it fails...I believe I have
to cut the lineage chain and call checkpoint. I'm trying to follow the other
email chain on checkpointing...
Thanks.
Deb
On Sun, Apr 6, 2014 at 9:08 PM, Xiangrui Meng wrote:
> Hi Deb,
>
> Are you using
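For reference, cutting the lineage looks roughly like this (a sketch; the RDD
name and the path are made up, and it assumes a SparkContext sc; per the RDD
API, checkpoint() only marks the RDD, and the next action, e.g. count(),
actually writes the checkpoint and truncates the lineage):

sc.setCheckpointDir("hdfs:///tmp/checkpoints") // hypothetical HDFS path

intermediate.persist()    // cache first so checkpointing does not recompute the RDD
intermediate.checkpoint() // mark it for checkpointing
intermediate.count()      // action: materializes the RDD, writes the checkpoint,
                          // and cuts the lineage for later iterations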
Btw, explicit ALS doesn't need persist because each intermediate
factor is only used once. -Xiangrui
On Sun, Apr 6, 2014 at 9:13 PM, Xiangrui Meng wrote:
> The persist used in implicit ALS doesn't help StackOverflow problem.
> Persist doesn't cut lineage. We need to call count() and then
> checkp
The persist used in implicit ALS doesn't help with the StackOverflow problem.
Persist doesn't cut the lineage. We need to call count() and then
checkpoint() to cut the lineage. Did you try the workaround mentioned
in https://issues.apache.org/jira/browse/SPARK-958:
"I tune JVM thread stack size to 512k via opt
Hi Deb,
Are you using the master branch or a particular commit? Do you have
negative or out-of-integer-range user or product ids? There is an
issue with ALS' partitioning
(https://spark-project.atlassian.net/browse/SPARK-1281), but I'm not
sure whether that is the reason. Could you try to see whet
The off-heap storage level is currently tied to Tachyon, but it might support
other forms of off-heap storage later. However it’s not really designed to be
mixed with the other ones. For this use case you may want to rely on memory
locality and have some custom code to push the data to the accel
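In code the two options would look something like this (a sketch; "ratings" is
a made-up RDD name):

import org.apache.spark.storage.StorageLevel

// Option 1: the off-heap storage level, currently Tachyon-backed
ratings.persist(StorageLevel.OFF_HEAP)

// Option 2: stay on-heap, rely on memory locality, and push data to the
// accelerator with custom code
// ratings.persist(StorageLevel.MEMORY_ONLY)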
see here for a similar issue:
http://mail-archives.apache.org/mod_mbox/spark-user/201401.mbox/%3CCALNFXi2hBSyCkPpnBJBYJnPv3dSLNw8VpL_6caEn3yfXCykO=w...@mail.gmail.com%3E
On Apr 6, 2014 4:10 PM, "Sean Owen" wrote:
> scala.None certainly isn't new in 2.10.4; it's ancient :
> http://www.scala-lang.org
scala.None certainly isn't new in 2.10.4; it's ancient:
http://www.scala-lang.org/api/2.10.3/index.html#scala.None$
Surely this is some other problem?
On Sun, Apr 6, 2014 at 6:46 PM, Koert Kuipers wrote:
> also, i thought scala 2.10 was binary compatible, but does not seem to be
> the case. the
i suggest we stick to 2.10.3, since otherwise it seems that (surprisingly)
you force everyone to upgrade
On Sun, Apr 6, 2014 at 1:46 PM, Koert Kuipers wrote:
> also, i thought scala 2.10 was binary compatible, but does not seem to be
> the case. the spark artifacts for scala 2.10.4 dont work fo
also, i thought scala 2.10 was binary compatible, but does not seem to be
the case. the spark artifacts for scala 2.10.4 dont work for me, since we
are still on scala 2.10.3, but when i recompiled and published spark with
scala 2.10.3 everything was fine again.
errors i see:
java.lang.ClassNotFoun
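For reference, pinning an sbt build to one Scala version looks roughly like
this (a sketch; the version numbers are only placeholders):

// build.sbt sketch
scalaVersion := "2.10.3"

// sbt's %% appends the Scala binary suffix, i.e. spark-core_2.10
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"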
At the head I see a persist option for implicitPrefs, but given more cases like
the ones mentioned above, why don't we use a similar technique and take an input
for which iterations we should persist in the explicit runs as well?
for (iter <- 1 to iterations) {
  // perform ALS update
  logInfo("Re-co
patrick,
this has happened before, that a commit introduced java 7 code/dependencies
and your build didnt fail, i think it was when reynold upgraded to jetty 9.
must be that your entire build infrastructure runs java 7...
On Sat, Apr 5, 2014 at 6:06 PM, Patrick Wendell wrote:
> If you want to s
Yeah spark builds are fine...
For solvers we are planning to use breeze optimization since it has most of
the core functions we will need, and we can enhance it further (a QP solver,
for example).
Right now sparse kmeans in spark mllib uses breeze and that might not even
need this line of code. But
That's a Breeze question, no? You should not need to compile Breeze
yourself to compile Spark -- why do that?
That method indeed only exists in Java 7. But Breeze seems to target
Java 6 as expected:
https://github.com/scalanlp/breeze/blob/master/build.sbt#L59
I see this particular line of code w
thats confusing. it seems to me the breeze dependency has been compiled
with java 6, since the mllib tests passed fine for me with java 6
On Sun, Apr 6, 2014 at 12:00 PM, Debasish Das wrote:
> Hi Koert,
>
> How do I specify that in sbt ?
>
> Is this the correct way ?
> javacOptions ++= Seq("-t
Hi Koert,
How do I specify that in sbt?
Is this the correct way?
javacOptions ++= Seq("-target", "1.6", "-source", "1.6")
The Breeze project, for example, compiles fine with jdk7 but fails with jdk6,
and the function it fails on:
[error] /home/debasish/github/breeze/src/main/scala/breeze/util/package
classes compiled with java7 run fine on java6 if you specified "-target
1.6". however if thats the case you should generally also be able to
compile it with java 6 just fine.
something compiled with java7 with "-target 1.7" will not run on java 6
On Sat, Apr 5, 2014 at 9:10 PM, Debasi
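Put together in sbt, targeting Java 6 from both compilers looks roughly like
this (a sketch; the -bootclasspath remark is the usual javac caveat, not
something Breeze or Spark require):

// build.sbt sketch: emit Java 6 compatible bytecode even when building on JDK 7
javacOptions ++= Seq("-source", "1.6", "-target", "1.6")
// for strict correctness javac also wants -bootclasspath pointing at a JDK 6 rt.jar

// scalac has its own flag for the bytecode target
scalacOptions += "-target:jvm-1.6"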