:) Strictly speaking, out-of-core is anything that is not in memory; e.g.,
sequential algorithms are generally also considered out-of-core.

btw I thought 0.11.x was for 1.3? Or was that re-certified for 1.4 too?


On Tue, Oct 6, 2015 at 1:09 PM, Pat Ferrel <[email protected]> wrote:

> Linear algebra stuff is what Mahout Samsara is all about. For these docs,
> in-core means in-memory and out-of-core means distributed:
> http://mahout.apache.org/users/environment/out-of-core-reference.html
>
> On Oct 4, 2015, at 7:27 PM, go canal <[email protected]> wrote:
>
> Thank you very much for the help. I will try Spark 1.4.
> I would like to try distributed matrix multiplication; I'm not sure if there
> is sample code available. I am very new to this stack. thanks, canal
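A minimal sketch of the distributed multiply asked about above, using the Samsara Scala DSL (assumes Mahout 0.11.x with the Spark bindings on the classpath; the object name and matrix values are illustrative, not from the thread):

```scala
// Distributed matrix multiply with Mahout Samsara's R-like DSL.
import org.apache.mahout.math.scalabindings._
import org.apache.mahout.math.drm._
import org.apache.mahout.math.drm.RLikeDrmOps._
import org.apache.mahout.sparkbindings._

object MultiplyExample extends App {
  // Create a Mahout distributed context wrapping a local SparkContext.
  implicit val ctx = mahoutSparkContext(masterUrl = "local", appName = "drm-multiply")

  // Parallelize two small in-core matrices into DRMs (distributed row matrices).
  val drmA = drmParallelize(dense((1, 2), (3, 4)), numPartitions = 2)
  val drmB = drmParallelize(dense((5, 6), (7, 8)), numPartitions = 2)

  // %*% is Samsara's distributed multiply; collect brings the result
  // back in-core (only do this when the result is small).
  val c = (drmA %*% drmB).collect
  println(c)
}
```

The product is computed lazily as a distributed operator graph and only materialized by `collect`, so the same expression scales to matrices that do not fit in memory.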
>
>
>     On Monday, October 5, 2015 12:23 AM, Pat Ferrel <[email protected]>
> wrote:
>
>
> Mahout 0.11.0 is built on Spark 1.4, so 1.5.1 is a bit of an unknown. I think
> the Mahout shell does not run on 1.5.1.
>
> That may not be the cause of the error below, which is raised when Mahout
> tries to create a set of jars to use in the Spark executors. The code runs
> `mahout -spark classpath` to get these, so something is missing in your env in
> Eclipse. Does `mahout -spark classpath` run in a shell? If so, check whether
> your env in Eclipse matches.
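A quick way to do that check outside Eclipse (a sketch; assumes the `mahout` script from the Mahout install is on PATH):

```shell
# Verify SPARK_HOME is set and points at a real directory.
if [ -z "$SPARK_HOME" ]; then
  echo "SPARK_HOME is not set"
elif [ ! -d "$SPARK_HOME" ]; then
  echo "SPARK_HOME points at a missing directory: $SPARK_HOME"
else
  echo "SPARK_HOME=$SPARK_HOME"
fi

# Then try the same command Mahout runs internally
# (uncomment once the mahout script is on PATH):
# mahout -spark classpath
```

If this prints a sensible value in a terminal but the Eclipse run still fails, the variable is most likely not being passed into the Eclipse launch configuration.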
>
> Also, what are you trying to do? I have some example Spark context creation
> code if you are using Mahout as a library.
>
>
> On Oct 3, 2015, at 2:14 AM, go canal <[email protected]> wrote:
>
> Hello, I am running a very simple Mahout application in Eclipse, but got
> this error:
> Exception in thread "main" java.lang.IllegalArgumentException: Unable to
> read output from "mahout -spark classpath". Is SPARK_HOME defined?
> I have SPARK_HOME defined in Eclipse as an environment variable with value
> of /usr/local/spark-1.5.1.
> What else do I need to include/set?
>
> thanks, canal
>
