Thank you, Pat. I was having that issue when I was trying to do something like
that.
Just curious: how should I prepare the data so that it satisfies drmDfsRead
(path)? What is the DRM format, and how do I create a DRM file? Thanks, canal
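[For reference, a hedged sketch of one way to create a DRM that drmDfsRead can load back, assuming the Mahout 0.10+ Scala DSL with an implicit distributed context already in scope; the /tmp path is illustrative:]

```scala
import org.apache.mahout.math.scalabindings._
import org.apache.mahout.math.drm._
import org.apache.mahout.math.scalabindings.RLikeOps._
import org.apache.mahout.math.drm.RLikeDrmOps._

// Build a small in-core matrix, distribute it, and persist it as a DRM.
val inCore = dense((1.0, 2.0), (3.0, 4.0))
val drmA = drmParallelize(inCore, numPartitions = 2)
drmA.dfsWrite("/tmp/drmA")        // path is illustrative

// Later it can be loaded back out-of-core with:
val drmB = drmDfsRead("/tmp/drmA")
```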


     On Wednesday, October 7, 2015 4:09 AM, Pat Ferrel <[email protected]> 
wrote:
   

 Linear algebra is what Mahout Samsara is all about. In these docs, "in-core"
means in-memory and "out-of-core" means distributed:
http://mahout.apache.org/users/environment/out-of-core-reference.html
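[For example, distributed (out-of-core) matrix multiplication in the Samsara DSL looks roughly like the sketch below, assuming Mahout 0.10+ jars and an implicit distributed context are available; the input matrices are illustrative:]

```scala
import org.apache.mahout.math.scalabindings._
import org.apache.mahout.math.drm._
import org.apache.mahout.math.scalabindings.RLikeOps._
import org.apache.mahout.math.drm.RLikeDrmOps._

// Two small distributed matrices (in practice these could come
// from drmDfsRead(path)).
val drmA = drmParallelize(dense((1.0, 2.0), (3.0, 4.0)))
val drmB = drmParallelize(dense((5.0, 6.0), (7.0, 8.0)))

// %*% is the distributed multiply; collect brings the (small)
// result back in-core for inspection.
val drmC = drmA %*% drmB
val inCoreC = drmC.collect
```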

On Oct 4, 2015, at 7:27 PM, go canal <[email protected]> wrote:

Thank you very much for the help. I will try Spark 1.4.
I would like to try distributed matrix multiplication, but I am not sure if
there is sample code available. I am very new to this stack. Thanks, canal


    On Monday, October 5, 2015 12:23 AM, Pat Ferrel <[email protected]> 
wrote:


Mahout 0.11.0 is built on Spark 1.4, so 1.5.1 is a bit of an unknown. I think
the Mahout shell does not run on 1.5.1.

That may not be the cause of the error below, which occurs when Mahout tries to
create a set of jars to use in the Spark executors. The code runs `mahout -spark
classpath` to get these, so something is missing in your environment in Eclipse.
Does `mahout -spark classpath` run in a shell? If so, check whether your
environment matches in Eclipse.
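[Concretely, a hedged way to sanity-check this from a terminal; the SPARK_HOME value comes from the report below, while the Mahout install location is an assumed placeholder:]

```shell
# If this prints a jar list, mirror the same variables in
# Eclipse's run configuration (Environment tab).
export SPARK_HOME=/usr/local/spark-1.5.1   # value from the report below
export MAHOUT_HOME=$HOME/mahout            # assumed install location
$MAHOUT_HOME/bin/mahout -spark classpath
```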

Also, what are you trying to do? I have some example Spark context creation
code if you are using Mahout as a library.
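[For the library case, context creation is sketched below, per the sparkbindings package; the master URL and app name are illustrative choices:]

```scala
import org.apache.mahout.math.drm.DistributedContext
import org.apache.mahout.sparkbindings._

// mahoutSparkContext builds a Spark context with the Mahout jars
// attached; Samsara operators (drmParallelize, drmDfsRead, %*%, ...)
// pick up the implicit context.
implicit val ctx: DistributedContext =
  mahoutSparkContext(masterUrl = "local[2]", appName = "mahout-example")
```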


On Oct 3, 2015, at 2:14 AM, go canal <[email protected]> wrote:

Hello, I am running a very simple Mahout application in Eclipse, but got this
error:
Exception in thread "main" java.lang.IllegalArgumentException: Unable to read
output from "mahout -spark classpath". Is SPARK_HOME defined?
I have SPARK_HOME defined in Eclipse as an environment variable with the value
/usr/local/spark-1.5.1.
What else do I need to include/set?

thanks, canal