I tried Spark 1.4.1, same error. Then I saw the same error from the shell command, so I suspect it is an environment configuration problem.
I followed https://mahout.apache.org/general/downloads.html for the Mahout configuration. So it seems to be a Spark configuration problem, I guess, although I can run the Spark examples without errors. I will need to figure out what is missing.
thanks, canal


     On Monday, October 5, 2015 12:23 AM, Pat Ferrel <[email protected]> 
wrote:

Mahout 0.11.0 is built on Spark 1.4, so 1.5.1 is a bit of an unknown. I think the Mahout shell does not run on 1.5.1.

That may not be the cause of the error below, which occurs when Mahout tries to create a set of jars to use in the Spark executors. The code runs `mahout -spark classpath` to get these, so something is missing in your env in Eclipse. Does `mahout -spark classpath` run in a shell? If so, check whether your env in Eclipse matches.
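One quick way to compare the two environments is a throwaway check run from inside Eclipse itself. This is just an illustrative sketch, not Mahout code: it prints the SPARK_HOME that the Eclipse-launched JVM actually sees, which is the variable the failing classpath discovery depends on.

```java
// Illustrative check (hypothetical class name): print the env var that
// Mahout's `mahout -spark classpath` discovery relies on, as seen by the
// JVM that Eclipse launches. Run it once with your Eclipse run configuration.
public class SparkHomeCheck {
    public static void main(String[] args) {
        String sparkHome = System.getenv("SPARK_HOME");
        if (sparkHome == null) {
            System.out.println("SPARK_HOME is not set for this JVM");
        } else {
            System.out.println("SPARK_HOME=" + sparkHome);
        }
    }
}
```

If this prints nothing useful under Eclipse but `echo $SPARK_HOME` works in your shell, the variable was set in the shell profile but not in the Eclipse run configuration's environment tab.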

Also, what are you trying to do? I have some example Spark context creation code if you are using Mahout as a library.


On Oct 3, 2015, at 2:14 AM, go canal <[email protected]> wrote:

Hello, I am running a very simple Mahout application in Eclipse, but got this error:
Exception in thread "main" java.lang.IllegalArgumentException: Unable to read 
output from "mahout -spark classpath". Is SPARK_HOME defined?
I have SPARK_HOME defined in Eclipse as an environment variable with a value of /usr/local/spark-1.5.1. What else do I need to include/set?

thanks, canal


