This will make compilation pass, but you may not be able to run the 
program correctly.

I used Maven, adding these two jars (I use Hadoop 1); Maven pulled in 
their dependent jars (a lot of them) for me.

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.0.0</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>1.2.1</version>
</dependency>
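
For a quick sanity check that the job actually runs (and not just 
compiles), something like the sketch below can be run from Eclipse. This 
is only an illustration assuming the Spark 1.0.0 Scala API; the object 
name is made up, and the "local[*]" master runs Spark in-process so no 
cluster is needed:

import org.apache.spark.{SparkConf, SparkContext}

object SparkSetupCheck {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark inside the IDE process, no cluster required
    val conf = new SparkConf().setAppName("SparkSetupCheck").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // trivial job: sum 1..100 on an RDD; expect 5050
    val sum = sc.parallelize(1 to 100).reduce(_ + _)
    println("Sum = " + sum)

    sc.stop()
  }
}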



Best regards,
Wei 

---------------------------------
Wei Tan, PhD
Research Staff Member
IBM T. J. Watson Research Center
http://researcher.ibm.com/person/us-wtan



From:   Krishna Sankar <ksanka...@gmail.com>
To:     user@spark.apache.org
Date:   06/08/2014 11:19 AM
Subject: Re: How to compile a Spark project in Scala IDE for Eclipse?



Project->Properties->Java Build Path->Add External Jars
Add the /spark-1.0.0-bin-hadoop2/lib/spark-assembly-1.0.0-hadoop2.2.0.jar
Cheers
<K/>


On Sun, Jun 8, 2014 at 8:06 AM, Carter <gyz...@hotmail.com> wrote:
Hi All,

I just downloaded the Scala IDE for Eclipse. After I created a Spark project
and clicked "Run", I got an error on the line "import
org.apache.spark.SparkContext": "object apache is not a member of package
org". I guess I need to add the Spark dependency to the Scala IDE for
Eclipse; can anyone tell me how to do it? Thanks a lot.





