Hi All,
I am trying to run a sample Spark program using Scala and SBT.
Below is the program:
def main(args: Array[String]) {
  val logFile = "E:/ApacheSpark/usb/usb/spark/bin/README.md" // Should be some file on your system
  val sc = new SparkContext("local", "Simple App", "E:/ApacheSpark/
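The message is cut off above; for reference, a minimal build.sbt for compiling such a program with SBT might look like the following (a sketch, assuming Spark 1.2.0 and Scala 2.10.x — the project name and versions are hypothetical, so adjust them to match your install):

```scala
// build.sbt — hypothetical project settings, not taken from the original message
name := "simple-app"

version := "0.1"

scalaVersion := "2.10.4"

// spark-core provides SparkContext; %% appends the Scala binary version to the artifact name
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
```

With something like this in place, `sbt package` should compile the program and produce a jar.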
> the test failure. This would have been logged earlier. You would need to
> say how you ran tests too. The tests for 1.2.0 pass for me on several
> common permutations.
> On Dec 29, 2014 3:22 AM, "Naveen Madhire" wrote:
>
>> Hi,
>>
>> I am following the below link for building Spark 1.2.0
Hi,
I am following the below link for building Spark 1.2.0:
https://spark.apache.org/docs/1.2.0/building-spark.html
I am getting the below error during the Maven build. I am using the IntelliJ IDE.
The build is failing in the scalatest plugin:
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent
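For comparison, the build steps documented on that page boil down to something like the following (a sketch of the documented Maven invocation; `-DskipTests` sidesteps the scalatest plugin during packaging, which may help isolate the failure):

```shell
# Give Maven enough memory, as the Spark 1.2.0 build docs recommend
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

# Package without running tests; run tests separately later if needed
mvn -DskipTests clean package
```

On Windows, set MAVEN_OPTS as an environment variable (or in the IntelliJ Maven runner settings) instead of using export.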
Now it is working fine.
Thanks all.
On Sun, Dec 28, 2014 at 6:10 PM, Naveen Madhire
wrote:
> Hi All,
>
> I am getting the below error while running a simple spark application from
> Eclipse.
>
> I am using Eclipse, Maven, Java.
>
> I've Spark running locally on my Windows laptop.
Hi All,
I am getting the below error while running a simple spark application from
Eclipse.
I am using Eclipse, Maven, Java.
I've Spark running locally on my Windows laptop. I copied the Spark files
from the Spark Summit 2014 training:
http://databricks.com/spark-training-resources#itas
I can ru
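For a Maven project like this, the Spark dependency usually goes into pom.xml along these lines (a sketch, assuming Spark 1.2.0 built against Scala 2.10, fetched from Maven Central; this fragment is illustrative and not taken from the original message):

```xml
<!-- hypothetical pom.xml fragment: spark-core for the Java API -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.0</version>
</dependency>
```

The `_2.10` suffix is the Scala binary version the artifact was built against, so it must match the Spark build you are running.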
> The correct docs link is:
> https://spark.apache.org/docs/1.2.0/building-spark.html
>
> Where did you get that bad link from?
>
> Nick
>
>
>
> On Thu Dec 25 2014 at 12:00:53 AM Naveen Madhire
> wrote:
>
>> Hi All,
>>
>> I am starting to use Spark.
Hi All,
I am starting to use Spark. I am having trouble getting the latest code
from git.
I am using IntelliJ as suggested in the below link:
https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark#ContributingtoSpark-StarterTasks
The below link isn't working either:
http://sp
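In the meantime, the latest code can be fetched directly with git (assuming the canonical Apache Spark mirror on GitHub, which is what the contributing guide points to):

```shell
# Clone the Apache Spark repository; master is checked out by default
git clone https://github.com/apache/spark.git
cd spark
```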