Hi,
Can you share the command you use to run the build? Which OS and Java version?
Regards,
Jacek Laskowski
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
On Sun, Jul 31, 2016 at 6:54 PM, Rohit
---
T E S T S
---
Running org.apache.spark.api.java.OptionalSuite
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.052 sec -
in org.apache.spark.api.java.OptionalSuite
Running o
This also bothered me for a long time. I suspect the IntelliJ builder
conflicts with the sbt/maven builder. I resolved the issue by rebuilding
Spark in IntelliJ. You may hit a compilation issue when building it in
IntelliJ; for that you need to put external/flume-sink/target/java on the
source build path.
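The java sources under external/flume-sink/target appear to be generated by
the build (Avro), so a hedged workaround sketch is to generate them once from
the command line before refreshing the project in IntelliJ; the Maven flags
below are illustrative, not from the original mail:

    # build just the flume-sink module (and whatever it needs) so that the
    # generated sources land under external/flume-sink/target
    mvn -pl external/flume-sink -am -DskipTests compile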
Is the Scala version in IntelliJ the same as the one used by sbt?
Cheers
On Tue, Nov 17, 2015 at 6:45 PM, 金国栋 wrote:
Hi!
I tried to build the Spark source code from GitHub, and I successfully built
it from the command line using `sbt/sbt assembly`, but I encountered an error
when compiling the project in IntelliJ IDEA (v14.1.5).
The error log is below:
Error:scala:
 while compiling:
/Users/ray/Documents/P01
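To answer the Scala-version question above from the sbt side, a small sketch
(assuming the bundled launcher script used above; scalaVersion is a standard
sbt setting):

    # prints the value of the scalaVersion setting for the build
    sbt/sbt scalaVersion

The Scala SDK configured for the project in IntelliJ should match that
version.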
I got the same error message when using Maven 3.3.
On Jun 3, 2015 8:58 AM, "Ted Yu" wrote:
I used the same command on Linux but didn't reproduce the error.
Can you include the -X switch on your command line?
Also consider upgrading Maven to 3.3.x.
Cheers
On Wed, Jun 3, 2015 at 2:36 AM, Daniel Emaasit wrote:
I'm running into errors while trying to build Spark from the 1.4 release
branch: https://github.com/apache/spark/tree/branch-1.4. Any help will be
much appreciated. Here is the log file from my Windows 8.1 PC. (FYI, I
installed all the dependencies, like Java 7 and Maven 3.2.5, and set the
environment variables.)
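A hedged sketch of rerunning the same build with the -X switch suggested
above (the redirect only captures the debug log to a file for sharing; add
whatever profiles you normally use):

    # -X turns on Maven debug output
    mvn -X -DskipTests clean package > build.log 2>&1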
That is a known issue uncovered last week. It fails on certain environments,
but not on Jenkins, which is our testing environment. There is already a PR
up to fix it. For now you can build using "mvn package -DskipTests".
TD
On Fri, Jan 30, 2015 at 8:59 PM, Andrew Musselman <andrew.mussel...@gmail.com> wrote:
Off master, got this error; is that typical?
---
T E S T S
---
Running org.apache.spark.streaming.mqtt.JavaMQTTStreamSuite
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.495
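For completeness, the skip-tests workaround suggested above as a full command
line, as a sketch only; the profiles and Hadoop version are illustrative and
depend on your target cluster:

    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package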
Hi,
I get the following error when I build spark using sbt:
[error] Nonzero exit code (128): git clone
https://github.com/ScrapCodes/sbt-pom-reader.git
/home/karthik/.sbt/0.13/staging/ad8e8574a5bcb2d22d23/sbt-pom-reader
[error] Use 'last' for the full log.
Any help please?
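A hedged diagnostic sketch (the staging path is the one from the error above;
the commands are illustrative): re-running the clone by hand usually surfaces
git's real error message, and a stale, half-cloned staging directory is a
common cause of exit code 128:

    # re-run the failing clone manually to see the underlying git error
    git clone https://github.com/ScrapCodes/sbt-pom-reader.git \
        /home/karthik/.sbt/0.13/staging/ad8e8574a5bcb2d22d23/sbt-pom-reader
    # if a leftover partial checkout is in the way, remove it and retry
    rm -rf /home/karthik/.sbt/0.13/staging/ad8e8574a5bcb2d22d23/sbt-pom-reader
    sbt/sbt assembly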
commons-math3 is at test scope (core/pom.xml):

    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-math3</artifactId>
      <version>3.3</version>
      <scope>test</scope>
    </dependency>

Adjusting the scope should solve the problem below.
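A hedged sketch of what "adjusting the scope" could look like in core/pom.xml,
i.e. dropping the test scope so commons-math3 ends up on the compile
classpath (one reading of the suggestion above, not necessarily the change
that was eventually made):

    <dependency>
      <groupId>org.apache.commons</groupId>
      <artifactId>commons-math3</artifactId>
      <version>3.3</version>
      <!-- scope changed from test to the default (compile) -->
    </dependency>

Alternatively, declaring commons-math3 directly in your own job's build
avoids touching Spark's pom at all.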
On Fri, Sep 26, 2014 at 8:42 AM, Jaonary Rabarisoa wrote:
Hi all,
I'm using some functions from Breeze in a Spark job but I get the following
build error:
Error:scalac: bad symbolic reference. A signature in RandBasis.class
refers to term math3
in package org.apache.commons which is not available.
It may be completely missing from the current classpath.
Hi,
I am trying to build jars using the command :
mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package
Execution of the above command is throwing the following error:
[INFO] Spark Project Core . FAILURE [ 0.295 s]
[INFO] Spark Project Bagel .
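The reactor summary alone doesn't show the underlying cause; a hedged sketch
of getting more detail out of the same command (-e and -X are standard Maven
switches, everything else is unchanged):

    # -e prints the full stack trace of the failure, -X adds debug output
    mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests -e -X clean package > build.log 2>&1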
rorHandling.scala:18)
> > at sbt.Execute.work(Execute.scala:244)
> > at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
> > at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
> > at
> >
> sbt.ConcurrentRestrictions$$anon$
ervice$$anon$2.call(CompletionService.scala:30)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> at
>
I don't want Spark SQL; I can do without it.
Good morning! I'm attempting to build Apache Spark 0.9.0 on Windows 8. I've
installed all prerequisites (except Hadoop) and run "sbt/sbt assembly" while
in the root directory. I'm getting an error after the line "Set current
project to root". The error is:
[error] Not a valid command: /
[error] /sbt
[error] ^
Do you know why I'm getting this error?
Thank you very much,
Will
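A guess at the cause, hedged: on Windows, cmd.exe does not run the Unix shell
script sbt/sbt, so an already-installed sbt appears to receive "/sbt" as if
it were a command, which would match the "Not a valid command: /" output
above. A sketch of two common workarounds (assuming either Cygwin/Git Bash or
a native sbt install; neither is from the original thread):

    # from Cygwin or Git Bash, the bundled launcher script runs as documented
    sh sbt/sbt assembly
    # or, with sbt installed natively and on the PATH, from the Spark root
    sbt assembly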