Re: Building Spark to run PySpark Tests?

2023-01-19 Thread Sean Owen
…thSGDTests): Test that the final value of weights is close to the desired value. Traceback (most recent call last): …

Re: Building Spark to run PySpark Tests?

2023-01-19 Thread Adam Chhina
…line 469, in condition: self.assertGreater(errors[1] - errors[-1], 2); AssertionError: 1.8960983527735014 not greater than 2. FAIL: test_par…

Re: Building Spark to run PySpark Tests?

2023-01-18 Thread Sean Owen
…/spark/python/pyspark/mllib/tests/test_streaming_algorithms.py", line 226, in condition: self.assertAlmostEqual(rel, 0.1, 1); AssertionError: 0.23052813480829393 != 0.1 within 1 places (0.13052813480829392 difference)
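For readers unfamiliar with unittest's `places` semantics in the failing assertion above, here is a minimal standalone illustration (plain Python, not Spark code; the failing value is taken from the traceback):

```python
# assertAlmostEqual(a, b, places) passes iff round(a - b, places) == 0.
# With places=1, a difference of ~0.13 rounds to 0.1, so the test fails.
import unittest

tc = unittest.TestCase()

tc.assertAlmostEqual(0.13, 0.1, 1)  # round(0.03, 1) == 0.0 -> passes
try:
    tc.assertAlmostEqual(0.23052813480829393, 0.1, 1)  # round(0.1305..., 1) == 0.1 -> fails
except AssertionError as e:
    print("failed as expected:", e)
```

So the streaming test tolerates roughly a 0.05 deviation from the target of 0.1; the observed 0.23 is well outside that, which is why tightening the environment (or the flakiness of the toy data) matters here.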

Re: Building Spark to run PySpark Tests?

2023-01-18 Thread Adam Chhina
…lastValue = condition(); File "/Users/adam/OSS/spark/python/pyspark/mllib/tests/test_streaming_algorithms.py", line 226, in condition: self.assertAlmostEqual(rel, 0…

Re: Building Spark to run PySpark Tests?

2023-01-18 Thread Sean Owen
…est_streaming_algorithms.StreamingLogisticRegressionWithSGDTests): Test that the model improves on toy data with no. of batches. Traceback (most recent call last): …

Re: Building Spark to run PySpark Tests?

2023-01-18 Thread Adam Chhina
…s): Test that the model improves on toy data with no. of batches. Traceback (most recent call last): File "/Users/adam/OSS/spark/python/pyspar…

Re: Building Spark to run PySpark Tests?

2023-01-18 Thread Bjørn Jørgensen
…ng/utils.py", line 93, in eventually: raise AssertionError(…); AssertionError: Test failed due to timeout after 180 sec, with last condition returning: Latest errors: 0.67, 0.71, 0.78, 0.7, 0.75, 0.74, 0.73, 0.69, 0.62, 0.71, 0.69, 0…

Re: Building Spark to run PySpark Tests?

2023-01-18 Thread Sean Owen
…eout after 180 sec, with last condition returning: Latest errors: 0.67, 0.71, 0.78, 0.7, 0.75, 0.74, 0.73, 0.69, 0.62, 0.71, 0.69, 0.75, 0.72, 0.77, 0.71, 0.74, 0.76, 0.78, 0.7, 0.78, 0.8, 0.74, 0.77, 0.75, 0.76, 0.76, 0.75, 0.78, 0.74, 0.64, 0.64, 0.71, 0.78, 0.76, 0.64, 0.68, 0.69, 0.72, 0.77

Re: Building Spark to run PySpark Tests?

2023-01-18 Thread Adam Chhina
…r 180 sec, with last condition returning: Latest errors: 0.67, 0.71, 0.78, 0.7, 0.75, 0.74, 0.73, 0.69, 0.62, 0.71, 0.69, 0.75, 0.72, 0.77, 0.71, 0.74, 0.76, 0.78, 0.7, 0.78, 0.8, 0.74, 0.77, 0.75, 0.76, 0.76, 0.75, 0.78, 0.74, 0.64, 0.64, 0.71, 0.78, 0.76, 0.64, 0.68, 0.69, 0.72, 0.77

Building Spark to run PySpark Tests?

2022-12-27 Thread Adam Chhina
…0.76, 0.76, 0.75, 0.78, 0.74, 0.64, 0.64, 0.71, 0.78, 0.76, 0.64, 0.68, 0.69, 0.72, 0.77. Ran 13 tests in 661.536s; FAILED (failures=3, skipped=1). Had test failures in pyspark.mllib.tests.test_streaming_algorithms with /usr/local/bin/python3; see logs. Here's how I'm currently b…

Re: Problem building spark-catalyst_2.12 with Maven

2022-02-10 Thread Martin Grigorov
I've found the problem! It was indeed a local thing! $ cat ~/.mavenrc MAVEN_OPTS='-XX:+TieredCompilation -XX:TieredStopAtLevel=1' I added this some time ago; it optimizes the build time. But it seems it also overrides the env var MAVEN_OPTS. Now it fails with: [INFO] --- scala-maven-plugi…
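Martin's finding can be reproduced in miniature: the mvn launcher sources ~/.mavenrc after the environment is set, so a plain assignment there clobbers an exported MAVEN_OPTS. A sketch of the append-instead-of-assign fix (the -Xss4m value is illustrative):

```shell
export MAVEN_OPTS="-Xss4m"   # e.g. what a build guide tells you to export

# A plain assignment in ~/.mavenrc would discard the value above.
# Appending keeps both sets of flags:
MAVEN_OPTS="${MAVEN_OPTS:-} -XX:+TieredCompilation -XX:TieredStopAtLevel=1"
echo "$MAVEN_OPTS"
```

With the appending form, the build-time optimization flags coexist with whatever the Spark build exported, instead of silently replacing it.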

Re: Problem building spark-catalyst_2.12 with Maven

2022-02-10 Thread Sean Owen
I think this is another occurrence where I had to change that setting or set MAVEN_OPTS. I think this one happens in a way that that setting doesn't affect, though I don't quite understand it. Try setting the stack size in the test runner configs. On Thu, Feb 10, 2022, 2:02 PM Martin Grigorov wrote: > Hi Sean, > > On Thu,

Re: Problem building spark-catalyst_2.12 with Maven

2022-02-10 Thread Martin Grigorov
Hi Sean, On Thu, Feb 10, 2022 at 5:37 PM Sean Owen wrote: > Yes I've seen this; the JVM stack size needs to be increased. I'm not sure > if it's env specific (though you and I at least have hit it, I think > others), or whether we need to change our build script. > In the pom.xml file, find "-Xs

Re: Problem building spark-catalyst_2.12 with Maven

2022-02-10 Thread Sean Owen
Yes, I've seen this; the JVM stack size needs to be increased. I'm not sure if it's env specific (though you and I at least have hit it, I think others have too), or whether we need to change our build script. In the pom.xml file, find the "-Xss..." settings and make them something like "-Xss4m", and see if that works.
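As a self-contained sketch of the kind of edit Sean describes: the file content and the original -Xss2m value below are made up for illustration, not copied from Spark's actual pom.xml.

```shell
# Create a stand-in pom fragment with a small thread-stack setting...
printf '<jvmArg>-Xss2m</jvmArg>\n' > /tmp/pom-fragment.xml

# ...and bump every -Xss<size> occurrence to -Xss4m, as suggested:
sed -i.bak 's/-Xss[0-9]*[mk]/-Xss4m/g' /tmp/pom-fragment.xml
cat /tmp/pom-fragment.xml
```

In the real build the same -Xss flags can appear in several places (scala-maven-plugin args, surefire/scalatest argLine), so grepping the whole pom.xml for `-Xss` first is a reasonable way to find them all.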

Problem building spark-catalyst_2.12 with Maven

2022-02-10 Thread Martin Grigorov
Hi, I am not able to build Spark due to the following error: [ERROR] ## Exception when compiling 543 sources to /home/martin/git/apache/spark/sql/catalyst/target/scala-2.12/classes java.lang.BootstrapMethodError: call site initialization exception java.lang.invoke.CallSite.makeSite(CallSite.java:…

Re: Java version for building Spark

2019-06-24 Thread Sean Owen
"The Maven-based build is the build of reference for Apache Spark. Building Spark using Maven requires Maven 3.5.4 and Java 8." It doesn't depend on a particular version of Java 8. Installing it is platform-dependent. On Mon, Jun 24, 2019 at 6:43 PM Valeriy Trofimov wrote: >

Java version for building Spark

2019-06-24 Thread Valeriy Trofimov
…to install OpenJDK. But building Spark using OpenJDK shows me an error; Googling it suggests that I need to install the default JDK. I can give you the error details if you are interested, but let's settle on the Java version first. Thanks, Val

Re: Difficulties building spark-master with sbt

2018-02-08 Thread ds
Thanks for the answer, but that doesn't solve my problem. The cmd doesn't recognize ./build/sbt ('.\build\sbt' is not recognized as an internal or external command, operable program or batch file.), even when the full path to the sbt file is specified. I just realized that I haven't mentioned that…

Re: Difficulties building spark-master with sbt

2018-02-08 Thread Jacek Laskowski
Hi, s,sbt ./build/sbt,./build/sbt In other words, don't execute "sbt ./build/sbt", but "./build/sbt" itself (you don't even have to install sbt to build Spark, as it's included in the repo and the script uses it internally). Pozdrawiam, Jacek Laskowski https://about.me/JacekLaskowski Masteri…
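Concretely, Jacek's point is that the launcher script inside the repo is invoked directly rather than passed as an argument to an installed sbt. A command sketch, assuming a local clone of apache/spark on a Unix-like shell (the profiles are those from the original poster's command):

```shell
cd spark    # your clone of the apache/spark repository
# The bundled launcher fetches the right sbt version on first use:
./build/sbt -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.0 -Phive -Phive-thriftserver clean package
```

Note this is a POSIX shell script; on plain Windows cmd (the original poster's environment, as the next reply shows) it will not run as-is.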

Re: Difficulties building spark-master with sbt

2018-02-07 Thread Sean Owen
The master SBT builds seem OK, like: https://amplab.cs.berkeley.edu/jenkins/view/Spark%20QA%20Test%20(Dashboard)/job/spark-master-test-sbt-hadoop-2.7/ It looks like an issue between Windows, SBT, and your env I think. On Wed, Feb 7, 2018 at 5:12 PM ds wrote: > After cloning today's version of s

Difficulties building spark-master with sbt

2018-02-07 Thread ds
After cloning today's version of spark-master, I run the following command: S:\spark-master>sbt ./build/sbt -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.0 -Phive -Phive-thriftserver clean package with the intention of building both the source and test projects and generating the corresponding .jar fil

Re: Building spark master failed

2016-05-23 Thread Ovidiu-Cristian MARCU
You're right, I thought the latest would only compile against Java 8. Thanks. > On 23 May 2016, at 11:35, Dongjoon Hyun wrote: > Hi, > That is not the latest. > The bug was fixed 5 days ago. > Regards, > Dongjoon. > On Mon, May 23, 2016 at 2:16 AM, Ovidiu-Cristian MARCU wrote: …

Re: Building spark master failed

2016-05-23 Thread Dongjoon Hyun
Hi, That is not the latest. The bug was fixed 5 days ago. Regards, Dongjoon. On Mon, May 23, 2016 at 2:16 AM, Ovidiu-Cristian MARCU < ovidiu-cristian.ma...@inria.fr> wrote: > Hi > > I have the following issue when trying to build the latest spark source > code on master: > > /spark/common/net

Building spark master failed

2016-05-23 Thread Ovidiu-Cristian MARCU
Hi I have the following issue when trying to build the latest spark source code on master: /spark/common/network-common/src/main/java/org/apache/spark/network/util/JavaUtils.java:147: error: cannot find symbol [error] if (process != null && process.isAlive()) { [error]

Re: Building Spark with a Custom Version of Hadoop: HDFS ClassNotFoundException

2016-02-11 Thread Ted Yu
The Hdfs class is in hadoop-hdfs-XX.jar. Can you check the classpath to see if that jar is there? Please describe the command lines you used for building Hadoop / Spark. Cheers On Thu, Feb 11, 2016 at 5:15 PM, Charlie Wright wrote: > I am having issues trying to run a test job on a built ver…
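Ted's classpath check can be done with `jar tf`. The jar path below is hypothetical (a locally installed 2.8.0-SNAPSHOT, matching the custom-Hadoop threads elsewhere on this list), and the class name `org.apache.hadoop.fs.Hdfs` is an assumption about which Hdfs class is meant:

```shell
# List the jar's contents and look for the Hdfs class (path is illustrative):
jar tf ~/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.8.0-SNAPSHOT/hadoop-hdfs-2.8.0-SNAPSHOT.jar \
  | grep 'org/apache/hadoop/fs/Hdfs.class'
```

If the class is present in the jar but the job still throws ClassNotFoundException, the jar is simply not on the runtime classpath of the failing JVM.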

Building Spark with a Custom Version of Hadoop: HDFS ClassNotFoundException

2016-02-11 Thread Charlie Wright
I am having issues trying to run a test job on a built version of Spark with a custom Hadoop JAR. My custom Hadoop version runs without issues, and I can run jobs on a precompiled version of Spark (with Hadoop) no problem. However, whenever I try to run the same Spark example on the Spark versi…

Re: Building Spark with Custom Hadoop Version

2016-02-05 Thread Steve Loughran
> On 4 Feb 2016, at 23:11, Ted Yu wrote: > Assuming your change is based on the hadoop-2 branch, you can use the 'mvn install' command, which would put artifacts under a 2.8.0-SNAPSHOT subdir in your local Maven repo. + Generally, unless you want to run all the Hadoop tests, set the -DskipTests…

Re: Building Spark with Custom Hadoop Version

2016-02-05 Thread Steve Loughran
> On 4 Feb 2016, at 23:11, Ted Yu wrote: > Assuming your change is based on the hadoop-2 branch, you can use the 'mvn install' command, which would put artifacts under a 2.8.0-SNAPSHOT subdir in your local Maven repo. > Here is an example: ~/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.8.0-SNAPSHOT

Re: Building Spark with Custom Hadoop Version

2016-02-04 Thread Ted Yu
Assuming your change is based on the hadoop-2 branch, you can use the 'mvn install' command, which would put artifacts under a 2.8.0-SNAPSHOT subdir in your local Maven repo. Here is an example: ~/.m2/repository/org/apache/hadoop/hadoop-hdfs/2.8.0-SNAPSHOT Then you can use the following command to build Spa…
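Ted's two steps, spelled out as a command sketch (the 2.8.0-SNAPSHOT version and the Spark profile are illustrative, since the exact build command is cut off in the archive):

```shell
# 1. In the patched Hadoop checkout: install your artifacts into ~/.m2.
mvn install -DskipTests

# 2. In the Spark checkout: point the build at the locally installed version.
./build/mvn -Pyarn -Dhadoop.version=2.8.0-SNAPSHOT -DskipTests clean package
```

Maven resolves the local repository before remote ones, so the -Dhadoop.version override is enough to pick up the freshly installed snapshot.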

Building Spark with Custom Hadoop Version

2016-02-04 Thread Charles Wright
Hello, I have made some modifications to the YARN source code that I want to test with Spark, how do I do this? I know that I need to include my custom hadoop jar as a dependency but I don't know how to do this as I am not very familiar with maven. Any help is appreciated. Thanks, Charles.

Re: Problem building Spark

2015-10-19 Thread Ted Yu
…the following error: > Building Spark - Spark 1.5.1 Documentation (pasted page contents: Building with build/mvn, Building a Runnable Distribution, Setting up Maven's Memory Usage, Specifying the Hadoop Version, Building With Hive and JDBC Support, Building for Scala 2…)

Re: Problem building Spark

2015-10-19 Thread Tathagata Das
com.invalid> wrote: > I tried to build Spark according to the build directions <http://spark.apache.org/docs/latest/building-spark.html> and it failed due to the following error: > Building Spark - Spark 1.5.1 Documentation <http://spa…

Problem building Spark

2015-10-19 Thread Annabel Melongo
I tried to build Spark according to the build directions and it failed due to the following error: Building Spark - Spark 1.5.1 Documentation: Building Spark, Building with build/mvn, Building a Runnable Distribution, Setting up Maven's Memory Usage, Specifying the…

Building Spark w/ 1.8 and binary incompatibilities

2015-10-19 Thread Iulian Dragoș
Hey all, tl;dr: I built Spark with Java 1.8 even though my JAVA_HOME pointed to 1.7. Then it failed with binary incompatibilities. I couldn't find any mention of this in the docs, so it might be a known thing, but it's definitely too easy to do the wrong thing. The problem is that Maven is using…
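A quick sanity check for the JAVA_HOME-vs-PATH mismatch Iulian describes (generic commands, not taken from the original message):

```shell
# The JDK JAVA_HOME points at:
"$JAVA_HOME/bin/java" -version

# The JDK found on PATH; it should match the line above:
java -version

# Which JDK Maven itself resolved (printed in its "Java version" line):
mvn -version
```

If the three disagree, the build can compile with one JDK and produce artifacts that fail at runtime on another, which is exactly the binary-incompatibility symptom reported here.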

Re: Building Spark

2015-10-16 Thread Jean-Baptiste Onofré
To: dev@spark.apache.org Subject: Re: Building Spark bq. Access is denied. Please check the permissions on the path mentioned. On Thu, Oct 15, 2015 at 3:45 PM, Annabel Melongo wrote: I was trying to build a cloned version of Spark on my loc…

Re: Building Spark

2015-10-16 Thread Annabel Melongo
…Cc: dev@spark.apache.org Subject: Re: Building Spark bq. Access is denied. Please check the permissions on the path mentioned. On Thu, Oct 15, 2015 at 3:45 PM, Annabel Melongo wrote: I was trying to build a cloned version of Spark on my local machine using the command: mvn -P…

Re: Building Spark

2015-10-15 Thread Ted Yu
bq. Access is denied Please check the permissions on the path mentioned. On Thu, Oct 15, 2015 at 3:45 PM, Annabel Melongo wrote: > I was trying to build a cloned version of Spark on my local machine using > the command: > mvn -Pyarn -Phadoop-2.4 -Dhadoop.v…

Building Spark

2015-10-15 Thread Annabel Melongo
I was trying to build a cloned version of Spark on my local machine using the command: mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package. However I got the error: [ERROR] Failed to execute goal org.apache.maven.plugins:maven-shade-plugin:2.4.1:shade (default) on…

Silly question about building Spark 1.4.1

2015-07-20 Thread Michael Segel
Hi, I'm looking at the online docs for building Spark 1.4.1: http://spark.apache.org/docs/latest/building-spark.html I was interested in building Spark for Scala 2.11 (the latest Scala) and also for Hive and JDBC support. Th…

Re: Building spark 1.2 from source requires more dependencies

2015-03-30 Thread yash datta
Hi all, When selecting large data in sparksql (Select * query) , I see Buffer overflow exception from kryo : 15/03/27 10:32:19 WARN scheduler.TaskSetManager: Lost task 6.0 in stage 3.0 (TID 30, machine159): com.esotericsoftware.kryo.KryoException: Buffer overflow. Available: 1, required: 2 Seri

Re: Building spark 1.2 from source requires more dependencies

2015-03-29 Thread Sean Owen
Given that it's an internal error from scalac, I think it may be something to take up with the Scala folks to really fix; we can just look for workarounds. Try blowing away your .m2 and .ivy caches, for example. FWIW I was running on Linux with Java 8u31 and the latest Scala 2.11 AFAIK. On Sun, Mar 29, 2…
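Sean's cache workaround, spelled out as commands. Note this is destructive (it deletes every locally cached artifact, so the next build re-downloads everything), and the paths are the Maven/Ivy defaults; adjust if yours differ:

```shell
# Blow away cached artifacts that may be stale or corrupt (destructive!):
rm -rf ~/.m2/repository ~/.ivy2/cache

# Then do a clean rebuild from the Spark checkout:
./build/mvn clean package -DskipTests
```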

Re: Building spark 1.2 from source requires more dependencies

2015-03-29 Thread Pala M Muthaia
Sean, I did a mvn clean and then a build; it produces the same error. I also did a fresh git clone of Spark and invoked the same build command, and it resulted in an identical error (I also had a colleague do the same thing, lest there was some machine-specific issue, and saw the same error). Unless I mis…

Re: Building spark 1.2 from source requires more dependencies

2015-03-27 Thread Sean Owen
This is not a compile error, but an error from the scalac compiler. That is, the code and build are fine, but scalac is not compiling it. Usually when this happens, a clean build fixes it. On Fri, Mar 27, 2015 at 7:09 PM, Pala M Muthaia wrote: > No, i am running from the root directory, parent of

Re: Building spark 1.2 from source requires more dependencies

2015-03-27 Thread Pala M Muthaia
No, I am running from the root directory, the parent of core. Here is the first set of errors that I see when I compile from source (sorry, the error message is very long, but I'm adding it in case it helps in diagnosis). After I manually add the javax.servlet dependency for version 3.0, this set of errors g…

Re: Building spark 1.2 from source requires more dependencies

2015-03-27 Thread Sean Owen
I built from the head of branch-1.2 and spark-core compiled correctly with your exact command. You have something wrong with how you are building. For example, you're not trying to run this from the core subdirectory are you? On Thu, Mar 26, 2015 at 10:36 PM, Pala M Muthaia wrote: > Hi, > > We ar

Re: Building spark 1.2 from source requires more dependencies

2015-03-26 Thread Pala M Muthaia
+spark-dev Yes, the dependencies are there. I guess my question is how come the build is succeeding in the mainline then, without adding these dependencies? On Thu, Mar 26, 2015 at 3:44 PM, Ted Yu wrote: > Looking at output from dependency:tree, servlet-api is brought in by the > following: > >

Re: Building Spark with Pants

2015-02-16 Thread Ryan Williams
I worked on Pants at Foursquare for a while and when coming up to speed on Spark was interested in the possibility of building it with Pants, particularly because allowing developers to share/reuse each others' compilation artifacts seems like it would be a boon to productivity; that was/is Pants'

Re: Building Spark with Pants

2015-02-14 Thread Nicholas Chammas
FYI: Here is the matching discussion over on the Pants dev list. On Mon Feb 02 2015 at 4:50:33 PM Nicholas Chammas nicholas.cham...@gmail.com wrote: To reiterate, I'm asking from an experi

Re: Broken record a bit here: building spark on intellij with sbt

2015-02-05 Thread Arush Kharbanda
I follow these steps to import sbt projects: 1. Install the sbt plugin: go to File -> Settings -> Plugins -> Install IntelliJ Plugins -> search for sbt and install it. 2. File -> Import -> browse to the root of the Spark source code. I hope this helps. On Fri, Feb 6, 2015 at 1:41 AM, Stephen Boesch wrote: >

Re: Broken record a bit here: building spark on intellij with sbt

2015-02-05 Thread Stephen Boesch
Hi Akhil, those instructions you provided show how to manually build an sbt project that may include adding Spark dependencies, whereas my OP was about how to open the existing Spark sbt project. These two are not similar tasks. 2015-02-04 21:46 GMT-08:00 Akhil Das: > Here's the sbt v…

Re: Broken record a bit here: building spark on intellij with sbt

2015-02-04 Thread Deep Pradhan
Akhil, it is not able to find the SBT package when I tried the steps given in the site. On Thu, Feb 5, 2015 at 11:16 AM, Akhil Das wrote: > Here's the sbt version > > https://docs.sigmoidanalytics.com/index.php/Step_by_Step_instructions_on_how_to_build_Spark_App_with_IntelliJ_IDEA > > > Thanks >

Re: Broken record a bit here: building spark on intellij with sbt

2015-02-04 Thread Akhil Das
Here's the sbt version https://docs.sigmoidanalytics.com/index.php/Step_by_Step_instructions_on_how_to_build_Spark_App_with_IntelliJ_IDEA Thanks Best Regards On Thu, Feb 5, 2015 at 8:55 AM, Stephen Boesch wrote: > For building in intellij with sbt my mileage has varied widely: it had > built a

Broken record a bit here: building spark on intellij with sbt

2015-02-04 Thread Stephen Boesch
For building in IntelliJ with sbt my mileage has varied widely: it had built as late as Monday (after the 1.3.0 release), with zero 'special' steps: just "import" as an sbt project. However, I cannot presently repeat the process. The wiki page has the latest instructions on how to build with…

Re: Building Spark with Pants

2015-02-02 Thread Nicholas Chammas
To reiterate, I'm asking from an experimental perspective. I'm not proposing we change Spark to build with Pants or anything like that. I'm interested in trying Pants out and I'm wondering if anyone else shares my interest or already has experience with Pants that they can share. On Mon Feb 02 20

Re: Building Spark with Pants

2015-02-02 Thread Nicholas Chammas
I'm asking from an experimental standpoint; this is not happening anytime soon. Of course, if the experiment turns out very well, Pants would replace both sbt and Maven (like it has at Twitter, for example). Pants also works with IDEs.

Re: Building Spark with Pants

2015-02-02 Thread Stephen Boesch
There is a significant investment in sbt and Maven, and they are not at all likely to be going away. A third build tool? Note that there is also the perspective of building within an IDE, which actually works presently for sbt, and with a little bit of tweaking, for Maven as well.

Building Spark with Pants

2015-02-02 Thread Nicholas Chammas
Does anyone here have experience with Pants, or interest in trying to build Spark with it? Pants has an interesting story. It was born at Twitter to help them build their Scala, Java, and Python projects as several independent components in one monolithic re…

Building Spark source error with maven

2014-09-16 Thread wyphao.2007
Hi, when I build Spark with Maven it fails; the error message is as follows. I didn't find a satisfactory solution by Googling. Can anyone help me? Thank you! [INFO] Reactor Summary: …

Re: Building Spark against Scala 2.10.1 virtualized

2014-07-18 Thread Reynold Xin
Yes. On Fri, Jul 18, 2014 at 12:50 PM, Meisam Fathi wrote: > Sorry for resurrecting this thread, but project/SparkBuild.scala was completely rewritten recently (after this commit: https://github.com/apache/spark/tree/628932b). Should library dependencies be defined in pom.xml files after thi…

Re: Building Spark against Scala 2.10.1 virtualized

2014-07-18 Thread Meisam Fathi
Sorry for resurrecting this thread, but project/SparkBuild.scala was completely rewritten recently (after this commit: https://github.com/apache/spark/tree/628932b). Should library dependencies be defined in pom.xml files after this commit? Thanks, Meisam. On Thu, Jun 5, 2014 at 4:51 PM, Matei Zaharia…

Re: Building Spark against Scala 2.10.1 virtualized

2014-06-05 Thread Matei Zaharia
You can modify project/SparkBuild.scala and build Spark with sbt instead of Maven. On Jun 5, 2014, at 12:36 PM, Meisam Fathi wrote: > Hi community, > > How should I change sbt to compile spark core with a different version > of Scala? I see maven pom files define dependencies to scala 2.10.4.

Building Spark against Scala 2.10.1 virtualized

2014-06-05 Thread Meisam Fathi
Hi community, How should I change sbt to compile spark core with a different version of Scala? I see maven pom files define dependencies to scala 2.10.4. I need to override/ignore the maven dependencies and use Scala virtualized, which needs these lines in a build.sbt file: scalaOrganization := "
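The build.sbt lines Meisam refers to are cut off in the archive. For the Scala-virtualized compiler they would plausibly look like the sketch below; the organization and version strings are assumptions based on the scala-virtualized project, not recovered from the original message:

```scala
// build.sbt fragment (sketch): swap in the virtualized Scala compiler
// in place of the stock org.scala-lang one the pom files reference.
scalaOrganization := "org.scala-lang.virtualized"  // assumed organization
scalaVersion := "2.10.1"                           // the version named in the subject
```

In sbt, `scalaOrganization` changes which group's scala-compiler and scala-library artifacts are resolved, which is exactly the override/ignore behavior asked about; Matei's reply below points at project/SparkBuild.scala as the place to make the equivalent change in Spark's own build.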

Re: Building Spark AMI

2014-04-11 Thread Jim Ancona
Hi, Right now my use case is setting up a small cluster for prototyping/evaluation. My hope was that I could use the scripts that come with Spark to get things up and running quickly. For a production deploy we would probably roll our own using Puppet. Jim On Fri, Apr 11, 2014 at 7:58 PM, Mayur

Re: Building Spark AMI

2014-04-11 Thread Mayur Rustagi
I am creating one fully configured & synced one. But you still need to send over configuration. Do you plan to use chef for that ? On Apr 10, 2014 6:58 PM, "Jim Ancona" wrote: > Are there scripts to build the AMI used by the spark-ec2 script? > > Alternatively, is there a place to download the A

Building Spark AMI

2014-04-10 Thread Jim Ancona
Are there scripts to build the AMI used by the spark-ec2 script? Alternatively, is there a place to download the AMI? I'm interested in using it to deploy into an internal OpenStack cloud. Thanks, Jim

Re: building Spark docs

2014-03-12 Thread Patrick Wendell
Diana, I'm forwarding this to the dev list since it might be useful there as well. On Wed, Mar 12, 2014 at 11:39 AM, Diana Carroll wrote: > Hi all. I needed to build the Spark docs. The basic instructions to do this are in spark/docs/README.md, but it took me quite a bit of playing around to…