Re: The Dataset unit test is much slower than the RDD unit test (in Scala)

2022-11-01 Thread Cheng Pan
> Our data job is very complex (e.g. 100+ joins), and we have switched from RDD to Dataset recently. We've found that the unit test takes much longer. We profiled it and have found that it's the planning phase that is slow, not execution. I wonder if anyone has encountered this issue b…

Re: The Dataset unit test is much slower than the RDD unit test (in Scala)

2022-11-01 Thread Enrico Minack
…this helps, Enrico. On 25.10.22 at 21:54, Tanin Na Nakorn wrote: Hi All, our data job is very complex (e.g. 100+ joins), and we have switched from RDD to Dataset recently. We've found that the unit test takes much longer. We profiled it and have found that it's the planning phase that is slow, not execution…

The Dataset unit test is much slower than the RDD unit test (in Scala)

2022-10-25 Thread Tanin Na Nakorn
Hi All, Our data job is very complex (e.g. 100+ joins), and we have switched from RDD to Dataset recently. We've found that the unit test takes much longer. We profiled it and have found that it's the planning phase that is slow, not execution. I wonder if anyone has encountered this issue before…
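
For readers who want to reproduce the measurement, here is a minimal sketch (assuming Spark 3.x, a SparkSession named spark, and an already-built complex DataFrame df; this is not code from the thread) that separates planning time from execution time:

  // Force analysis, optimization and physical planning without executing anything.
  val t0 = System.nanoTime()
  val plan = df.queryExecution.executedPlan
  val t1 = System.nanoTime()
  // Execute the plan without writing output anywhere (the "noop" sink is Spark 3.x).
  df.write.format("noop").mode("overwrite").save()
  val t2 = System.nanoTime()
  println(s"planning: ${(t1 - t0) / 1e6} ms, execution: ${(t2 - t1) / 1e6} ms")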

Re: ivy unit test case failing for Spark

2021-12-21 Thread Wes Peng
Are you using IvyVPN, which causes this problem? If the VPN software changes the network URL silently, you should avoid using it. Regards. On Wed, Dec 22, 2021 at 1:48 AM Pralabh Kumar wrote: > Hi Spark Team, I am building Spark in a VPN, but the unit test case below is failing…

Re: ivy unit test case failing for Spark

2021-12-21 Thread Sean Owen
You would have to make it available? This doesn't seem like a Spark issue. On Tue, Dec 21, 2021, 10:48 AM Pralabh Kumar wrote: > Hi Spark Team, I am building Spark in a VPN, but the unit test case below is failing. It points to an Ivy location which cannot be reached…

ivy unit test case failing for Spark

2021-12-21 Thread Pralabh Kumar
Hi Spark Team, I am building Spark in a VPN, but the unit test case below is failing. It points to an Ivy location which cannot be reached from within the VPN. Any help would be appreciated. test("SPARK-33084: Add jar support Ivy URI -- default transitive = true") { sc = new SparkC…
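
One workaround (an assumption on my part, not something suggested in the thread) is to point Spark's Ivy resolution at a repository the VPN can reach, via spark.jars.ivySettings; a sketch, with a hypothetical settings path:

  // Build the test SparkContext against an ivysettings.xml whose resolvers
  // point at an internal mirror reachable from inside the VPN.
  import org.apache.spark.{SparkConf, SparkContext}

  val conf = new SparkConf()
    .setMaster("local[2]")
    .setAppName("ivy-uri-test")
    .set("spark.jars.ivySettings", "/path/to/ivysettings.xml")  // hypothetical path
  val sc = new SparkContext(conf)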

Re: Need a complete unit test reference for PySpark

2020-11-19 Thread Sofia’s World
…'first', 'second']) print(df.show()) df2 = spark_session.createDataFrame([['one', 'two']]).toDF(*['first', 'second']) assert df.subtract(df2).count() == 0 On Thu, Nov 19, 2020 at 6:38 AM Sachit Murarka wrote: > Hi Users, I have to write Un…

Need a complete unit test reference for PySpark

2020-11-18 Thread Sachit Murarka
Hi Users, I have to write unit test cases for PySpark. I think pytest-spark and "spark-testing-base" are good test libraries. Can anyone please provide a full reference for writing test cases in Python using these? Kind Regards, Sachit Murarka

Re: how do i force unit test to do whole stage codegen

2017-04-05 Thread Jacek Laskowski
…gen can be applied to. > So, in your test case, whole-stage codegen has already been enabled! > FYI, I think that it is a good topic for d...@spark.apache.org. > Kazuaki Ishizaki > Fro…

Re: how do i force unit test to do whole stage codegen

2017-04-05 Thread Koert Kuipers
…a good topic for d...@spark.apache.org. > Kazuaki Ishizaki > From: Koert Kuipers > To: "user@spark.apache.org" > Date: 2017/04/05 05:12 > Subject: how do i force unit test to do whole stage codegen…

Re: how do i force unit test to do whole stage codegen

2017-04-05 Thread Jacek Laskowski
…> FYI, I think that it is a good topic for d...@spark.apache.org. > Kazuaki Ishizaki > From: Koert Kuipers > To: "user@spark.apache.org" > Date: 2017/04/05 05:12 > Subject: how do i force unit test to do whole stage codegen

Re: how do i force unit test to do whole stage codegen

2017-04-04 Thread Koert Kuipers
…applied to. > So, in your test case, whole-stage codegen has already been enabled! > FYI, I think that it is a good topic for d...@spark.apache.org. > Kazuaki Ishizaki > From: Koert Kuipers > To: "user@spark.apache.org" > Date:…

Re: how do i force unit test to do whole stage codegen

2017-04-04 Thread Kazuaki Ishizaki
…topic for d...@spark.apache.org. Kazuaki Ishizaki. From: Koert Kuipers To: "user@spark.apache.org" Date: 2017/04/05 05:12 Subject: how do i force unit test to do whole stage codegen. i wrote my own expression with eval and doGenCode, but doGenCode never gets called in test…

how do i force unit test to do whole stage codegen

2017-04-04 Thread Koert Kuipers
I wrote my own expression with eval and doGenCode, but doGenCode never gets called in tests. Also, as a test, I ran this in a unit test: spark.range(10).select('id as 'asId).where('id === 4).explain. According to https://jaceklaskowski.gitbooks.io/mastering-apache-spark/spark-…
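
A minimal sketch (assuming a SparkSession named spark with spark.implicits._ in scope; not code from the thread) of how a test can check whether whole-stage codegen actually kicked in:

  import org.apache.spark.sql.execution.WholeStageCodegenExec
  import org.apache.spark.sql.execution.debug._  // adds debugCodegen()
  import spark.implicits._

  val df = spark.range(10).select('id as 'asId).where('id === 4)
  df.debugCodegen()  // prints the generated Java code, if any was produced

  // Or assert on the physical plan directly:
  val usedCodegen = df.queryExecution.executedPlan.collect {
    case w: WholeStageCodegenExec => w
  }.nonEmpty
  assert(usedCodegen, "whole-stage codegen was not applied")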

Re: How to unit test spark streaming?

2017-03-07 Thread kant kodali
Agreed with the statement in quotes below; whether one wants to do unit tests or not, it is a good practice to write code that way. But I think the more painful and tedious task is to mock/emulate all the nodes, such as the Spark workers/master/HDFS/input source stream, and all that. I wish there were someth…

Re: How to unit test spark streaming?

2017-03-07 Thread Michael Armbrust
> Basically you abstract your transformations to take in a dataframe and return one, then you assert on the returned df. +1 to this suggestion. This is why we wanted streaming and batch dataframes to share the same API.
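
A sketch of the pattern being endorsed (my illustration, not code from the thread): keep the transformation a pure DataFrame-to-DataFrame function so the same code can be exercised with a static DataFrame in a test:

  import org.apache.spark.sql.DataFrame
  import org.apache.spark.sql.functions._

  // Works unchanged on both batch and streaming DataFrames.
  def withWordCounts(lines: DataFrame): DataFrame =
    lines.select(explode(split(col("value"), "\\s+")).as("word"))
      .groupBy("word")
      .count()

  // In a test (assuming spark.implicits._ is in scope):
  // val result = withWordCounts(Seq("a b a").toDF("value"))
  // assert(result.filter($"word" === "a").head.getLong(1) == 2)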

Re: How to unit test spark streaming?

2017-03-07 Thread Jörn Franke
ali wrote: > > Hi All, > > How to unit test spark streaming or spark in general? How do I test the > results of my transformations? Also, more importantly don't we need to spawn > master and worker JVM's either in one or mult

Re: How to unit test spark streaming?

2017-03-07 Thread Sam Elamin
…take in a dataframe and return one, then you assert on the returned df. Regards, Sam. On Tue, 7 Mar 2017 at 12:05, kant kodali wrote: > Hi All, how to unit test Spark Streaming, or Spark in general? How do I test the results of my transformations? Also, more importantly, don't we need…

How to unit test spark streaming?

2017-03-07 Thread kant kodali
Hi All, how to unit test Spark Streaming, or Spark in general? How do I test the results of my transformations? Also, more importantly, don't we need to spawn master and worker JVMs, either in one or multiple nodes? Thanks! kant

Error when running multiple unit tests that extend DataFrameSuiteBase

2016-09-23 Thread Jinyuan Zhou
I created two test cases, each a FlatSpec with DataFrameSuiteBase, but I got errors when running sbt test. I was able to run each of them separately. My test cases do use sqlContext to read files. Here is the exception stack. Judging from the exception, I may need to unregister the RpcEndpoint after ea…
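
The usual cause is sbt running the two suites in parallel, each starting its own SparkContext; a sketch of the build.sbt setting that serializes them (my suggestion, not one confirmed in the thread):

  // build.sbt: run test suites one at a time, so only one SparkContext
  // exists in the test JVM at any moment.
  parallelExecution in Test := false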

RE: How this unit test passed on master trunk?

2016-04-24 Thread Yong Zhang
Subject: Re: How this unit test passed on master trunk? From: zzh...@hortonworks.com To: java8...@hotmail.com; gatorsm...@gmail.com CC: user@spark.apache.org Date: Sun, 24 Apr 2016 04:37:11 +0000. There are multiple records for the DF: scala> structDF.groupBy($"a").agg(min(st…

Re: How this unit test passed on master trunk?

2016-04-23 Thread Zhan Zhang
…struct(1, 2). Please check how the ordering is implemented in InterpretedOrdering. The output itself does not have any ordering. I am not sure why the unit test and the real env behave differently. Xiao, I do see the difference between the unit test and a local cluster run. Do you know the reaso…

Re: How this unit test passed on master trunk?

2016-04-22 Thread Ted Yu
"))).first() first: org.apache.spark.sql.Row = [1,[1,1]] BTW https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-maven-hadoop-2.7/715/consoleFull shows this test passing. On Fri, Apr 22, 2016 at 11:23 AM, Yong Zhang wrote: > Hi, > > I was trying to find out why this unit test can pass

How this unit test passed on master trunk?

2016-04-22 Thread Yong Zhang
Hi, I was trying to find out why this unit test passes in the Spark code, in https://github.com/apache/spark/blob/master/sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala, for this unit test: test("Star Expansion - CreateStruct and CreateArray") { val structDf = testDa…

Re: Unit test with sqlContext

2016-03-19 Thread Vikas Kawadia
…spark-testing-base > DataFrame examples are here: https://github.com/holdenk/spark-testing-base/blob/master/src/test/1.3/scala/com/holdenkarau/spark/testing/SampleDataFrameTest.scala > Thanks, Silvio…

Re: Unit test with sqlContext

2016-02-05 Thread Steve Annessa
…/holdenkarau/spark/testing/SampleDataFrameTest.scala > Thanks, Silvio > From: Steve Annessa > Date: Thursday, February 4, 2016 at 8:36 PM > To: "user@spark.apache.org" > Subject: Unit test…

Re: Unit test with sqlContext

2016-02-04 Thread Rishi Mishra
…/1.3/scala/com/holdenkarau/spark/testing/SampleDataFrameTest.scala > Thanks, Silvio > From: Steve Annessa > Date: Thursday, February 4, 2016 at 8:36 PM > To: "user@spark.apache.org" > Subject: Unit test with sqlContext…

Re: Unit test with sqlContext

2016-02-04 Thread Holden Karau
…/blob/master/src/test/1.3/scala/com/holdenkarau/spark/testing/SampleDataFrameTest.scala > Thanks, Silvio > From: Steve Annessa > Date: Thursday, February 4, 2016 at 8:36 PM > To: "user@spark.apache.org" > Subject: Unit test with sqlContext > I'…

Re: Unit test with sqlContext

2016-02-04 Thread Silvio Fiorito
…/master/src/test/1.3/scala/com/holdenkarau/spark/testing/SampleDataFrameTest.scala. Thanks, Silvio. From: Steve Annessa <steve.anne...@gmail.com> Date: Thursday, February 4, 2016 at 8:36 PM To: "user@spark.apache.org"…

Unit test with sqlContext

2016-02-04 Thread Steve Annessa
I'm trying to unit test a function that reads in a JSON file, manipulates the DF and then returns a Scala Map. The function has the signature: def ingest(dataLocation: String, sc: SparkContext, sqlContext: SQLContext). I've created a bootstrap spec for Spark jobs that instantiates the Spa…
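
A sketch of what such a bootstrap spec might look like (my reconstruction under the Spark 1.x API the thread uses, not the poster's actual code):

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.sql.SQLContext
  import org.scalatest.{BeforeAndAfterAll, Suite}

  trait SparkSpec extends BeforeAndAfterAll { self: Suite =>
    @transient var sc: SparkContext = _
    @transient var sqlContext: SQLContext = _

    override def beforeAll(): Unit = {
      super.beforeAll()
      sc = new SparkContext(new SparkConf().setMaster("local[2]").setAppName("test"))
      sqlContext = new SQLContext(sc)
    }

    override def afterAll(): Unit = {
      if (sc != null) sc.stop()  // release the context so the next suite can start one
      super.afterAll()
    }
  }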

Re: how to run unit test for specific component only

2015-11-13 Thread Steve Loughran
Try: mvn test -pl sql -DwildcardSuites=org.apache.spark.sql -Dtest=none. On 12 Nov 2015, at 03:13, weoccc <weo...@gmail.com> wrote: Hi, I am wondering how to run unit tests for a specific Spark component only. mvn test -DwildcardSuites="org.apache.spark.sql.*" -Dtest…

Re: how to run unit test for specific component only

2015-11-11 Thread Ted Yu
Have you tried the following? build/sbt "sql/test-only *" Cheers. On Wed, Nov 11, 2015 at 7:13 PM, weoccc wrote: > Hi, I am wondering how to run unit tests for a specific Spark component only. > mvn test -DwildcardSuites="org.apache.spark.sql.*" -Dtest…

how to run unit test for specific component only

2015-11-11 Thread weoccc
Hi, I am wondering how to run unit tests for a specific Spark component only. mvn test -DwildcardSuites="org.apache.spark.sql.*" -Dtest=none. The above command doesn't seem to work. I'm using Spark 1.5. Thanks, Weide

Re: How to unit test HiveContext without OutOfMemoryError (using sbt)

2015-08-26 Thread Michael Armbrust
I'd suggest setting sbt to fork when running tests. On Wed, Aug 26, 2015 at 10:51 AM, Mike Trienis wrote: > Thanks for your response Yana, I can increase the MaxPermSize parameter and it will allow me to run the unit test a few more times before I run out of memory.
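
A sketch of the suggested sbt settings (the exact values are my assumption; MaxPermSize applies only to pre-Java-8 JVMs):

  // build.sbt: fork a fresh JVM for tests so PermGen is reclaimed when the run exits.
  fork in Test := true
  javaOptions in Test ++= Seq("-Xmx2g", "-XX:MaxPermSize=512m")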

Re: How to unit test HiveContext without OutOfMemoryError (using sbt)

2015-08-26 Thread Mike Trienis
Thanks for your response Yana, I can increase the MaxPermSize parameter and it will allow me to run the unit test a few more times before I run out of memory. However, the primary issue is that running the same unit test in the same JVM (multiple times) results in increased memory (each run of…

Re: How to unit test HiveContext without OutOfMemoryError (using sbt)

2015-08-25 Thread Yana Kadiyska
…test. On Tue, Aug 25, 2015 at 2:10 PM, Mike Trienis wrote: > Hello, I am using sbt and created a unit test where I create a `HiveContext` and execute some query and then return. Each time I run the unit test the JVM will increase its memory usage until I get…

How to unit test HiveContext without OutOfMemoryError (using sbt)

2015-08-25 Thread Mike Trienis
Hello, I am using sbt and created a unit test where I create a `HiveContext` and execute some query and then return. Each time I run the unit test the JVM will increase its memory usage until I get the error: Internal error when running tests: java.lang.OutOfMemoryError: PermGen space. Exce…

Re: [Unit Test Failure] Test org.apache.spark.streaming.JavaAPISuite.testCount failed

2015-05-20 Thread Tathagata Das
> Do you get this failure repeatedly? > On Thu, May 14, 2015 at 12:55 AM, kf wrote: > Hi all, I got the following error when I ran the unit tests of Spark via dev/run-tests on the latest "branch-1.4" branch. > The latest com…

Re: [Unit Test Failure] Test org.apache.spark.streaming.JavaAPISuite.testCount failed

2015-05-14 Thread Wangfei (X)
Yes, it happens repeatedly on my local Jenkins. Sent from my iPhone. On May 14, 2015, at 18:30, "Tathagata Das" <t...@databricks.com> wrote: Do you get this failure repeatedly? On Thu, May 14, 2015 at 12:55 AM, kf <wangf...@huawei.com> wrote: Hi all, I got the following error w…

Re: [Unit Test Failure] Test org.apache.spark.streaming.JavaAPISuite.testCount failed

2015-05-14 Thread Tathagata Das
Do you get this failure repeatedly? On Thu, May 14, 2015 at 12:55 AM, kf wrote: > Hi all, I got the following error when I ran the unit tests of Spark via dev/run-tests on the latest "branch-1.4" branch. > The latest commit id: commit d518c0369fa412567855980c3f0f42…

[Unit Test Failure] Test org.apache.spark.streaming.JavaAPISuite.testCount failed

2015-05-14 Thread kf
Hi all, I got the following error when I ran the unit tests of Spark via dev/run-tests on the latest "branch-1.4" branch. The latest commit id: commit d518c0369fa412567855980c3f0f426cde5c190d Author: zsxwing Date: Wed May 13 17:58:29 2015 -0700. error […

Re: Spark unit test fails

2015-05-07 Thread NoWisdom
I'm also getting the same error. Any ideas?

Re: Cannot run unit test.

2015-04-08 Thread Mike Trienis
It's because your tests are running in parallel and you can only have one context running at a time.

Re: Spark unit test fails

2015-04-06 Thread Manas Kar
…(Unknown Source) [info] at java.lang.ClassLoader.defineClass(Unknown Source) [info] at java.security.SecureClassLoader.defineClass(Unknown Source) [info] at java.net.URLClassLoader.defineClass(Unknown Source) [info] at java.net.URLClassLoader.access$100(Unknown Source)…

Spark unit test fails

2015-04-03 Thread Manas Kar
Hi experts, I am trying to write unit tests for my Spark application, which fails with a javax.servlet.FilterRegistration error. I am using CDH 5.3.2 Spark and below is my dependencies list: val spark = "1.2.0-cdh5.3.2" val esriGeometryAPI = "1.2" val csvWriter = "1.0.0"…
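
This error typically means two different servlet-api jars (one of them signed) end up on the test classpath; a sketch of an sbt exclusion that often resolves it (my assumption, including the Hadoop coordinates, not a fix confirmed in the thread):

  // build.sbt: keep the transitively-pulled servlet-api off the classpath so
  // only one copy of javax.servlet.FilterRegistration is loaded.
  libraryDependencies += ("org.apache.hadoop" % "hadoop-client" % "2.5.0-cdh5.3.2")
    .exclude("javax.servlet", "servlet-api")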

TestSuiteBase based unit test using a sliding window join timesout

2015-01-07 Thread Enno Shioji
Hi, I extended org.apache.spark.streaming.TestSuiteBase for some testing, and I was able to run this test fine: test("Sliding window join with 3 second window duration") { val input1 = Seq( Seq("req1"), Seq("req2", "req3"), Seq(), Seq("req4", "req5", "req6"),…

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in unit test but not when submitted to Spark?

2014-12-24 Thread Emre Sevinc
…logger.warn("!!! DEBUG !!! target: {}", r.getURI()); String response = r.accept(MediaType.APPLICATION_JSON_TYPE) //.header("") .get(String.class);…

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in unit test but not when submitted to Spark?

2014-12-24 Thread Sean Owen
…logger.warn("!!! DEBUG !!! target: {}", r.getURI()); String response = r.accept(MediaType.APPLICATION_JSON_TYPE) //.header("") .get(String.class); logger.warn("!!! DEBUG !!! Spotlight resp…

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in unit test but not when submitted to Spark?

2014-12-24 Thread Emre Sevinc
…"!!! DEBUG !!! Spotlight response: {}", response); It seems to work when I use spark-submit to submit the application that includes this code. The funny thing is, now my relevant unit test does not run, complaining about not having enough memory: Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_…

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in unit test but not when submitted to Spark?

2014-12-24 Thread Emre Sevinc
On Wed, Dec 24, 2014 at 1:46 PM, Sean Owen wrote: > I'd take a look with 'mvn dependency:tree' on your own code first. Maybe you are including JavaEE 6, for example? For reference, my complete pom.xml looks like: <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchem…

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in unit test but not when submitted to Spark?

2014-12-24 Thread Sean Owen
!!! target: {}", target.getUri().toString()); > > String response = > target.request().accept(MediaType.APPLICATION_JSON_TYPE).get(String.class); > > logger.warn("!!! DEBUG !!! Spotlight response: {}", response); > > When run inside a unit test as follo

Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in unit test but not when submitted to Spark?

2014-12-24 Thread Emre Sevinc
…target: {}", target.getUri().toString()); String response = target.request().accept(MediaType.APPLICATION_JSON_TYPE).get(String.class); logger.warn("!!! DEBUG !!! Spotlight response: {}", response); When run inside a unit test as follows: mvn clean test -Dtest=SpotlightTest#testC…

Re: How can I make Spark Streaming count the words in a file in a unit test?

2014-12-08 Thread Burak Yavuz
…Best, Burak. ----- Original Message ----- From: "Emre Sevinc" To: user@spark.apache.org Sent: Monday, December 8, 2014 2:36:41 AM Subject: How can I make Spark Streaming count the words in a file in a unit test? Hello, I've successfully built a very simple Spark Streaming appl…

How can I make Spark Streaming count the words in a file in a unit test?

2014-12-08 Thread Emre Sevinc
…application to my local Spark: it waits for a file to be written to a given directory, and when I create that file it successfully prints the number of words. I terminate the application by pressing Ctrl+C. Now I've tried to create a very basic unit test for this functionality, but in the test I was n…
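
A deterministic alternative to watching a directory in the test (my suggestion, not one from the thread) is to feed the DStream from an in-memory queue; a sketch, assuming an existing SparkContext sc:

  import scala.collection.mutable
  import org.apache.spark.streaming.{Seconds, StreamingContext}

  val ssc = new StreamingContext(sc, Seconds(1))
  val queue = mutable.Queue(sc.parallelize(Seq("a b a")))
  val counts = ssc.queueStream(queue)
    .flatMap(_.split(" "))
    .map((_, 1))
    .reduceByKey(_ + _)
  counts.foreachRDD(rdd => rdd.collect().foreach(println))
  ssc.start()
  ssc.awaitTerminationOrTimeout(3000)  // let a few batches run, then return
  ssc.stop(stopSparkContext = false)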

Re: Cannot run unit test.

2014-09-17 Thread Jies
…http://apache-spark-user-list.1001560.n3.nabble.com/Cannot-run-unit-test-tp14459p14506.html

Re: Unit Test for Spark Streaming

2014-08-08 Thread JiajiaJing
…(pom.xml test dependencies, flattened: junit:junit:4.8.1, scope test; org.scalatest:scalatest_2.10:2.2.1, scope test) Thank you very much! Best Regards, Jiajia

Re: Unit Test for Spark Streaming

2014-08-06 Thread Tathagata Das
Does it not show the name of the test suite on stdout, indicating that it has passed? Can you try writing a small "test" unit test, in the same way as your Kafka unit test, with print statements on stdout, to see whether it works? I believe it is some configuration issue in Maven, whi…

Re: Unit Test for Spark Streaming

2014-08-06 Thread JiajiaJing
…the test? Are there any other methods that can be used to run this test?

Re: Unit Test for Spark Streaming

2014-08-05 Thread Tathagata Das
…trying to run the KafkaStreamSuite.scala unit test. > I added "scalatest-maven-plugin" to my pom.xml, then ran "mvn test", and got the following error message: > error: object Utils in package util cannot be accessed in package org.apache.spark.util…

Re: Unit Test for Spark Streaming

2014-08-05 Thread JiajiaJing
Hi TD, I encountered a problem when trying to run the KafkaStreamSuite.scala unit test. I added "scalatest-maven-plugin" to my pom.xml, then ran "mvn test", and got the following error message: error: object Utils in package util cannot be accessed in package o…

Re: Unit Test for Spark Streaming

2014-08-04 Thread JiajiaJing
This helps a lot!! Thank you very much! Jiajia

Re: Unit Test for Spark Streaming

2014-08-04 Thread Tathagata Das
Appropriately timed question! Here is the PR that adds a real unit test for the Kafka stream in Spark Streaming. Maybe this will help! https://github.com/apache/spark/pull/1751/files On Mon, Aug 4, 2014 at 6:30 PM, JiajiaJing wrote: > Hello Spark Users, I have a spark streaming pro…

Unit Test for Spark Streaming

2014-08-04 Thread JiajiaJing
Hello Spark Users, I have a Spark Streaming program that streams data from Kafka topics and outputs the data as Parquet files on HDFS. Now I want to write a unit test for this program to make sure the output data is correct (i.e., not missing any data from Kafka). However, I have no idea how to do this…

Re: Run spark unit test on Windows 7

2014-07-03 Thread Denny Lee
…at junit.framework.TestSuite.runTest(TestSuite.java:232) at junit.framework.TestSuite.run(TestSuite.java:227) at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:81)…

Re: Run spark unit test on Windows 7

2014-07-03 Thread Kostiantyn Kudriavtsev
…at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:67) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at…

Re: Run spark unit test on Windows 7

2014-07-03 Thread Denny Lee
…why the path includes "null" though. Could you provide the full stack trace? Andrew. 2014-07-02 9:38 GMT-07:00 Konstantin Kudryavtsev: Hi all, I'm trying to run some transformation on Spark; it works fine on a cluster (YARN, Linux machines). However, when I'm trying to run…

Re: Run spark unit test on Windows 7

2014-07-02 Thread Konstantin Kudryavtsev
…java.lang.reflect.Method.invoke(Method.java:606) at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120). Thank you, Konstantin Kudryavtsev. On Wed, Jul 2, 2014 at 8:15 PM, Andrew Or wrote: > Hi Konstati…

Re: Run spark unit test on Windows 7

2014-07-02 Thread Denny Lee
…includes "null" though. > Could you provide the full stack trace? > Andrew > 2014-07-02 9:38 GMT-07:00 Konstantin Kudryavtsev <kudryavtsev.konstan...@gmail.com>: > Hi all,…

Re: Run spark unit test on Windows 7

2014-07-02 Thread Kostiantyn Kudriavtsev
…AppMain.java:120). > Thank you, Konstantin Kudryavtsev. > On Wed, Jul 2, 2014 at 8:15 PM, Andrew Or wrote: > Hi Konstatin, we use hadoop as a library in a few places in Spark. I wonder why the path includ…

Re: Run spark unit test on Windows 7

2014-07-02 Thread Denny Lee
…Andrew Or wrote: > Hi Konstatin, we use hadoop as a library in a few places in Spark. I wonder why the path includes "null" though. > Could you provide the full stack trace? > Andrew > 2014-07…

Re: Run spark unit test on Windows 7

2014-07-02 Thread Konstantin Kudryavtsev
…2014-07-02 9:38 GMT-07:00 Konstantin Kudryavtsev <kudryavtsev.konstan...@gmail.com>: > Hi all, I'm trying to run some transformation on Spark; it works fine on a cluster (YARN, Linux machines). However, when I'm trying to run it on a local…

Re: Run spark unit test on Windows 7

2014-07-02 Thread Andrew Or
…> I'm trying to run some transformation on Spark; it works fine on a cluster (YARN, Linux machines). However, when I'm trying to run it on a local machine (Windows 7) under a unit test, I got errors: java.io.IOException: Could not locate executable null\b…

Run spark unit test on Windows 7

2014-07-02 Thread Konstantin Kudryavtsev
Hi all, I'm trying to run some transformation on Spark; it works fine on a cluster (YARN, Linux machines). However, when I'm trying to run it on a local machine (Windows 7) under a unit test, I got errors: java.io.IOException: Could not locate executable null\bin\winutils.exe in…
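
The "null\bin\winutils.exe" path means Hadoop's home directory is unset on the Windows machine; a sketch of the common workaround (my assumption, not a fix confirmed in the thread; the path is hypothetical):

  // Run once before the first filesystem access in the test JVM.
  // Expects winutils.exe at C:\hadoop\bin\winutils.exe.
  System.setProperty("hadoop.home.dir", "C:\\hadoop")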

Re: Unit test failure: Address already in use

2014-06-18 Thread Philip Ogren
…Sent: Wednesday, June 18, 2014 12:33 AM To: user@spark.apache.org Subject: Re: Unit test failure: Address already in use. Hi, could your problem come from the fact that you run your tests in parallel? If you run Spark in local mode, you cannot have concurrent Spark instances running. This means tha…
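
A sketch of the usual fixes for this bind error (my illustration, not code from the thread): serialize the suites and fully release the driver port between them:

  // In each suite's teardown (assuming ScalaTest with BeforeAndAfterAll):
  // stop the context and clear the cached driver port so the next suite can
  // bind again (older Spark versions stored it in a system property).
  override def afterAll(): Unit = {
    sc.stop()
    System.clearProperty("spark.driver.port")
    super.afterAll()
  }
  // And in build.sbt (assumed): parallelExecution in Test := false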

RE: Unit test failure: Address already in use

2014-06-18 Thread Lisonbee, Todd
…, Todd. From: Anselme Vignon [mailto:anselme.vig...@flaminem.com] Sent: Wednesday, June 18, 2014 12:33 AM To: user@spark.apache.org Subject: Re: Unit test failure: Address already in use. Hi, could your problem come from the fact that you run your tests in parallel? If you run Spark in local mode…

Re: Unit test failure: Address already in use

2014-06-18 Thread Anselme Vignon
…at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:77) > thanks

Unit test failure: Address already in use

2014-06-17 Thread SK
…ServerSocketChannelImpl.java:139) at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:77) thanks

printing in unit test

2014-06-13 Thread SK
Hi, my unit test is failing (the output is not matching the expected output). I would like to print out the value of the output, but rdd.foreach(r => println(r)) does not work from the unit test. How can I print or write out the output to a file/screen? thanks.
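
The foreach closure runs on the executors, so its println never reaches the test JVM's stdout; a sketch of the usual workaround (assuming the RDD is small enough to collect):

  // Bring the data back to the driver (the test JVM) before printing.
  rdd.collect().foreach(println)
  // Or persist it for inspection (hypothetical path):
  // rdd.saveAsTextFile("target/test-output")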

unit test

2014-06-06 Thread b0c1
…-> Elasticsearch => Spark (map/reduce) -> HBase. 2. Can Spark read data from Elasticsearch? What is the preferred way to do this? b0c1