Hello,
I am trying to run unit tests on PySpark.
When I try to run the unit tests, I run into errors.
krishna@Krishna:~/Experiment/spark$ ./python/run-tests
Running PySpark tests. Output is in /Users/krishna/Experiment/spark/python/unit-tests.log
Will test against the following Python executables:
I was able to resolve this by passing the argument below:
./python/run-tests --python-executables=python2.7
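For anyone who only needs to sanity-check PySpark locally before running the full harness, a minimal self-contained unit test can be run directly with whichever interpreter you want to target. This is an illustrative sketch, not part of the run-tests output above; the file name test_simple_rdd.py is hypothetical, and it assumes pyspark is importable (for example with SPARK_HOME/python on PYTHONPATH):

import unittest

from pyspark import SparkConf, SparkContext


class SimpleRDDTest(unittest.TestCase):
    """Smoke test against a local[2] master, no cluster required."""

    @classmethod
    def setUpClass(cls):
        # Small local "cluster" so the test needs nothing but a JVM.
        conf = SparkConf().setMaster("local[2]").setAppName("pyspark-unit-test")
        cls.sc = SparkContext(conf=conf)

    @classmethod
    def tearDownClass(cls):
        cls.sc.stop()

    def test_map_and_sum(self):
        # [1, 2, 3, 4] doubled sums to 20.
        rdd = self.sc.parallelize(range(1, 5))
        self.assertEqual(rdd.map(lambda x: x * 2).sum(), 20)


if __name__ == "__main__":
    unittest.main()

Invoking it as python2.7 test_simple_rdd.py exercises the same interpreter selection that --python-executables controls in run-tests.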
Thanks,
Krishna
On Thu, Nov 3, 2016 at 4:16 PM, Krishna Kalyan wrote:
> Hello,
> I am trying to run unit tests on pyspark.
>
> When I try to run unit test I am faced with errors.
> krishna@Krishna:~/Experiment/spark$ ./python/run-tests
--
*Per Ullberg*
Data Vault Tech Lead
Odin Uppsala
+46 701612693
Klarna AB (publ)
Sveavägen 46, 111 34 Stockholm
Tel: +46 8 120 120 00
Reg no: 556737-0431
klarna.com
+1
On Wed, Nov 2, 2016 at 5:40 PM, Reynold Xin wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 1.6.3. The vote is open until Sat, Nov 5, 2016 at 18:00 PDT and passes if a
> majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 1.6.3
+1
On Thu, Nov 3, 2016 at 6:58 PM, Michael Armbrust wrote:
> +1
>
> On Wed, Nov 2, 2016 at 5:40 PM, Reynold Xin wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 1.6.3. The vote is open until Sat, Nov 5, 2016 at 18:00 PDT and passes if a
>> majority of at least 3 +1 PMC votes are cast.
The following simple (PySpark) code fails in Spark 2.0.1:
It only fails with all three arguments to .agg; removing any one of them
prevents the failure. Similar code in Java also fails in the same way, so it
isn't specific to the Python API.
It runs without error in Spark 2.0.0, so I suspect it might be a regression.
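The archive preview above does not include the actual snippet, so purely as an illustration of the shape of the failing call, a hypothetical three-argument .agg in PySpark might look like the following (the real report may use different columns and aggregate functions; this is an assumption, not the reporter's code):

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.master("local[2]").appName("agg-example").getOrCreate()

# Toy DataFrame with a grouping key and a value column.
df = spark.createDataFrame([(1, "a"), (1, "b"), (2, "c")], ["id", "value"])

# Three aggregate expressions in a single .agg call; per the report, dropping
# any one of the three makes the query succeed again.
result = (df.groupBy("id")
            .agg(F.count("value"),
                 F.collect_list("value"),
                 F.countDistinct("value")))
result.show()

Running the same logical plan against 2.0.0 and 2.0.1 is the quickest way to confirm whether a given combination of aggregates triggers the failure.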
+1
On Thu, Nov 3, 2016 at 12:57 PM, Herman van Hövell tot Westerflier <hvanhov...@databricks.com> wrote:
> +1
>
> On Thu, Nov 3, 2016 at 6:58 PM, Michael Armbrust wrote:
>
>> +1
>>
>> On Wed, Nov 2, 2016 at 5:40 PM, Reynold Xin wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark version 1.6.3.
+1
On Wed, Nov 2, 2016 at 5:40 PM, Reynold Xin wrote:
> Please vote on releasing the following candidate as Apache Spark version
> 1.6.3. The vote is open until Sat, Nov 5, 2016 at 18:00 PDT and passes if a
> majority of at least 3 +1 PMC votes are cast.
>
> [ ] +1 Release this package as Apache Spark 1.6.3
+1 (non-binding)
It's built and tested on CentOS 6.8 / OpenJDK 1.8.0_111, too.
Cheers,
Dongjoon.
On 2016-11-03 14:30 (-0700), Davies Liu wrote:
> +1
>
> On Wed, Nov 2, 2016 at 5:40 PM, Reynold Xin wrote:
> > Please vote on releasing the following candidate as Apache Spark version
> > 1.6.3.
+1
Dongjoon Hyun wrote on Friday, November 4, 2016 at 9:44 AM:
> +1 (non-binding)
>
> It's built and tested on CentOS 6.8 / OpenJDK 1.8.0_111, too.
>
> Cheers,
> Dongjoon.
>
> On 2016-11-03 14:30 (-0700), Davies Liu wrote:
> > +1
> >
> > On Wed, Nov 2, 2016 at 5:40 PM, Reynold Xin wrote:
> > > Please vote on releasing the following candidate as Apache Spark version 1.6.3.
+1 (non-binding)
Cheers,
Liwei
On Fri, Nov 4, 2016 at 10:03 AM, Jeff Zhang wrote:
> +1
>
> Dongjoon Hyun wrote on Friday, November 4, 2016 at 9:44 AM:
>
>> +1 (non-binding)
>>
>> It's built and tested on CentOS 6.8 / OpenJDK 1.8.0_111, too.
>>
>> Cheers,
>> Dongjoon.
>>
>> On 2016-11-03 14:30 (-0700), Davies Liu wrote:
Heads up that 1.6.3 RC2 might be impacted by the change of the JSON.org
licenses to category-x (a disallowed dependency license), described in SPARK-18262.
I'm not sure I'll have time to evaluate it and cast a non-binding -1 before
the voting window closes.
-
busbey
On 2016-11-02 19:40 (-0500), Reynold Xin wrote:
+1 (non-binding)
- Kousuke
On 2016/11/03 9:40, Reynold Xin wrote:
Please vote on releasing the following candidate as Apache Spark
version 1.6.3. The vote is open until Sat, Nov 5, 2016 at 18:00 PDT
and passes if a majority of at least 3 +1 PMC votes are cast.
[ ] +1 Release this package as Apache Spark 1.6.3
+1
On Thu, Nov 3, 2016 at 9:51 PM, Kousuke Saruta wrote:
> +1 (non-binding)
>
> - Kousuke
>
> On 2016/11/03 9:40, Reynold Xin wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version
>> 1.6.3. The vote is open until Sat, Nov 5, 2016 at 18:00 PDT and passes if a
>> majority of at least 3 +1 PMC votes are cast.