In spark-1.0.2, I have come across an error when I try to broadcast quite a
large numpy array (35M-dimensional). The error information (the
java.lang.NegativeArraySizeException and the accompanying details) is listed
below. Moreover, when I broadcast a relatively smaller numpy array
(30M-dimensional), everything works fine.
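
For reference, a minimal sketch of the setup that triggers this (the
SparkContext arguments and the exact element counts here are illustrative):

    # Rough repro sketch; sizes and context setup are illustrative.
    import numpy as np
    from pyspark import SparkContext

    sc = SparkContext("local[4]", "broadcast-repro")

    big = np.random.rand(35 * 10**6)    # ~35M elements
    bc_big = sc.broadcast(big)          # fails: java.lang.NegativeArraySizeException

    small = np.random.rand(30 * 10**6)  # ~30M elements
    bc_small = sc.broadcast(small)      # works fine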
At 12:29 PM, Davies Liu-2 [via Apache Spark User List] wrote:
> This PR fixes the problem: https://github.com/apache/spark/pull/2659
>
> cc @josh
>
> Davies
>
> On Tue, Nov 11, 2014 at 7:47 PM, bliuab <[hidden email]> wrote:
>
> ...larger than 2G, I didn't read your post carefully.
>
> The broadcast in Python has been improved a lot since 1.1; I think it
> will work in 1.1 or the upcoming 1.2 release. Could you upgrade to 1.1?
>
> Davies
>
> On Tue, Nov 11, 2014 at 8:37 PM, bliuab <[hidden email]> wrote:
>>
>> Dear Liu:
>> ...
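
Until an upgrade to 1.1+ as Davies suggests, one common stopgap for payloads
near the 2G serialization limit is to broadcast the array in smaller pieces;
below is a rough sketch (the chunk count and SparkContext setup are
illustrative assumptions, not something taken from this thread):

    # Stopgap sketch: split a large array and broadcast the pieces,
    # keeping each serialized chunk well below the 2G limit.
    import numpy as np
    from pyspark import SparkContext

    sc = SparkContext("local[4]", "chunked-broadcast")  # illustrative setup

    big = np.random.rand(35 * 10**6)
    pieces = [sc.broadcast(chunk) for chunk in np.array_split(big, 8)]

    # Reassemble on demand (on the driver or inside a task closure):
    def rebuild():
        return np.concatenate([p.value for p in pieces])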