Sim,

Can you try increasing the PermGen size? Please let me know what setting
you are using when the problem disappears.
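For reference, a minimal sketch of how the driver's PermGen can be raised when
launching spark-shell. The 256m value is only illustrative, and this assumes a
Java 7 (or earlier) JVM, where -XX:MaxPermSize still applies; the flag is
ignored on Java 8, which removed PermGen:

```shell
# Raise the driver's PermGen for an interactive spark-shell session.
# 256m is an illustrative starting point; tune it for your workload.
./bin/spark-shell --driver-java-options "-XX:MaxPermSize=256m"

# Equivalent persistent setting in conf/spark-defaults.conf:
# spark.driver.extraJavaOptions  -XX:MaxPermSize=256m
```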

Thanks,

Yin

On Sun, Jul 5, 2015 at 5:59 PM, Denny Lee <denny.g....@gmail.com> wrote:

> I had run into the same problem, where everything was working swimmingly
> with Spark 1.3.1.  When I switched to Spark 1.4, either upgrading to
> Java 8 (from Java 7) or bumping up the PermGen size solved my issue.
> HTH!
>
>
>
> On Mon, Jul 6, 2015 at 8:31 AM Andy Huang <andy.hu...@servian.com.au>
> wrote:
>
>> We have hit the same issue in the Spark shell when registering a temp table.
>> We observed it happening for those who had JDK 6. The problem went away
>> after installing JDK 8. This occurred only with the tutorial materials, which
>> were about loading a Parquet file.
>>
>> Regards
>> Andy
>>
>> On Sat, Jul 4, 2015 at 2:54 AM, sim <s...@swoop.com> wrote:
>>
>>> @bipin, in my case the error happens immediately in a fresh shell in
>>> 1.4.0.
>>>
>>>
>>>
>>>
>>>
>>>
>>
>>
>> --
>> Andy Huang | Managing Consultant | Servian Pty Ltd | t: 02 9376 0700 |
>> f: 02 9376 0730| m: 0433221979
>>
>
