I suggest we stick to 2.10.3, since otherwise it seems that (surprisingly)
you force everyone to upgrade.
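
For what it's worth, staying on 2.10.3 is just a matter of pinning the Scala version in the build; a minimal build.sbt sketch (the versions and the spark-core coordinates below are illustrative, check them against the release you actually use):

```scala
// build.sbt sketch -- versions here are illustrative, not prescriptive.
scalaVersion := "2.10.3"

// %% appends the Scala *binary* version suffix (_2.10) to the artifact
// name; within a 2.10.x series artifacts are meant to interoperate,
// which is what makes the breakage described below surprising.
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"
```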


On Sun, Apr 6, 2014 at 1:46 PM, Koert Kuipers <ko...@tresata.com> wrote:

> Also, I thought Scala 2.10 was binary compatible, but that does not seem
> to be the case: the Spark artifacts built with Scala 2.10.4 don't work for
> me, since we are still on Scala 2.10.3, but when I recompiled and
> published Spark with Scala 2.10.3 everything was fine again.
>
> The error I see:
> java.lang.ClassNotFoundException: scala.None$
>
> fun stuff!
>
>
> On Sun, Apr 6, 2014 at 12:13 PM, Koert Kuipers <ko...@tresata.com> wrote:
>
>> Patrick,
>> this has happened before: a commit introduced Java 7 code/dependencies
>> and your build didn't fail. I think it was when Reynold upgraded to
>> Jetty 9. It must be that your entire build infrastructure runs Java 7...
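
[One way a Java 7 build box could still catch this: point the compilers at a Java 6 boot classpath, so Java 7-only APIs fail to resolve at compile time. A hedged, sbt-style sketch, not Spark's actual build configuration; the rt.jar path is illustrative:]

```scala
// sbt-style sketch: cross-compile against the Java 6 class libraries
// even when the build machine itself runs Java 7. Paths are illustrative.
// Note that -source/-target alone do NOT flag Java 7-only library APIs
// such as Logger.getGlobal; only compiling against a Java 6 rt.jar does.
scalacOptions ++= Seq("-javabootclasspath", "/opt/jdk1.6/jre/lib/rt.jar")
javacOptions  ++= Seq("-source", "1.6", "-target", "1.6",
                      "-bootclasspath", "/opt/jdk1.6/jre/lib/rt.jar")
```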
>>
>>
>> On Sat, Apr 5, 2014 at 6:06 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>>
>>> If you want to submit a hot fix for this issue specifically, please
>>> do. I'm not sure why it didn't fail our build...
>>>
>>>
>>> On Sat, Apr 5, 2014 at 2:30 PM, Debasish Das <debasish.da...@gmail.com> wrote:
>>>
>>> > I verified this is happening for both CDH 4.5 and 1.0.4... My deploy
>>> > environment is Java 6, so compiling with Java 7 is not going to help...
>>> >
>>> > Is this the PR that caused it?
>>> >
>>> > Andre Schumacher
>>> >
>>> >     fbebaed    Spark parquet improvements
>>> >
>>> >     A few improvements to the Parquet support for SQL queries:
>>> >     - Instead of files, a ParquetRelation is now backed by a
>>> >       directory, which simplifies importing data from other sources
>>> >     - InsertIntoParquetTable operation now supports switching between
>>> >       overwriting or appending (at least in HiveQL)
>>> >     - tests now use the new API
>>> >     - Parquet logging can be set to WARNING level (default)
>>> >     - Default compression for Parquet files (GZIP, as in parquet-mr)
>>> >
>>> >     Author: Andre Schumacher &...    2 days ago    SPARK-1383
>>> >
>>> > I will go back to a stable check-in before this one.
>>> >
>>> >
>>> >
>>> >
>>> > On Sat, Apr 5, 2014 at 2:22 PM, Debasish Das <debasish.da...@gmail.com> wrote:
>>> >
>>> > > I can compile with Java 7...let me try that...
>>> > >
>>> > >
>>> > > On Sat, Apr 5, 2014 at 2:19 PM, Sean Owen <so...@cloudera.com> wrote:
>>> > >
>>> > >> That method was added in Java 7. The project is on Java 6, so I
>>> > >> think this was just an inadvertent error in a recent PR (it was the
>>> > >> 'Spark parquet improvements' one).
>>> > >>
>>> > >> I'll open a hot-fix PR after looking for other stuff like this that
>>> > >> might have snuck in.
>>> > >> --
>>> > >> Sean Owen | Director, Data Science | London
>>> > >>
>>> > >>
>>> > >> On Sat, Apr 5, 2014 at 10:04 PM, Debasish Das
>>> > >> <debasish.da...@gmail.com> wrote:
>>> > >> > I am synced with apache/spark master but am getting an error in
>>> > >> > spark/sql compilation...
>>> > >> >
>>> > >> > Is the master broken ?
>>> > >> >
>>> > >> > [info] Compiling 34 Scala sources to
>>> > >> > /home/debasish/spark_deploy/sql/core/target/scala-2.10/classes...
>>> > >> > [error]
>>> > >> > /home/debasish/spark_deploy/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetRelation.scala:106:
>>> > >> > value getGlobal is not a member of object java.util.logging.Logger
>>> > >> > [error]       logger.setParent(Logger.getGlobal)
>>> > >> > [error]                               ^
>>> > >> > [error] one error found
>>> > >> > [error] (sql/compile:compile) Compilation failed
>>> > >> > [error] Total time: 171 s, completed Apr 5, 2014 4:58:41 PM
>>> > >> >
>>> > >> > Thanks.
>>> > >> > Deb
>>> > >>
>>> > >
>>> > >
>>> >
>>>
>>
>>
>
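
For reference, a minimal Scala sketch of a Java 6-compatible replacement for the `Logger.getGlobal` call quoted in the error above. `Logger.getGlobal` only exists since Java 7, while looking the global logger up by its name "global" works on Java 6; the logger name "parquet" below is illustrative, not necessarily Spark's actual one:

```scala
import java.util.logging.Logger

// Java 7-only call that broke the Java 6 build:
//   logger.setParent(Logger.getGlobal)

// Java 6-compatible sketch: the global logger has carried the name
// "global" since before Java 7, so it can be looked up by name instead.
val logger = Logger.getLogger("parquet")  // illustrative logger name
logger.setParent(Logger.getLogger("global"))

println(logger.getParent.getName)  // prints "global"
```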
