Sorry :) BTW, there is another related issue here:
https://issues.apache.org/jira/browse/SPARK-17756


On 11/30/2016 05:12 PM, Nicholas Chammas wrote:
> > -1 (non binding) https://issues.apache.org/jira/browse/SPARK-16589
> No matter how useless in practice this shouldn't go to another major
> release.
>
> I agree that that issue is a major one since it relates to
> correctness, but since it's not a regression it technically does not
> merit a -1 vote on the release.
>
> Nick
>
> On Wed, Nov 30, 2016 at 11:00 AM Maciej Szymkiewicz
> <mszymkiew...@gmail.com> wrote:
>
>     -1 (non binding) https://issues.apache.org/jira/browse/SPARK-16589
>     No matter how useless in practice this shouldn't go to another
>     major release.
>
>
>
>     On 11/30/2016 10:34 AM, Sean Owen wrote:
>>     FWIW I am seeing several test failures, each more than once, but
>>     none are reliably reproducible. These are likely just flaky
>>     tests, but I thought I'd flag them in case anyone else sees
>>     similar failures:
>>
>>
>>     - SELECT a.i, b.i FROM oneToTen a JOIN oneToTen b ON a.i = b.i +
>>     1 *** FAILED ***
>>       org.apache.spark.SparkException: Job aborted due to stage
>>     failure: Task 1 in stage 9.0 failed 1 times, most recent failure:
>>     Lost task 1.0 in stage 9.0 (TID 19, localhost, executor driver):
>>     java.lang.NullPointerException
>>     at org.apache.spark.sql.catalyst.expressions.GeneratedClass$SpecificUnsafeProjection.<init>(Unknown Source)
>>     at org.apache.spark.sql.catalyst.expressions.GeneratedClass.generate(Unknown Source)
>>       ...
>>
>>
>>     udf3Test(test.org.apache.spark.sql.JavaUDFSuite)  Time elapsed:
>>     0.302 sec  <<< ERROR!
>>     java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.JavaTypeInference$.inferDataType(Lcom/google/common/reflect/TypeToken;)Lscala/Tuple2;
>>     at test.org.apache.spark.sql.JavaUDFSuite.udf3Test(JavaUDFSuite.java:107)
>>
>>
>>
>>     - SPARK-18360: default table path of tables in default database
>>     should depend on the location of default database *** FAILED ***
>>       Timeout of './bin/spark-submit' '--class'
>>     'org.apache.spark.sql.hive.SPARK_18360' '--name' 'SPARK-18360'
>>     '--master' 'local-cluster[2,1,1024]' '--conf'
>>     'spark.ui.enabled=false' '--conf'
>>     'spark.master.rest.enabled=false' '--driver-java-options'
>>     '-Dderby.system.durability=test'
>>     'file:/home/srowen/spark-2.1.0/sql/hive/target/tmp/spark-dc9f43f2-ded4-4bcf-947e-d5af6f0e1561/testJar-1480440084611.jar'
>>     See the log4j logs for more detail.
>>     ...
>>
>>
>>     - should clone and clean line object in ClosureCleaner *** FAILED ***
>>       isContain was true Interpreter output contained 'Exception':
>>       java.lang.IllegalStateException: Cannot call methods on a
>>     stopped SparkContext.
>>       This stopped SparkContext was created at:
>>       
>>
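[Editor's note: the first failure above exercises a plain self-join; the NullPointerException comes from Spark's generated code, not from the query logic. As a sketch of what that test expects, the same query can be run against any SQL engine — here SQLite via Python's stdlib, with an in-memory table standing in for the test's oneToTen:]

```python
import sqlite3

# Recreate the test's oneToTen table (values 1..10) and run the same
# self-join that fails in Spark's codegen path.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE oneToTen (i INTEGER)")
conn.executemany("INSERT INTO oneToTen VALUES (?)",
                 [(i,) for i in range(1, 11)])
rows = conn.execute(
    "SELECT a.i, b.i FROM oneToTen a JOIN oneToTen b ON a.i = b.i + 1"
).fetchall()
# The join matches each i in 2..10 with its predecessor, giving nine
# (n, n - 1) pairs.
print(sorted(rows))
```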
>>
>>     On Tue, Nov 29, 2016 at 5:31 PM Marcelo Vanzin
>>     <van...@cloudera.com> wrote:
>>
>>         I'll send a -1 because of SPARK-18546. Haven't looked at
>>         anything else yet.
>>
>>         On Mon, Nov 28, 2016 at 5:25 PM, Reynold Xin
>>         <r...@databricks.com> wrote:
>>         > Please vote on releasing the following candidate as Apache
>>         Spark version
>>         > 2.1.0. The vote is open until Thursday, December 1, 2016 at
>>         18:00 UTC and
>>         > passes if a majority of at least 3 +1 PMC votes are cast.
>>         >
>>         > [ ] +1 Release this package as Apache Spark 2.1.0
>>         > [ ] -1 Do not release this package because ...
>>         >
>>         >
>>         > To learn more about Apache Spark, please see
>>         http://spark.apache.org/
>>         >
>>         > The tag to be voted on is v2.1.0-rc1
>>         > (80aabc0bd33dc5661a90133156247e7a8c1bf7f5)
>>         >
>>         > The release files, including signatures, digests, etc. can
>>         be found at:
>>         > http://people.apache.org/~pwendell/spark-releases/spark-2.1.0-rc1-bin/
>>         >
>>         > Release artifacts are signed with the following key:
>>         > https://people.apache.org/keys/committer/pwendell.asc
>>         >
>>         > The staging repository for this release can be found at:
>>         > https://repository.apache.org/content/repositories/orgapachespark-1216/
>>         >
>>         > The documentation corresponding to this release can be
>>         found at:
>>         > http://people.apache.org/~pwendell/spark-releases/spark-2.1.0-rc1-docs/
>>         >
>>         >
>>         > =======================================
>>         > How can I help test this release?
>>         > =======================================
>>         > If you are a Spark user, you can help us test this release
>>         by taking an
>>         > existing Spark workload and running on this release
>>         candidate, then
>>         > reporting any regressions.
>>         >
>>         > ===============================================================
>>         > What should happen to JIRA tickets still targeting 2.1.0?
>>         > ===============================================================
>>         > Committers should look at those and triage. Extremely
>>         important bug fixes,
>>         > documentation, and API tweaks that impact compatibility
>>         should be worked on
>>         > immediately. Everything else please retarget to 2.1.1 or 2.2.0.
>>         >
>>         >
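[Editor's note: for anyone following the "how can I help test" instructions above, one early verification step is checking a downloaded artifact against its published digest. A minimal sketch of the digest check follows — the file name and contents here are fabricated purely to show the mechanics; the real tarballs and their digest/signature files live at the staging URL above:]

```python
import hashlib

def sha512_hex(path):
    """Hex SHA-512 of a file, read in chunks so large tarballs fit in memory."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Fabricated stand-in for a downloaded release tarball.
with open("artifact.tgz", "wb") as f:
    f.write(b"example artifact bytes")

# In a real check, the published digest comes from the artifact's digest
# file in the release directory; here we compute it from the same bytes.
published = hashlib.sha512(b"example artifact bytes").hexdigest()
print("OK" if sha512_hex("artifact.tgz") == published else "MISMATCH")
```

[This covers only the digest side; the signature should also be verified with gpg against pwendell's key linked above.]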
>>
>>
>>
>>         --
>>         Marcelo
>>
>>         ---------------------------------------------------------------------
>>         To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>
>     -- 
>     Maciej Szymkiewicz
>

-- 
Maciej Szymkiewicz
