Search JIRA ... https://issues.apache.org/jira/browse/SPARK-24417

On Mon, Mar 11, 2019 at 1:03 PM Sudhir Menon <sme...@snappydata.io> wrote:
>
> Is there a timeline for Spark 3.0?
> Or more specifically, is there a timeline for moving to Java 9 and beyond?
>
> Thanks in advance
> Suds
>
>
>
> On Tue, Nov 6, 2018 at 9:16 AM Felix Cheung <felixcheun...@hotmail.com> wrote:
>>
>> +1 for Spark 3, definitely
>> Thanks for the updates
>>
>>
>> ________________________________
>> From: Sean Owen <sro...@gmail.com>
>> Sent: Tuesday, November 6, 2018 9:11 AM
>> To: Felix Cheung
>> Cc: dev
>> Subject: Re: Java 11 support
>>
>> I think that Java 9 support basically gets us Java 10 and 11 support. But
>> the jump from 8 to 9 is unfortunately more breaking than usual because
>> of the total revamping of the internal JDK classes. I think it will be
>> mostly a matter of dependencies needing updates to work. I agree this
>> is probably pretty important for Spark 3. Here's the ticket I know of:
>> https://issues.apache.org/jira/browse/SPARK-24417 . DB is already
>> working on some of it, I see.
>> On Tue, Nov 6, 2018 at 10:59 AM Felix Cheung <felixcheun...@hotmail.com> 
>> wrote:
>> >
>> > Speaking of, can we work to support Java 11?
>> > That will fix all the problems below.
>> >
>> >
>> >
>> > ________________________________
>> > From: Felix Cheung <felixcheun...@hotmail.com>
>> > Sent: Tuesday, November 6, 2018 8:57 AM
>> > To: Wenchen Fan
>> > Cc: Matei Zaharia; Sean Owen; Spark dev list; Shivaram Venkataraman
>> > Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
>> >
>> > We have not been able to publish to CRAN for quite some time (since 2.3.0 
>> > was archived - the cause is Java 11)
>> >
>> > I think it’s ok to announce the release of 2.4.0
>> >
>> >
>> > ________________________________
>> > From: Wenchen Fan <cloud0...@gmail.com>
>> > Sent: Tuesday, November 6, 2018 8:51 AM
>> > To: Felix Cheung
>> > Cc: Matei Zaharia; Sean Owen; Spark dev list; Shivaram Venkataraman
>> > Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
>> >
>> > Do you mean we should have a 2.4.0 release without CRAN and then do a 
>> > 2.4.1 immediately?
>> >
>> > On Wed, Nov 7, 2018 at 12:34 AM Felix Cheung <felixcheun...@hotmail.com> 
>> > wrote:
>> >>
>> >> Shivaram and I were discussing.
>> >> Actually we worked with them before. Another possible approach is to 
>> >> remove the vignette eval and all tests from the source package... in the 
>> >> next release.
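>> >>
>> >> For illustration, that could look roughly like the sketch below (an
>> >> assumption about the approach only, not code that exists in SparkR
>> >> today): gate the vignette chunks behind an environment check and keep
>> >> the tests out of the built source package.
>> >>
>> >> # Hypothetical setup chunk in sparkr-vignettes.Rmd: evaluate chunks only
>> >> # when explicitly running outside CRAN (NOT_CRAN is a common convention
>> >> # set by devtools/testthat tooling).
>> >> knitr::opts_chunk$set(eval = identical(Sys.getenv("NOT_CRAN"), "true"))
>> >>
>> >> # And a .Rbuildignore entry so the tests directory is dropped from the
>> >> # built source tarball:
>> >> # ^tests$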
>> >>
>> >>
>> >> ________________________________
>> >> From: Matei Zaharia <matei.zaha...@gmail.com>
>> >> Sent: Tuesday, November 6, 2018 12:07 AM
>> >> To: Felix Cheung
>> >> Cc: Sean Owen; dev; Shivaram Venkataraman
>> >> Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
>> >>
>> >> Maybe it’s worth contacting the CRAN maintainers to ask for help? Perhaps 
>> >> we aren’t disabling it correctly, or perhaps they can ignore this 
>> >> specific failure. +Shivaram who might have some ideas.
>> >>
>> >> Matei
>> >>
>> >> > On Nov 5, 2018, at 9:09 PM, Felix Cheung <felixcheun...@hotmail.com> 
>> >> > wrote:
>> >> >
>> >> > I don't know what the cause is yet.
>> >> >
>> >> > The test should be skipped because of this check
>> >> > https://github.com/apache/spark/blob/branch-2.4/R/pkg/inst/tests/testthat/test_basic.R#L21
>> >> >
>> >> > And this
>> >> > https://github.com/apache/spark/blob/branch-2.4/R/pkg/inst/tests/testthat/test_basic.R#L57
>> >> >
>> >> > But it ran:
>> >> > callJStatic("org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper", 
>> >> > "fit", formula,
>> >> >
>> >> > The earlier release was archived because of Java 11+ too, so this 
>> >> > unfortunately isn't new.
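>> >> >
>> >> > The guard those checks are meant to provide would look roughly like
>> >> > this (a hedged sketch only; the actual code at the linked lines may
>> >> > differ):
>> >> >
>> >> > # Illustrative testthat guard: skip the test unless Java 8 is the
>> >> > # active JVM. The helper name and version pattern are assumptions.
>> >> > java_is_8 <- function() {
>> >> >   out <- tryCatch(
>> >> >     suppressWarnings(system2("java", "-version", stdout = TRUE, stderr = TRUE)),
>> >> >     error = function(e) character(0)
>> >> >   )
>> >> >   any(grepl("version \"1\\.8", out))
>> >> > }
>> >> >
>> >> > testthat::test_that("spark.glm and predict", {
>> >> >   testthat::skip_if_not(java_is_8(), "Java 8 not found; skipping")
>> >> >   # the real assertions would follow here
>> >> > })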
>> >> >
>> >> >
>> >> > From: Sean Owen <sro...@gmail.com>
>> >> > Sent: Monday, November 5, 2018 7:22 PM
>> >> > To: Felix Cheung
>> >> > Cc: dev
>> >> > Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
>> >> >
>> >> > What can we do to get the release through? Is there any way to
>> >> > circumvent these tests or otherwise hack it? Or does it need a
>> >> > maintenance release?
>> >> > On Mon, Nov 5, 2018 at 8:53 PM Felix Cheung <felixcheun...@hotmail.com> 
>> >> > wrote:
>> >> > >
>> >> > > FYI. SparkR submission failed. It seems to detect Java 11 correctly 
>> >> > > in the vignettes, but the tests are not being skipped as expected.
>> >> > >
>> >> > > Error: processing vignette 'sparkr-vignettes.Rmd' failed with 
>> >> > > diagnostics:
>> >> > > Java version 8 is required for this package; found version: 11.0.1
>> >> > > Execution halted
>> >> > >
>> >> > > * checking PDF version of manual ... OK
>> >> > > * DONE
>> >> > > Status: 1 WARNING, 1 NOTE
>> >> > >
>> >> > > Current CRAN status: ERROR: 1, OK: 1
>> >> > > See: <https://CRAN.R-project.org/web/checks/check_results_SparkR.html>
>> >> > >
>> >> > > Version: 2.3.0
>> >> > > Check: tests, Result: ERROR
>> >> > > Running 'run-all.R' [8s/35s]
>> >> > > Running the tests in 'tests/run-all.R' failed.
>> >> > > Last 13 lines of output:
>> >> > > 4: callJStatic("org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper",
>> >> > >        "fit", formula, data@sdf, tolower(family$family), family$link, tol,
>> >> > >        as.integer(maxIter), weightCol, regParam, as.double(var.power),
>> >> > >        as.double(link.power), stringIndexerOrderType, offsetCol)
>> >> > > 5: invokeJava(isStatic = TRUE, className, methodName, ...)
>> >> > > 6: handleErrors(returnStatus, conn)
>> >> > > 7: stop(readString(conn))
>> >> > >
>> >> > > ══ testthat results ═══════════════════════════════════════════════════
>> >> > > OK: 0 SKIPPED: 0 FAILED: 2
>> >> > > 1. Error: create DataFrame from list or data.frame (@test_basic.R#26)
>> >> > > 2. Error: spark.glm and predict (@test_basic.R#58)
>> >> > >
>> >> > >
>> >> > >
>> >> > > ---------- Forwarded message ---------
>> >> > > Date: Mon, Nov 5, 2018, 10:12
>> >> > > Subject: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
>> >> > >
>> >> > > Dear maintainer,
>> >> > >
>> >> > > package SparkR_2.4.0.tar.gz does not pass the incoming checks 
>> >> > > automatically, please see the following pre-tests:
>> >> > > Windows: 
>> >> > > <https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/Windows/00check.log>
>> >> > > Status: 1 NOTE
>> >> > > Debian: 
>> >> > > <https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/Debian/00check.log>
>> >> > > Status: 1 WARNING, 1 NOTE
>> >> > >
>> >> > > Last released version's CRAN status: ERROR: 1, OK: 1
>> >> > > See: <https://CRAN.R-project.org/web/checks/check_results_SparkR.html>
>> >> > >
>> >> > > CRAN Web: <https://cran.r-project.org/package=SparkR>
>> >> > >
>> >> > > Please fix all problems and resubmit a fixed version via the webform.
>> >> > > If you are not sure how to fix the problems shown, please ask for 
>> >> > > help on the R-package-devel mailing list:
>> >> > > <https://stat.ethz.ch/mailman/listinfo/r-package-devel>
>> >> > > If you are fairly certain the rejection is a false positive, please 
>> >> > > reply-all to this message and explain.
>> >> > >
>> >> > > More details are given in the directory:
>> >> > > <https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/>
>> >> > > The files will be removed after roughly 7 days.
>> >> > >
>> >> > > No strong reverse dependencies to be checked.
>> >> > >
>> >> > > Best regards,
>> >> > > CRAN teams' auto-check service
>> >> > > Flavor: r-devel-linux-x86_64-debian-gcc, r-devel-windows-ix86+x86_64
>> >> > > Check: CRAN incoming feasibility, Result: NOTE
>> >> > > Maintainer: 'Shivaram Venkataraman <shiva...@cs.berkeley.edu>'
>> >> > >
>> >> > > New submission
>> >> > >
>> >> > > Package was archived on CRAN
>> >> > >
>> >> > > Possibly mis-spelled words in DESCRIPTION:
>> >> > > Frontend (4:10, 5:28)
>> >> > >
>> >> > > CRAN repository db overrides:
>> >> > > X-CRAN-Comment: Archived on 2018-05-01 as check problems were not
>> >> > > corrected despite reminders.
>> >> > >
>> >> > > Flavor: r-devel-linux-x86_64-debian-gcc
>> >> > > Check: re-building of vignette outputs, Result: WARNING
>> >> > > Error in re-building vignettes:
>> >> > > ...
>> >> > >
>> >> > > Attaching package: 'SparkR'
>> >> > >
>> >> > > The following objects are masked from 'package:stats':
>> >> > >
>> >> > > cov, filter, lag, na.omit, predict, sd, var, window
>> >> > >
>> >> > > The following objects are masked from 'package:base':
>> >> > >
>> >> > > as.data.frame, colnames, colnames<-, drop, endsWith,
>> >> > > intersect, rank, rbind, sample, startsWith, subset, summary,
>> >> > > transform, union
>> >> > >
>> >> > > trying URL 
>> >> > > 'http://mirror.klaus-uwe.me/apache/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz'
>> >> > > Content type 'application/octet-stream' length 227893062 bytes (217.3 
>> >> > > MB)
>> >> > > ==================================================
>> >> > > downloaded 217.3 MB
>> >> > >
>> >> > > Quitting from lines 65-67 (sparkr-vignettes.Rmd)
>> >> > > Error: processing vignette 'sparkr-vignettes.Rmd' failed with 
>> >> > > diagnostics:
>> >> > > Java version 8 is required for this package; found version: 11.0.1
>> >> > > Execution halted
>> >>
>
>
>
> --
> Sudhir Menon
> snappydata.io
> 503-724-1481 (c)
> Real time operational analytics at the speed of thought
> Checkout the SnappyData iSight cloud for AWS

---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
