amplab jenkins is down.
On Mon, Nov 30, 2020 at 3:25 PM shane knapp ☠ wrote:
> old jenkins is getting shut down Real Soon Now[tm]! crossing my fingers!
> :)
>
> On Mon, Nov 30, 2020 at 10:05 AM shane knapp ☠ wrote:
>
>> hey all!
>>
>> the Great Jenkins Migration[tm] is well under way, and we
old jenkins is getting shut down Real Soon Now[tm]! crossing my fingers!
:)
On Mon, Nov 30, 2020 at 10:05 AM shane knapp ☠ wrote:
> hey all!
>
> the Great Jenkins Migration[tm] is well under way, and we will be
> sunsetting the old amp-jenkins-master server and moving to a new one.
>
> i've put
Just want to say thank you to all the active SS contributors. I've seen that
many great features/improvements in Streaming have been merged and will be
available in the upcoming 3.1 release.
- Cache fetched list of files beyond maxFilesPerTrigger as unread file
(SPARK-32568)
- Streamline the logic
Jungtaek,
If there are contributors that you trust for reviews, then please let PMC
members know so they can be considered. I agree that is the best solution.
If there aren't contributors that the PMC wants to add as committers, then
I suggest agreeing on a temporary exception to help make progre
Thank you, Shane. :)
Bests,
Dongjoon.
On Mon, Nov 30, 2020 at 10:05 AM shane knapp ☠ wrote:
> hey all!
>
> the Great Jenkins Migration[tm] is well under way, and we will be
> sunsetting the old amp-jenkins-master server and moving to a new one.
>
> i've put jenkins into quiet mode so that it w
hey all!
the Great Jenkins Migration[tm] is well under way, and we will be
sunsetting the old amp-jenkins-master server and moving to a new one.
i've put jenkins into quiet mode so that it won't accept new builds and
we'll let the ones currently running finish. once that's done, i will be
rsync
Can you try with
val encoder = RowEncoder(schema).resolveAndBind()
...
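A minimal sketch of how resolveAndBind() and createDeserializer() fit together, assuming Spark 3.x; the two-column schema below is hypothetical and only for illustration:
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.catalyst.encoders.RowEncoder
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}
// hypothetical schema, for illustration only
val schema = StructType(Seq(
  StructField("id", IntegerType),
  StructField("name", StringType)))
// resolveAndBind() binds the deserializer expressions to the schema's own
// attributes; without it the encoder stays unresolved and deserialization
// can fail at runtime
val encoder = RowEncoder(schema).resolveAndBind()
val deserializer = encoder.createDeserializer()
// converts an InternalRow (e.g. from sparkPlan.execute()) into an external Row
def toExternalRow(internalRow: InternalRow): Row = deserializer(internalRow)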
On Mon, Nov 30, 2020 at 5:07 PM Jason Jun wrote:
> Thanks Jia, Wenchen for your reply.
>
> I've changed my code like below:
>
> val sparkPlan =
> sqlContext.sparkSession.sessionState.planner.plan(pipeLinePlan).next()
>
Thanks Jia, Wenchen for your reply.
I've changed my code like below:
val sparkPlan =
sqlContext.sparkSession.sessionState.planner.plan(pipeLinePlan).next()
sparkPlan.execute().mapPartitions { itr =>
itr.map { internalRow =>
val encoder = RowEncoder(schema)
encoder.f
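For what it's worth, a hedged sketch of how that mapPartitions block might look when the advice in this thread is combined (resolveAndBind() plus createDeserializer(), building the deserializer once per partition instead of once per row); pipeLinePlan, schema and sqlContext are assumed to be defined as in the code above:
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.encoders.RowEncoder
val sparkPlan =
  sqlContext.sparkSession.sessionState.planner.plan(pipeLinePlan).next()
// sparkPlan.execute() yields an RDD[InternalRow]; map it to external Rows
val rows = sparkPlan.execute().mapPartitions { itr =>
  // build the bound encoder and its deserializer once per partition
  val encoder = RowEncoder(schema).resolveAndBind()
  val deserializer = encoder.createDeserializer()
  itr.map(internalRow => deserializer(internalRow))
}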
The fromRow method is removed in Spark 3.0. The new API is:
val encoder = RowEncoder(schema)
val row = encoder.createDeserializer().apply(internalRow)
Thanks,
Jia Ke
From: Wenchen Fan
Sent: Friday, November 27, 2020 9:32 PM
To: Jason Jun
Cc: Spark dev list
Subject: Re: How to convert Inter