Fwd: Help: Configure `secrets.CODECOV_TOKEN` for the GitHub Spark repo so the daily scheduled workflow `build_coverage.yml` runs successfully

2025-01-07 Thread Pan Bingkun
I'm not sure if any of us (committers, PMC) have access. This might be a question for Apache INFRA tickets. (BTW this message is fine for dev@ rather than private@) On Tue, Jan 7, 2025 at 5:46 AM Pan Bingkun wrote: > Hi, all > > - Recently, I have noticed
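The thread does not quote `build_coverage.yml` itself, but a daily-scheduled coverage workflow that reads a repository secret typically looks like the minimal sketch below. Everything except the secret name `secrets.CODECOV_TOKEN` and the workflow filename is an assumption for illustration (job name, cron time, and the codecov action version are not from the thread):

```yaml
# Hypothetical sketch of build_coverage.yml (not the actual Spark workflow)
name: build_coverage

on:
  schedule:
    - cron: "0 0 * * *"   # run once daily, as the subject line implies

jobs:
  coverage:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # secrets.CODECOV_TOKEN must be configured in the repository's
      # settings by someone with admin access -- which is why the thread
      # suggests an Apache INFRA ticket rather than dev@.
      - name: Upload coverage
        uses: codecov/codecov-action@v4
        with:
          token: ${{ secrets.CODECOV_TOKEN }}
```

If the secret is missing, the scheduled run either fails or uploads anonymously with rate limits, which matches the "ensure successful daily scheduling" concern in the subject.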

Re: [DISCUSS] Release Apache Spark 3.5.4

2024-12-03 Thread Pan Bingkun
+1, thank you, Jie. Bests, Bingkun On 2024/12/04 03:27:11 杨杰 wrote: > Hi dev, > > It's been approximately 3 months since Sep 9, 2024, when we released > version 3.5.3 for branch-3.5. The patchset differing from 3.5.3 has grown > significantly, now it contains 57 commits. > > The JIRA[2] also indi

Re: [ANNOUNCE] Apache Spark 3.5.1 released

2024-03-05 Thread Pan,Bingkun
Okay, let me double-check it carefully. Thank you very much for your help! From: Jungtaek Lim Sent: March 5, 2024, 21:56:41 To: Pan,Bingkun Cc: Dongjoon Hyun; dev; user Subject: Re: [ANNOUNCE] Apache Spark 3.5.1 released Yeah the approach seems OK to me - please double

Re: [ANNOUNCE] Apache Spark 3.5.1 released

2024-03-05 Thread Pan,Bingkun
:07 To: Pan,Bingkun Cc: Dongjoon Hyun; dev; user Subject: Re: [ANNOUNCE] Apache Spark 3.5.1 released Let me be more specific. We have two active release version lines, 3.4.x and 3.5.x. We just released Spark 3.5.1, having a dropdown as 3.5.1 and 3.4.2 given the fact the last version of 3.4.x is 3.4.2

Re: [ANNOUNCE] Apache Spark 3.5.1 released

2024-03-05 Thread Pan,Bingkun
time of each new document release. Of course, if we need to keep the latest in every document, I think it's also possible, only by sharing the same version.json file in each version. From: Jungtaek Lim Sent: March 5, 2024, 16:47:30 To: Pan,Bingkun Cc: Dongjoon

Re: [ANNOUNCE] Apache Spark 3.5.1 released

2024-03-05 Thread Pan,Bingkun
According to my understanding, the original intention of this feature is that when a user has entered the PySpark documentation and finds that the version they are currently viewing is not the one they want, they can easily jump to the desired version by clicking the drop-down box. Additionally, in t
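The "shared version.json" idea mentioned above can be pictured as a single manifest that every documentation version loads to populate its drop-down. The file below is purely illustrative: the thread only names `version.json`, so the field names and URLs here are assumptions modeled on common Sphinx version-switcher conventions, not the actual Spark file:

```json
[
  {
    "name": "3.5.1 (latest)",
    "version": "3.5.1",
    "url": "https://spark.apache.org/docs/3.5.1/api/python/"
  },
  {
    "name": "3.4.2",
    "version": "3.4.2",
    "url": "https://spark.apache.org/docs/3.4.2/api/python/"
  }
]
```

Because every published doc version would fetch this one file from a fixed location, adding a new release means updating the manifest once, and older documentation pages immediately show the new version in their drop-down, which is the "keep the latest in every document" behavior discussed in the thread.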