Re: [DISCUSS] SPIP: Support Docker Official Image for Spark

2022-09-18 Thread bo zhaobo
+1 (non-binding) This will bring a good experience to customers. So excited about this. ;-) Yuming Wang wrote on Mon, Sep 19, 2022 at 10:18: > +1. > > On Mon, Sep 19, 2022 at 9:44 AM Kent Yao wrote: > >> +1 >> >> Gengliang Wang wrote on Mon, Sep 19, 2022 at 09:23: >> > >> > +1, thanks for the work! >> > >> > On Sun, S
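For context, once a Docker Official Image exists, using it would look roughly like the following. This is a minimal sketch assuming the apache/spark image name and the /opt/spark install path inside the container; the actual image name and tags were still under discussion in this SPIP.

    # Pull the image and start an interactive spark-shell
    # (image name and in-container path are assumptions, not confirmed by this thread)
    docker pull apache/spark
    docker run -it apache/spark /opt/spark/bin/spark-shell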

Re: Ask for ARM CI for spark

2019-11-19 Thread bo zhaobo
Hi @Sean Owen , Thanks for your reply and patience. First, we apologize for the poor wording in the previous emails. We just want users to be able to see the current support status somewhere in the Spark community. I really appreciate that you and the Spark community are making Spark better on ARM

Re: Ask for ARM CI for spark

2019-11-15 Thread bo zhaobo
Hi @Sean Owen , Thanks for your idea. We may have used the wrong words to describe our request. It's true that we cannot simply say "Spark supports ARM from release 3.0.0", and we also cannot say the past releases cannot run on ARM. But the reality is the past releases didn't get fully tested on ARM like

Re: Ask for ARM CI for spark

2019-11-14 Thread bo zhaobo
bo zhaobo wrote on Fri, Nov 15, 2019 at 11:00 AM: > Hi @Sean O

Re: Ask for ARM CI for spark

2019-11-14 Thread bo zhaobo
es aren't fast enough, use bigger instances? >> I don't think anyone would create a separate release of Spark for ARM, >> no. But why would that be necessary? >> >> On Thu, Nov 14, 2019 at 7:28 PM bo zhaobo >> wrote: >> >>> Hi Spark team,

Re: Ask for ARM CI for spark

2019-11-14 Thread bo zhaobo
https://issues.apache.org/jira/browse/SPARK-29106 traces the > whole work, thank you very much Shane :) > > On Thu, Oct 17, 2019 at 2:52 PM bo zhaobo > wrote: >> Just Notes: The jira issue link is >> https://issues.apache.org/jira/browse/SPARK-29106 >>

Re: Ask for ARM CI for spark

2019-09-22 Thread bo zhaobo
>>>>>>> necessary, >>>>>>> right? >>>>>>> >>>>>>> The second thing is: >>>>>>> We plan to run the jobs for a period of time, and you can see the >>>>>>> result and logs from 'bu

Re: Ask for ARM CI for spark

2019-08-16 Thread bo zhaobo
ed from these, but > this is a function of Maven and SBT, not Spark. You may find that the > initial download takes a long time. > > On Thu, Aug 15, 2019 at 9:02 PM bo zhaobo > wrote: > >> Hi Sean, >> >> Thanks very much for pointing out the roadmap. ;-). Then I
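The one-time download cost described above exists because Maven and SBT cache resolved artifacts locally after the first build. A minimal sketch of warming that cache up front, assuming Spark's build/mvn wrapper and the default cache locations:

    # Resolve and cache all dependencies once; later builds reuse the local cache
    ./build/mvn -DskipTests dependency:go-offline
    # Default cache locations (assumptions; both are configurable):
    #   Maven: ~/.m2/repository    SBT/Ivy: ~/.ivy2/cache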

Re: Ask for ARM CI for spark

2019-08-15 Thread bo zhaobo
Hi Sean, Thanks very much for pointing out the roadmap. ;-). Then I think we will continue to focus on our test environment. Regarding the networking problems, I mean that we can access Maven Central, and jobs could download the required jar packages at a high network speed. What we want to know is th
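A quick way to verify that kind of connectivity from a CI node is to time a small download from Maven Central. A minimal sketch; the artifact here is an arbitrary example, not one named in the thread:

    # Measure download speed from Maven Central (artifact chosen arbitrarily)
    curl -sS -o /dev/null -w 'speed: %{speed_download} bytes/s\n' \
      https://repo1.maven.org/maven2/org/apache/spark/spark-core_2.12/2.4.4/spark-core_2.12-2.4.4.pom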

Re: Ask for ARM CI for spark

2019-08-05 Thread bo zhaobo
shane knapp wrote on Fri, Aug 2, 2019 at 10:41 PM: > I'm out of town, but will answer some of your questions next week. > > On Fri, Aug 2, 2019 at 2:39 AM bo zhaobo > wrote: >> >> Hi

Re: Ask for ARM CI for spark

2019-08-02 Thread bo zhaobo
bo zhaobo wrote on Wed, Jul 31, 2019 at 11:56 AM: > Hi, team

Re: Ask for ARM CI for spark

2019-07-30 Thread bo zhaobo
Hi, team. I want to run the same tests on ARM that the existing CI does on x86. As building and testing the whole Spark project takes too long, I plan to split the work into multiple jobs to reduce the run time. But I cannot see what the existing CI[1] does (so many private scripts are called)
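One common way to split such a build is per Maven module, so each CI job builds once and tests a single module. A minimal sketch assuming Maven's -pl module selection via Spark's build/mvn wrapper; the module list is illustrative, not the thread's actual job layout:

    # Build everything once without tests, then run each module's tests in its own job
    ./build/mvn -DskipTests install
    ./build/mvn -pl core test        # job 1: spark-core tests
    ./build/mvn -pl sql/core test    # job 2: spark-sql tests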

Re: Ask for ARM CI for spark

2019-07-26 Thread bo zhaobo
Hi all, Thanks for your concern. Yeah, it's worth also testing against a backend database. But note that this issue was hit in Spark SQL itself, as we only test with Spark and do not integrate other databases. Best Regards, ZhaoBo