+1

On Tue, Jul 17, 2018 at 7:34 AM Sean Owen <sro...@apache.org> wrote:

> The fix is committed to branches back through 2.2.x, where this test was added.
>
> There is still an issue, though: I'm seeing that archive.apache.org is
> rate-limiting downloads and frequently returning 503 errors.
>
> We can help, I guess, by avoiding testing against non-current releases.
> Right now we should be testing against 2.3.1, 2.2.2, and 2.1.3, right? 2.0.x
> is now effectively EOL, right?
>
> I can make that quick change too, if everyone's amenable, to prevent more
> failures in this test on master.
>
> On Sun, Jul 15, 2018 at 3:51 PM Sean Owen <sro...@gmail.com> wrote:
>
>> Yesterday I cleaned out old Spark releases from the mirror system --
>> we're supposed to keep only the latest release from each active branch on
>> the mirrors. (All releases are available from the Apache archive site.)
>>
>> Having done so, I quickly realized that the
>> HiveExternalCatalogVersionsSuite relies on the versions it downloads being
>> available from mirrors. It has been flaky, as the mirrors are sometimes
>> unreliable. I think it will now fail for any version except 2.3.1,
>> 2.2.2, and 2.1.3.
>>
>> Because we do need to clean those releases out of the mirrors soon
>> anyway, and because the mirrors are sometimes flaky, I propose adding
>> logic to the test to fall back on downloading from the Apache archive site.
>>
>> ... and I'll do that right away to unblock
>> HiveExternalCatalogVersionsSuite runs. I think it needs to be backported to
>> other branches as they will still be testing against potentially
>> non-current Spark releases.
>>
>> Sean
>>
>
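For anyone curious what the mirror-then-archive fallback Sean describes might
look like, here is a rough sketch in Scala -- not the actual patch to
HiveExternalCatalogVersionsSuite. The object name, the closer.lua redirector
and archive.apache.org URL patterns, and the hadoop2.7 binary package name are
assumptions for illustration only.

import java.io.File
import java.net.URL
import java.nio.file.{Files, StandardCopyOption}
import scala.util.Try

object SparkReleaseDownloader {

  // Mirror redirector: mirrors carry only the latest release of each active branch.
  private def mirrorUrl(version: String): String =
    s"https://www.apache.org/dyn/closer.lua/spark/spark-$version/" +
      s"spark-$version-bin-hadoop2.7.tgz?action=download"

  // The archive carries every release but may rate-limit requests (HTTP 503).
  private def archiveUrl(version: String): String =
    s"https://archive.apache.org/dist/spark/spark-$version/" +
      s"spark-$version-bin-hadoop2.7.tgz"

  // Try the mirror first; fall back to archive.apache.org if the mirror fails.
  def download(version: String, dest: File): Boolean =
    Seq(mirrorUrl(version), archiveUrl(version)).exists { url =>
      Try {
        val in = new URL(url).openStream()
        try Files.copy(in, dest.toPath, StandardCopyOption.REPLACE_EXISTING)
        finally in.close()
      }.isSuccess
    }
}

// e.g. SparkReleaseDownloader.download("2.3.1", new File("/tmp/spark-2.3.1-bin-hadoop2.7.tgz"))

Trying the mirror first keeps most of the test traffic off archive.apache.org,
which is what appears to trigger the 503 rate-limiting in the first place.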
