Great, thanks for confirming, Reynold. Appreciate it!

On Tue, Apr 19, 2016 at 4:20 PM, Reynold Xin <r...@databricks.com> wrote:

> I talked to Lianhui offline and he said it is not that big of a deal to
> revert the patch.
>
>
> On Tue, Apr 19, 2016 at 9:52 AM, Mark Grover <m...@apache.org> wrote:
>
>> Thanks.
>>
>> I'm more than happy to wait for more people to chime in here, but I do
>> feel that most of us are leaning towards Option B anyway. So I created a
>> JIRA (SPARK-14731) for reverting SPARK-12130 in Spark 2.0 and will file
>> a PR shortly.
>> Mark
>>
>> On Tue, Apr 19, 2016 at 7:44 AM, Tom Graves <tgraves...@yahoo.com.invalid
>> > wrote:
>>
>>> It would be nice if we could keep this compatible between 1.6 and 2.0,
>>> so I'm more for Option B at this point, since the change seems minor
>>> and we could have the shuffle service handle it internally, as Marcelo
>>> mentioned. Let's try to stay compatible, but if there is a forcing
>>> function, let's figure out a good way to run two versions at once.
>>>
>>>
>>> Tom
>>>
>>>
>>> On Monday, April 18, 2016 5:23 PM, Marcelo Vanzin <van...@cloudera.com>
>>> wrote:
>>>
>>>
>>> On Mon, Apr 18, 2016 at 3:09 PM, Reynold Xin <r...@databricks.com>
>>> wrote:
>>> > IIUC, the reason for that PR is that they found the string comparison
>>> > increased the size in large shuffles. Maybe we should add the ability
>>> > to support the short name in Spark 1.6.2?
>>>
>>> Is that something that really yields noticeable gains in performance?
>>>
>>> If it is, it seems like it would be simple to allow executors register
>>> with the full class name, and map the long names to short names in the
>>> shuffle service itself.
>>>
>>> You could even get fancy and have different ExecutorShuffleInfo
>>> implementations for each shuffle service, with an abstract
>>> "getBlockData" method that gets called instead of the current if/else
>>> in ExternalShuffleBlockResolver.java.
>>>
>>>
>>> --
>>> Marcelo
>>>
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
>>> For additional commands, e-mail: dev-h...@spark.apache.org
>>>
>>>
>>>
>>>
>>>
>>
>
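Marcelo's suggestion above (letting executors register with the full class name and mapping long names to short names inside the shuffle service) could be sketched roughly as follows. This is only an illustration, not the actual Spark implementation: the class below and its placement are hypothetical, and the long/short name pairs are assumed from the `spark.shuffle.manager` conventions in Spark 1.6.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: the shuffle service normalizes whatever shuffle-manager
// identifier an executor registered with (full class name from older executors,
// short name from newer ones) to a canonical short name, so both generations of
// executors can talk to the same external shuffle service.
public class ShuffleManagerNames {
    private static final Map<String, String> LONG_TO_SHORT = new HashMap<>();
    static {
        // Assumed long/short pairs, following Spark 1.6's spark.shuffle.manager names.
        LONG_TO_SHORT.put("org.apache.spark.shuffle.sort.SortShuffleManager", "sort");
        LONG_TO_SHORT.put("org.apache.spark.shuffle.hash.HashShuffleManager", "hash");
    }

    /** Returns the canonical short name; unrecognized names pass through unchanged. */
    public static String canonicalize(String registeredName) {
        String shortName = LONG_TO_SHORT.get(registeredName);
        return shortName != null ? shortName : registeredName;
    }

    public static void main(String[] args) {
        System.out.println(canonicalize("org.apache.spark.shuffle.sort.SortShuffleManager"));
        System.out.println(canonicalize("sort"));
    }
}
```

With a mapping like this in the service, the protocol change in SPARK-12130 could be reverted on the executor side while still avoiding the repeated long-string comparison internally, which is the compatibility outcome Option B aims for.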
