Hi Mark,
I know, but that could harm readability. AFAIK, for this reason, it is not
(or is only rarely) used in Spark.
2016-04-17 15:54 GMT+09:00 Mark Hamstra:
> FWIW, 3 should work as just `.map(function)`.
>
> On Sat, Apr 16, 2016 at 11:48 PM, Reynold Xin wrote:
>
>> Hi Hyukjin,
>>
>> Thanks for asking.
FWIW, 3 should work as just `.map(function)`.
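To make that concrete, here is a minimal Scala sketch (the `parse` function is
hypothetical, not taken from the PR):

```scala
// Scala eta-expands a method passed by name, so the placeholder
// form `.map(parse(_))` can usually be written as `.map(parse)`.
def parse(s: String): Int = s.trim.toInt

val xs = Seq("1", " 2", "3 ")

val a = xs.map(parse(_))      // explicit placeholder syntax
val b = xs.map(parse)         // equivalent, shorter: pass the method itself
val c = xs.map(x => parse(x)) // fully explicit anonymous function

assert(a == b && b == c)      // all three produce Seq(1, 2, 3)
```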
On Sat, Apr 16, 2016 at 11:48 PM, Reynold Xin wrote:
> Hi Hyukjin,
>
> Thanks for asking.
>
> For 1, the change is almost always better.
>
> For 2, it depends on the context. In general, if the type is not obvious, it
> helps readability to explicitly declare it.
Hi Hyukjin,
Thanks for asking.
For 1, the change is almost always better.
For 2, it depends on the context. In general, if the type is not obvious, it
helps readability to explicitly declare it.
For 3, again it depends on the context.
So while it is a good idea to change 1 to reflect a more consistent style
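To illustrate the trade-off in point 2, a short sketch under assumed names
(`loadConfig` and `app.conf` are hypothetical, not from the PR):

```scala
// When the right-hand side makes the type obvious, an annotation is noise;
// when it does not, an explicit type helps the reader.
import scala.io.Source

val count = 10            // type is obvious from the literal
val names = Seq("a", "b") // inference reads fine here

def loadConfig(path: String): Map[String, String] =
  Source.fromFile(path).getLines()
    .map(_.split("=", 2))
    .collect { case Array(k, v) => k -> v }
    .toMap

// The call site alone does not reveal the type, so declaring it helps.
val settings: Map[String, String] = loadConfig("app.conf")
```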
First, really thank you for leading the discussion.
I am concerned that it'd hurt Spark more than it helps. As many others have
pointed out, this unnecessarily creates a new tier of connectors or 3rd
party libraries appearing to be endorsed by the Spark PMC or the ASF. We
can alleviate this concern
Hi all,
First of all, I am sorry that this is relatively trivial and minor, but
I just want to be clear on this and careful about future PRs.
Recently, I submitted a PR (https://github.com/apache/spark/pull/12413)
about Scala style, and it was merged. In this PR, I changed
On Sat, Apr 16, 2016 at 5:38 PM, Evan Chan wrote:
> Hi folks,
>
> Sorry to join the discussion late. I had a look at the design doc
> earlier in this thread, and it did not mention what types of
> projects are the targets of this new "spark extras" ASF umbrella.
>
> Is the desire to have a maintained set of spark-related projects that
> keep pace with the
Hey folks,
I'd like to use local-cluster mode in my Spark-related projects to
test Spark functionality in an automated way in a simulated local
cluster. The idea is to test multi-process things in a much easier
fashion than setting up a real cluster. However, getting this up and
running in a
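For anyone unfamiliar with the mode: it is requested through a special master
URL of the form `local-cluster[numWorkers,coresPerWorker,memoryPerWorkerMB]`,
which launches real worker JVMs on one machine. Spark's own test suites use
it, and it generally expects SPARK_HOME to be set. A minimal sketch:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// 2 worker processes, 1 core and 1024 MB each -- real JVMs, one machine.
val conf = new SparkConf()
  .setMaster("local-cluster[2,1,1024]")
  .setAppName("local-cluster-smoke-test")

val sc = new SparkContext(conf)
try {
  // Multi-process sanity check: partitions are computed in worker JVMs.
  val sum = sc.parallelize(1 to 100, numSlices = 4).sum()
  assert(sum == 5050)
} finally {
  sc.stop()
}
```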
Hi folks,
Sorry to join the discussion late. I had a look at the design doc
earlier in this thread, and it did not mention what types of
projects are the targets of this new "spark extras" ASF umbrella.
Is the desire to have a maintained set of spark-related projects that
keep pace with the
On 15/04/2016, 17:41, "Mattmann, Chris A (3980)" wrote:
>Yeah in support of this statement I think that my primary interest in
>this Spark Extras and the good work by Luciano here is that anytime we
>take bits out of a code base and “move it to GitHub”, I see a bad precedent
>being set.
>
>C