fitting more than
> shrinkage).
>
> Thanks.
>
> Sincerely,
>
> DB Tsai
> --
> Web: https://www.dbtsai.com
> PGP Key ID: 0xAF08DF8D
>
>
> On Mon, Oct 26, 2015 at 8:37 PM, Meihua Wu
> wrote:
>> Hi DB Tsai,
>>
>> Thank you very
>>> Interesting. For feature sub-sampling, is it per-node or per-tree? Do
>>> you think you can implement generic GBM and have it merged as part of
>>> Spark codebase?
>>>
>>> Sincerely,
>>>
>>> DB Tsai
>>> --------
> Sincerely,
>
> DB Tsai
> --
> Web: https://www.dbtsai.com
> PGP Key ID: 0xAF08DF8D
>
>
> On Mon, Oct 26, 2015 at 11:42 AM, Meihua Wu
> wrote:
>> Hi Spark User/Dev,
>>
>> Inspired by the success of XGBoost, I have created
Hi Spark User/Dev,
Inspired by the success of XGBoost, I have created a Spark package for
gradient boosted trees with a 2nd-order approximation of arbitrary
user-defined loss functions.
https://github.com/rotationsymmetry/SparkXGBoost
Currently linear (normal) regression, binary classification, and
Poisson regression are supported.
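For readers unfamiliar with the technique: in the second-order approach popularized by XGBoost, the loss at boosting round t is approximated by a second-order Taylor expansion around the current prediction, so any twice-differentiable loss plugs in through its first and second derivatives. A standard sketch of that objective (not taken from the package's own docs):

```latex
\mathcal{L}^{(t)} \approx \sum_{i=1}^{n} \left[ g_i f_t(x_i) + \tfrac{1}{2} h_i f_t(x_i)^2 \right] + \Omega(f_t),
\qquad
g_i = \left.\frac{\partial\, l(y_i, \hat{y}_i)}{\partial \hat{y}_i}\right|_{\hat{y}_i = \hat{y}_i^{(t-1)}},
\quad
h_i = \left.\frac{\partial^2 l(y_i, \hat{y}_i)}{\partial \hat{y}_i^{2}}\right|_{\hat{y}_i = \hat{y}_i^{(t-1)}}
```

With the usual regularizer \(\Omega(f) = \gamma T + \tfrac{1}{2}\lambda \sum_j w_j^2\), the optimal weight of leaf j comes out in closed form as \(w_j^* = -\left(\sum_{i \in I_j} g_i\right) / \left(\sum_{i \in I_j} h_i + \lambda\right)\), which is what makes arbitrary user-defined losses drop in so cleanly.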
On Mon, Oct 12, 2015 at 1:36 PM, Ted Yu wrote:
> You can go to:
> https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-Maven-with-YARN
>
> and see if the test failure(s) you encountered appeared there.
>
> FYI
>
> On Mon, Oct 12, 2015 at 1:24 PM, Meihua Wu
> wrote:
Hi Spark Devs,
I recently encountered several cases where Jenkins failed tests that
should be unrelated to my patch. For example, I made a patch to the
Spark ML Scala API, but some Scala RDD tests failed due to timeouts,
or the java_gateway in PySpark failed. Just wondering if these are
isolated incidents.
I think the team is preparing for the 1.5 release. Is there anything I
can do to help with the QA, testing, etc.?
Thanks,
MW
> think it
> would help clean up your intent, but, often it's clearer to leave the
> review and commit history of your branch since the review comments go
> along with it.
>
> On Tue, Jul 28, 2015 at 9:46 PM, Meihua Wu
> wrote:
>> I am planning to update my PR to incorporate
I am planning to update my PR to incorporate comments from reviewers.
Do I need to rebase/squash the commits into a single one?
Thanks!
-MW
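As an aside on the mechanics: the usual route for squashing is `git rebase -i`, but a non-interactive equivalent uses `git reset --soft` to collapse the last few commits into one. The sketch below builds a throwaway repository to demonstrate; the file name, commit messages, and identity are made up for illustration:

```shell
set -e
# Throwaway repository for illustration.
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "dev@example.com"   # placeholder identity
git config user.name  "Example Dev"

echo "v1" >  work.txt && git add work.txt && git commit -qm "initial draft"
echo "v2" >> work.txt && git add work.txt && git commit -qm "fix review comment A"
echo "v3" >> work.txt && git add work.txt && git commit -qm "fix review comment B"

# Squash the last two commits into one: move HEAD back two commits
# while keeping their combined changes staged, then recommit.
git reset --soft HEAD~2
git commit -qm "address reviewer comments"

git rev-list --count HEAD   # prints 2 (initial draft + squashed commit)
```

Whether to squash at all is a matter of taste; as the reply above notes, keeping the per-review commits often preserves useful history alongside the review comments.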