Yes, it's on a hold-out segment from the data set being fitted.
On Wed, Sep 7, 2016 at 1:02 AM Sean Owen wrote:
Yes, should be.
It's also not necessarily nonnegative if you evaluate R^2 on a
different data set than you fit it to. Is that the case?
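
Concretely, that scenario looks something like the following (a minimal PySpark sketch, assuming Spark 2.x and a DataFrame df with "features" and "label" columns; the names and split ratio are only illustrative, not taken from this thread):

from pyspark.ml.regression import LinearRegression

# df is assumed to already exist, with a vector "features" column and a "label" column
train, test = df.randomSplit([0.8, 0.2], seed=42)

lr = LinearRegression(featuresCol="features", labelCol="label")
model = lr.fit(train)

# R^2 on the training data, i.e. what model.summary reports
print(model.summary.r2)

# R^2 on the held-out data: evaluate() produces a summary for `test`.
# This can come out negative when the model predicts the test labels
# worse than a constant equal to the test set's mean label.
print(model.evaluate(test).r2)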
On Tue, Sep 6, 2016 at 11:15 PM, Evan Zamir wrote:
I am using the default setting for *fitIntercept*, which *should* be TRUE, right?
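
For what it's worth, that default can be checked directly (a quick PySpark sketch):

from pyspark.ml.regression import LinearRegression

lr = LinearRegression()
print(lr.getFitIntercept())   # True -- fitIntercept defaults to True
# It can also be set explicitly to be sure:
lr = LinearRegression(fitIntercept=True)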
On Tue, Sep 6, 2016 at 1:38 PM Sean Owen wrote:
Are you not fitting an intercept / regressing through the origin? With
that constraint it's no longer true that R^2 is necessarily
nonnegative. It basically means that the errors are even bigger than
what you'd get by predicting the data's mean value as a constant
model.
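
Spelled out with the usual definition, R^2 = 1 - SS_res / SS_tot, where SS_tot is the squared error of the constant-mean model; whenever SS_res exceeds SS_tot, R^2 goes negative. A small numpy sketch of that situation (the numbers are made up purely to illustrate):

import numpy as np

def r_squared(y_true, y_pred):
    ss_res = np.sum((y_true - y_pred) ** 2)         # model's squared errors
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)  # squared errors of the constant-mean model
    return 1.0 - ss_res / ss_tot

# A downward-sloping data set fit by a line forced through the origin:
# the best no-intercept slope is positive, so the fit is far worse than
# simply predicting the mean, and R^2 comes out negative.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([9.0, 8.0, 7.0, 6.0])
slope = (x @ y) / (x @ x)           # least-squares slope with no intercept
print(r_squared(y, slope * x))      # well below zero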
On Tue, Sep 6, 2016 at 8:4
That does seem strange. Can you provide an example to reproduce?
On Tue, 6 Sep 2016 at 21:49 evanzamir wrote:
> Am I misinterpreting what r2() in the LinearRegressionModel summary means?
> By definition, R^2 should never be a negative number!