Hi Ivan,
Thanks for the suggestions. Will try them.
Keith
On Fri, Aug 23, 2024 at 1:57 AM Ivan Krylov wrote:
>
> On Thu, 22 Aug 2024 13:07:37 -0600,
> Keith Christian wrote:
>
> > I'm interested in R construct(s) to be entered at the command
> > line that would output slope, y-intercept
On Thu, 22 Aug 2024 13:07:37 -0600,
Keith Christian wrote:
> I'm interested in R construct(s) to be entered at the command
> line that would output slope, y-intercept, and r-squared values read
> from a csv or other filename entered at the command line, and the same
> for standard deviation calcula
ology and Palliative Care,
> 10 North Greene Street
> GRECC (BT/18/GR)
> Baltimore, MD 21201-1524
> Cell phone 443-418-5382
>
>
>
>
> ________
> From: R-help on behalf of Keith Christian
>
> Sent: Thursday, August 22, 2024 3:07 PM
> To: r-help@r-project.org
Keith Christian
Sent: Thursday, August 22, 2024 3:07 PM
To: r-help@r-project.org
Subject: [R] Linear regression and standard deviation at the Linux command line
R List,
Please excuse this ultra-newbie post.
I looked at this page but it's a bit beyond me.
https://www2.kenyon.edu/Depts/Math/har
R List,
Please excuse this ultra-newbie post.
I looked at this page but it's a bit beyond me.
https://www2.kenyon.edu/Depts/Math/hartlaub/Math305%20Fall2011/R.htm
I'm interested in R construct(s) to be entered at the command
line that would output slope, y-intercept, and r-squared values read
fro
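A minimal sketch of the kind of construct being asked about (not from the thread; it assumes the CSV has numeric columns named x and y):
#!/usr/bin/env Rscript
# usage: Rscript regress.R data.csv
args <- commandArgs(trailingOnly = TRUE)
d   <- read.csv(args[1])
fit <- lm(y ~ x, data = d)
cat("intercept:", coef(fit)[1], "\n")
cat("slope:    ", coef(fit)[2], "\n")
cat("r-squared:", summary(fit)$r.squared, "\n")
cat("sd(x):", sd(d$x), "  sd(y):", sd(d$y), "\n")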
Hi Kenneth,
My guess is that you have tried to send screenshots of your output and
these were blocked. Try to cut and paste the output into your message.
Jim
On Tue, Aug 7, 2018 at 6:38 PM, John wrote:
> On Mon, 6 Aug 2018 20:18:38 +0200
> kenneth Barnhoorn wrote:
>
> Your examples did not app
On Mon, 6 Aug 2018 20:18:38 +0200
kenneth Barnhoorn wrote:
Your examples did not appear. Remember to use plain text rather
than html.
JWDougherty
> I have a problem with a linear regression output.
>
> In January I made an analysis of some data and received an certain
> output, if I run the
I have a problem with a linear regression output.
In January I made an analysis of some data and received a certain output; if I
run the same code now I don’t receive the same output and I don’t see why. It
is important to know the country, so I would like to see the country names
behind the
Generally, statistics questions are off topic here, although they do
sometimes intersect R programming issues, as perhaps here.
Nevertheless, I believe your post would fit better on the
r-sig-mixed-models list, where repeated measures and other mixed
effects (/variance components) models are discus
Dear list,
this seemed to me like a very trivial question, but in the end I haven't found
any similar postings with suitable solutions on the net ...
Basically, instead of regressing two simple series of measures 'a' and 'b'
(like b ~ a), I would like to use independent replicate measurements for
eac
Step back a minute: normality is NOT required for predictors in a
multiple regression model, though the sqrt(x) transformation may
also make the relationship more nearly linear, and linearity IS
assumed when you fit a simple model such as y ~ x + w + z.
(Normality is only required for the residu
Before going to stackexchange you should consider if a square root
transformation is appropriate for the model that you are trying to
estimate. If you do so, you may be able to interpret the coefficients
yourself. If no explanation is obvious you probably should not be using a
square root transform
Hello,
R-Help answers questions on R code; your question is about statistics.
You should try posting the question to
https://stats.stackexchange.com/
Hope this helps,
Rui Barradas
On 23-10-2017 18:54, kende jan via R-help wrote:
Dear all, I am trying to fit a multiple linear regression
Dear all, I am trying to fit a multiple linear regression model with a
transformed dependent variable (the normality assumption was not verified...).
I have applied a sqrt(variable) transformation... The results are great, but I
don't know how to interpret the beta coefficients... Is it possib
> Yes, and I think that the suggestion in another post to look at censored
> regression is more in the right direction.
I think this is right and perhaps the best (or at least better) pathway to
pursue than considering this within the framework of measurement error (ME). Of
course there *is* M
: Ravi Varadhan; r-help@r-project.org
Subject: Re: [R] Linear regression with a rounded response variable
> On 21 Oct 2015, at 19:57 , Charles C. Berry wrote:
>
> On Wed, 21 Oct 2015, Ravi Varadhan wrote:
>
>> [snippage]
>
> If half the subjects have a value of 5 second
Hi Ravi,
And remember that the vanilla rounding procedure is biased upward. That is,
an observation of 5 actually may have ranged from 4.5 to 5.4.
Jim
On Thu, Oct 22, 2015 at 7:15 AM, peter salzman
wrote:
> here is one thought:
>
> if you plug in your numbers into any kind of regression you wil
> On 21 Oct 2015, at 19:57 , Charles C. Berry wrote:
>
> On Wed, 21 Oct 2015, Ravi Varadhan wrote:
>
>> [snippage]
>
> If half the subjects have a value of 5 seconds and the rest are split between
> 4 and 6, your assertion that rounding induces an error of
> dunif(epsilon,-0.5,0.5) is surely
Here is one thought:
if you plug your numbers into any kind of regression you will get
predictions that are real numbers and not necessarily integers; it may be
that your predictions are good enough with this approximate value of Y. You
could test this by randomly shuffling your data by +- 0.5 an
This could be modeled directly using Bayesian techniques. Consider the
Bayesian version of the following model where we only observe y and X. y0
is not observed.
y0 <- X %*% b + error
y <- round(y0)
The following code is based on modifying the code in the README of the CRAN
rcppbugs R package.
Hi Ravi,
Thanks for this interesting question. My thoughts are given below.
If you believe the rounding is indeed uniformly distributed, then the
problem is equivalent to adding a uniform random error between (-0.5,
0.5) for every observation in addition to the standard normal error, which
will
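To illustrate that point, here is a small simulation sketch (made-up data, not from the original posts) comparing a fit to the exact response with a fit to the rounded response:
set.seed(42)
n    <- 200
age  <- runif(n, 60, 90)
time <- 3 + 0.05 * age + rnorm(n, sd = 0.7)   # hypothetical walking times in seconds
fit_exact   <- lm(time ~ age)
fit_rounded <- lm(round(time) ~ age)
cbind(exact = coef(fit_exact), rounded = coef(fit_rounded))
# the coefficients stay close; rounding mostly behaves like extra uniform(-0.5, 0.5)
# noise on the response, adding about 1/12 to the residual variance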
On Wed, 21 Oct 2015, Ravi Varadhan wrote:
Hi, I am dealing with a regression problem where the response variable,
time (second) to walk 15 ft, is rounded to the nearest integer. I do
not care for the regression coefficients per se, but my main interest is
in getting the prediction equation fo
Hi,
I am dealing with a regression problem where the response variable, time
(second) to walk 15 ft, is rounded to the nearest integer. I do not care for
the regression coefficients per se, but my main interest is in getting the
prediction equation for walking speed, given the predictors (age,
lp@r-project.org
Subject: [R] Linear regression of 0/1 response ElemStatLearn (Fig. 2.1 the
elements of statistical learning)
Hello
In chapter 2 of the ESL book the authors write: let's look at an example of a linear
model in a classification context.
They fit a simple linear model g = 0.3290614 - 0.0226360*x1 + 0.2495
One way to see where the first warning comes from is to turn warnings
into errors with options(warn=2) and when the error happens call
traceback().
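A small sketch of that suggestion, reusing (a shortened form of) the model call quoted further down in the thread, purely for illustration:
options(warn = 2)                                      # promote warnings to errors
model <- lm(Flights ~ Age + Gender, data = reg.data)   # re-run the call that warned
traceback()                                            # shows where the (now) error was raised
options(warn = 0)                                      # restore the default afterwards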
Bill Dunlap
TIBCO Software
wdunlap tibco.com
On Sun, Jun 29, 2014 at 4:12 AM, wat tele wrote:
>
>
>
> Hello,
>
> I'm a R beginner and I want to make
Hello,
I'm an R beginner and I want to run a multiple regression about birds. My data
is stored in a .csv file.
I tried to do this with the following code:
reg.data <- read.table(file.choose(),header=T, sep=";",dec=",")
attach(reg.data)
names(reg.data)
model <- lm(Flights ~ Age + Gender + wei
summary(lm(Canopy_Height ~ Ground_Elevation, data = young400_1))  # use data= instead of attach!
Or even
mylm <- lm(Canopy_Height~Ground_Elevation, data=young400_1)
mylm
summary(mylm)
coefficients(mylm)
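If the goal is the equation of the line and its fit, those pieces can be pulled out of the same (hypothetical) mylm object:
cf <- coef(mylm)
cat("y =", cf[2], "* x +", cf[1], "\n")   # slope and intercept of the fitted line
summary(mylm)$r.squared                   # R-squared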
Most intro to R guides cover the basics of modeling; you might benefit
from reading one of them.
S
First off, I am new to using R.
I have a dataset that I plotted using R: I created a scatter plot and used
abline to draw the line, and what I need is to find the equation of the line.
Below is the script I have used up to this point.
>young400_1<-read.csv("Z:\\SOFTEL\\North Key Largo
project\\Can
help@r-project.org
Cc:
Sent: Saturday, July 20, 2013 7:55 PM
Subject: [R] Linear regression repeat for each column
Hi everyone
I need to calculate abnormal returns for different events applying event study
methodology. I must create a market model in order to perform the analysis. I
apply regr
Sure. Read the Posting Guide, and provide a reproducible example with sample
data.
I suspect the basic idea will be to merge the data frames and then set up the
model to refer to the desired columns.
---
Jeff Newmiller
Hi everyone
I need to calculate abnormal returns for different events applying event study
methodology. I must create a market model in order to perform the analysis. I
apply regression analysis to get OLS estimators.
I have a problem to create a linear regression which I could repeat for each
I found a solution ... check it out:
random_sample <- function(dt, sample_size) {
  # determine the number of records in the data frame
  n <- nrow(dt)
  # extract a random sample of rows
  dt[sample(n, size = sample_size), ]
}
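For example (with placeholder variable names), the sampled subset can be passed straight to lm():
set.seed(123)                    # make the sample reproducible
sub <- random_sample(a, 300)     # 'a' is the data frame from the question below
fit <- lm(y ~ x, data = sub)     # replace y and x with the actual variable names
summary(fit)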
Hi guys :)
I have a dataframe a
a <- read.dta("imprese.dta")  # read.dta() is from the foreign package
15867 obs. and 23 variables
I want to run a linear regression on a sample of 300 units.
How can I do this?
Thank you
Note that your equations can be written:
y = alpha*A + (1-alpha)*B, which is equivalent to
y = (A-B)*alpha + B, i.e. of the form
y = C*alpha + B, a simple linear equation in alpha.
You have two different values of alpha at which y was measured, so
just stack up all your results into a single
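A sketch of that stacking idea with made-up numbers (alpha known, A and B to be estimated):
set.seed(1)
A <- 0.30; B <- 0.70                         # "true" values to recover
alpha <- rep(c(0.6, 0.75), each = 10)        # the two known mixing proportions
y <- alpha * A + (1 - alpha) * B + rnorm(20, sd = 0.01)
fit <- lm(y ~ alpha)                         # y = (A - B)*alpha + B
B_hat <- coef(fit)[1]                        # intercept estimates B
A_hat <- coef(fit)[1] + coef(fit)[2]         # intercept + slope estimates A
c(A = unname(A_hat), B = unname(B_hat))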
To: Diviya Smith
Cc: r-help@r-project.org
Sent: Monday, March 19, 2012 11:32 PM
Subject: Re: [R] Linear regression
1. Homework assignment? We don't do homework here.
2. If not, a mixture model of some sort? I suggest you state the
context of the problem more fully. R has several package
Hello Bert,
This is definitely not a homework problem. I am trying to estimate
frequencies of mutations in different groups. The mutation frequencies can
be modeled as a linear relation in cases of mixtures. So I have a lot of
populations that follow the relationship -
y = alpha*A + beta*B an
1. Homework assignment? We don't do homework here.
2. If not, a mixture model of some sort? I suggest you state the
context of the problem more fully. R has several packages to do
mixture modeling, if that's what you're trying to do.
3. In any case, this cannot be done with lm() (at least withou
Hello there,
I am new to using regression in R. I wanted to solve a simple regression
problem where I have 2 equations and 2 unknowns.
So let's say:
y1 = alpha1*A + beta1*B
y2 = alpha2*A + beta2*B
y1 <- runif(10, 0,1)
y2 <- runif(10,0,1)
alpha1 <- 0.6
alpha2 <- 0.75
beta1 <- 1-alpha1
b
Thanks, everyone, for your help.
Problem solved. I'm getting more used to vectorization with your help,
Regards,
Phil
On 2012-02-29 15:45, David Winsemius wrote:
On Feb 29, 2012, at 6:39 PM, David Winsemius wrote:
On Feb 29, 2012, at 1:53 PM, Filoche wrote:
Hi everyone.
I have a DF with the first column being my independent variable and all
other columns the dependent variables.
Something like:
x y1 y2 y3
... ... ... ...
... ... ... ...
What I'm trying to do is to perform a linear model for each of my "y". I
OK thanks.
In my case I think it might be possible to work around this by reshaping my
data and then using lmList() to run separate regressions for each data
group. lmList() is new to me but it looks like it will do the job.
I've done a lot of research on this very topic and found a few solutions. But
all the ways I've discovered involve loops.
Applying it to what you want, the best way I've found is to do (stolen from
an experienced R user, of course):
y<-array(rnorm(100),dim=c(10,10))
x<-array(rnorm(100),dim=c(10,1
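One loop-free shape that sketch may have been heading towards (a guess, with made-up data) is a matrix response, which lets lm() fit every column at once:
set.seed(1)
y <- array(rnorm(100), dim = c(10, 10))   # ten response columns
x <- rnorm(10)                            # one shared predictor
fits <- lm(y ~ x)                         # an "mlm" fit: one regression per column of y
coef(fits)                                # 2 x 10 matrix of intercepts and slopes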
I'm new to R and I'm not a statistician, I'm an accountant, but I'm finding it
an excellent tool for the business analysis work I do.
I need to run lm() where both response and predictor are held in matrices.
The model follows the form:
regression1 = matrix1.col1 <-> matrix2.col1
regression2 = mat
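A sketch of one way to set that up (made-up matrices; one lm() per matched pair of columns):
set.seed(1)
matrix1 <- matrix(rnorm(50), ncol = 5)    # responses, one regression per column
matrix2 <- matrix(rnorm(50), ncol = 5)    # predictors, matched column by column
fits <- lapply(seq_len(ncol(matrix1)),
               function(j) lm(matrix1[, j] ~ matrix2[, j]))
sapply(fits, coef)                        # intercept and slope for each column pair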
I think the alabama package on CRAN can do this:
http://cran.r-project.org/web/packages/alabama/index.html
>-Original Message-
>From: r-help-boun...@r-project.org
[mailto:r-help-boun...@r-project.org]
>On Behalf Of JW
>Sent: 31 October 2011 17:57
>To: r-help@r-project.org
>S
Well, if I understand the question correctly (following the posting guide
would have spared guessing, as usual), forget packages -- nothing more than
elementary algebra is needed.
e.g.
lm(y ~ x1 + x2 + x3) subject to the constraint beta_2 = beta_3, the
coefficients of x2 and x3, is the same as
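Spelling out the algebra Bert describes, as a sketch with made-up data: if beta_2 = beta_3 is imposed, x2 and x3 enter only through their sum, so the constrained fit is
set.seed(1)
d <- data.frame(x1 = rnorm(30), x2 = rnorm(30), x3 = rnorm(30))
d$y <- 1 + 2 * d$x1 + 0.5 * d$x2 + 0.5 * d$x3 + rnorm(30)
fit <- lm(y ~ x1 + I(x2 + x3), data = d)
coef(fit)   # the I(x2 + x3) coefficient is the common value of beta_2 and beta_3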
I believe the package systemfit can help you with that. Haven't tried it
myself, but give it a go.
Regards,
Kristian
2011/10/31 JW
> Please advice on the package I should use to run a linear regression model
> (weighted least squared) with linear equality constraint. I initially
> tried
> "co
Please advise on the package I should use to run a linear regression model
(weighted least squares) with a linear equality constraint. I initially tried
"constrOptim", but it turned out that it only supports linear inequality
constraints. Thank you very much in advance.
Cheers,
Jon
Andrey A gmail.com> writes:
>
> Hello I performed a linear regression, my equation is Y = β0 + β1*A + β2*B +
> β3*A*B.
> Is there a way to separate interaction terms, say β3*A*B, and plot it against a
> certain variable?
> Thanks, Andrew
Not quite sure what you mean here. Possibly something like
L1
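One possible reading, sketched with made-up data (not necessarily what the reply went on to suggest): pull out the fitted interaction coefficient and plot its contribution against another variable.
set.seed(1)
d <- data.frame(A = rnorm(50), B = rnorm(50), Z = rnorm(50))
d$Y <- 1 + 2 * d$A - d$B + 0.5 * d$A * d$B + rnorm(50)
fit <- lm(Y ~ A * B, data = d)            # Y = b0 + b1*A + b2*B + b3*A*B
b3  <- coef(fit)["A:B"]                   # the interaction coefficient
plot(d$Z, b3 * d$A * d$B,                 # interaction contribution vs. some variable Z
     xlab = "Z", ylab = "fitted b3*A*B")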
Hello I performed a linear regression, my equation is Y = β0 + β1*A + β2*B +
β3*A*B.
Is there a way to separate interaction terms, say β3*A*B, and plot it against a
certain variable?
Thanks, Andrew
I think that you have not understood your lecturer. You can log-transform
any positive number. You cannot log-transform a negative number. Adding a
constant to a negative number to make it positive before log transformation
is sometimes suggested by those who do not understand what they are doin
Hello,
I have some questions concerning log-transformations and plotting of the
regression lines. As far as I know, it is a problem to log-transform values
smaller than 1 (0-1). In my statistics lecture I was told to do a log(x+1)
transformation in such cases. So I provide here a small example to
Thanks Dennis! Worked perfectly. I keep forgetting that plyr can split data
based on multiple subsetting variables.
Thanks so much,
Nate
On Mon, Aug 22, 2011 at 10:12 PM, Dennis Murphy wrote:
> Hi:
>
> You're kind of on the right track, but there is no conditioning
> formula in lm(); it's not l
At 02:15 23/08/2011, Nathan Miller wrote:
Hi all,
See comment in-line
I have a data set that looks a bit like this.
feed1
RFU Site Vial Time lnRFU
1 811 10 10.702075
2 4752111 20 10.768927
3 4290511 30 10.66674
4 4686711 40
Hi:
You're kind of on the right track, but there is no conditioning
formula in lm(); it's not lattice :) This is relatively easy to do
with the plyr package, though:
library('plyr')
# Generate a list of models - the subsetting variables (Site, Vial) are
# used to generate the data splits and the
You can do something like this:
sp <- split(dat, list(dat$Vial, dat$Site))
seq.model <- lapply(sp, function(x) lm(lnRFU ~ Time, data = x))
Then, extract whatever you want from seq.model.
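For instance, all the fitted coefficients can be collected into one table:
t(sapply(seq.model, coef))   # one row of intercept and slope per Vial/Site combination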
Weidong Gu
On Mon, Aug 22, 2011 at 9:15 PM, Nathan Miller wrote:
> Hi all,
>
> I have a data set that looks a bit li
Hi all,
I have a data set that looks a bit like this.
feed1
RFU Site Vial Time lnRFU
1 811 10 10.702075
2 4752111 20 10.768927
3 4290511 30 10.66674
4 4686711 40 10.755069
5 4299511 50 10.668839
6 4307411
Don't forget to load the `lattice` package. `latticeExtra` with
`panel.ablineq` can also be helpful.
This was however for plotting. For subset regression by each WR without
plotting you'd use something like `lapply` or `sapply`.
ans <- sapply(unique(data$WR), function(dir) {
out <- list(lm(PM10~
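A possible completion of that idea (assuming the data frame is called dat, as in the other reply): fit one regression per wind direction and collect slope, intercept and R-squared:
ans <- sapply(unique(dat$WR), function(dir) {
  fit <- lm(PM10 ~ Ref, data = subset(dat, WR == dir))
  c(coef(fit), r.squared = summary(fit)$r.squared)
})
colnames(ans) <- as.character(unique(dat$WR))
ans   # one column per wind direction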
Hi:
Try something like this, using dat as the name of your data frame:
xyplot(PM10 ~ Ref | WR, data = dat, type = c('p', 'r'))
The plot looks silly with the data snippet you provided, but should
hopefully look more sensible with the complete data. The code creates
a four panel plot, one per dire
your dataframe needs to be called "Nord". If it is not, then replace
"Nord" with the actual name of your dataframe
On Sat, Aug 13, 2011 at 10:43 PM, maggy yan wrote:
> dear R users,
> my data looks like this
>
> PM10 Ref UZ JZ WT RH FT WR
> 1 10.973195 4.338
dear R users,
my data looks like this
   PM10        Ref        UZ    JZ     WT        RH    FT    WR
1  10.973195   4.338874   nein  Winter Dienstag  ja    nein  West
2   6.381684   2.250446   nein  Sommer Sonntag   nein  ja    Süd
3  62.586512  66.304869   ja    Sommer Sonntag   nein  nein  Ost
4   5.59010
Hi Everyone,
I need to run several simple linear regressions at once, using the
following data. Response variables: bird species (sp1, sp2, sp3...spn).
Independent variable: Natprop - proportion of natural area. Covariate:
Effort (hours). One single linear regression would be: lmSp1 <- lm(sp1~
na
Hi all,
I need to run several simple linear regressions at once, using the
following data. Response variables: bird species (sp1, sp2, sp3...spn).
Independent variable: Natprop - proportion of natural area. Covariate:
Effort (hours). One single linear regression would be: lmSp1 <- lm(sp1~
natprop
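A sketch of one way to run them all at once (the data frame name birds is made up; the column names follow the post):
species <- grep("^sp", names(birds), value = TRUE)   # "sp1", "sp2", ...
fits <- lapply(species, function(s)
  lm(reformulate(c("Natprop", "Effort"), response = s), data = birds))
names(fits) <- species
lapply(fits, summary)                                 # one summary per species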
Hi,
we want to calculate a linear regression that results from three test
series. (The "average linear regression" over 3 replications.)
Each test series consists of the values at 3 time points (0 min, 25 min,
50 min) and is repeated three times.
We need such a regression for each combination of subject
1, 2011 3:26 AM
> To: r-help@r-project.org
> Subject: [R] linear regression in a ragged array
>
> Hello,
> I have a large dataset of the form
>
> subj var1 var2
> 001  100  200
> 001  120  226
> 001  130  238
> 001  140  245
> 001  150  3
Hello,
I have a large dataset of the form
subj var1 var2
 001  100  200
 001  120  226
 001  130  238
 001  140  245
 001  150  300
 002  110  205
 002  125  209
 003  101  233
 003  115  254
I would like to perform linear regression of var2 on var1 for each subj
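A sketch of one way to do that (assuming the data have been read into a data frame d with columns subj, var1 and var2):
fits <- by(d, d$subj, function(s) lm(var2 ~ var1, data = s))
sapply(fits, coef)    # intercept and slope for each subject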
Seconded
On 03/16/2011 05:37 PM, Bert Gunter wrote:
Ha! -- A fortunes candidate?
-- Bert
If this is really a time series, then you will have serious validity
problems due to auto-correlation among non-independent units. (But if you
are just searching for a way to pull the wool over the eyes of
Ha! -- A fortunes candidate?
-- Bert
>
> If this is really a time series, then you will have serious validity
> problems due to auto-correlation among non-independent units. (But if you
> are just searching for a way to pull the wool over the eyes of the
> statistically uninformed, then I guess th
On Mar 16, 2011, at 3:19 PM, Justin Haynes wrote:
I have a very large dataset with columns of id number, actual value,
predicted value. This used to be a time series but I have dropped the
time component. So I now have a data.frame where the id number is
repeated but each value in the actual
I have a very large dataset with columns of id number, actual value,
predicted value. This used to be a time series but I have dropped the
time component. So I now have a data.frame where the id number is
repeated but each value in the actual and predicted columns are
unique.
I assume I need to
On 2011-02-21 02:42, Rosario Garcia Gil wrote:
Hello
I have a data set with outlier and it is not normally distributed. I would
instead like to use a more robust distribution like t-distribution.
My question is if the coefficients of the regression are different from zero,
but assuming a t-di
Did you try
RSiteSearch("t-regression")
?
That seems to give some useful hits.
On Mon, Feb 21, 2011 at 7:42 AM, Rosario Garcia Gil
wrote:
> Hello
>
> I have a data set with outlier and it is not normally distributed. I would
> instead like to use a more robust distribution like t-distribution
library(zoo)
library(tseries)
library(quantmod)  # for access to FRED
library(TTR)
secA <- getSymbols("DEXUSEU", src = "FRED")  # US/Euro exchange rate from FRED
secB <- getSymbols("DEXUSUK", src = "FRED")  # US/UK exchange rate from FRED
secA <- zoo(DEXUSEU[, 1])
secB <- zoo(DEXUSUK[, 1])
t.zoo <- merge(secA, secB, all = FALSE)      # keep only dates present in both series
t <- as.data.frame(t.zoo
Hello
I have a data set with outliers and it is not normally distributed. I would
instead like to use a more robust distribution like the t-distribution.
My question is whether the coefficients of the regression are different from zero,
but assuming a t-distribution.
Could someone hint me what package t
At 02:23 29/12/2010, Entropi ntrp wrote:
Hi,
I have been examining large data and need to do simple linear regression
with the data which is grouped based on the values of a particular
attribute. For instance, consider three columns : ID, x, y, and I need to
regress x on y for each distinct val
Thanks a lot for the quick responses.
I have some additional questions related to this topic. In fact, my
intention was to be able to answer questions like what percent of the
regressions have p-values less than a certain threshold, what the
residuals look like, what the plots of y vs. x look like,
Hi:
There are some advantages to taking a plyr approach to this type of problem.
The basic idea is to fit a linear model to each subgroup and save the
results in a list, from which you can extract what you want piece by piece.
library(plyr)
# One of those SAS style data sets...
> df <- data.fram
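A possible completion of the plyr idea being described (assuming a data frame df with columns ID, x and y; the formula follows the poster's wording of regressing x on y):
library(plyr)
models <- dlply(df, .(ID), function(d) lm(x ~ y, data = d))
# coefficients and R-squared for each distinct ID
ldply(models, function(m) c(coef(m), r.squared = summary(m)$r.squared))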
: [R] linear regression for grouped data
Hi,
I have been examining large data and need to do simple linear regression
with the data which is grouped based on the values of a particular
attribute. For instance, consider three columns : ID, x, y, and I need to
regress x on y for each distinct value
On Dec 28, 2010, at 9:23 PM, Entropi ntrp wrote:
Hi,
I have been examining large data and need to do simple linear
regression
with the data which is grouped based on the values of a particular
attribute. For instance, consider three columns : ID, x, y, and I
need to
regress x on y for ea
Hi,
I have been examining large data and need to do simple linear regression
with the data which is grouped based on the values of a particular
attribute. For instance, consider three columns : ID, x, y, and I need to
regress x on y for each distinct value of ID. Specifically, for the set of
data
On Dec 27, 2010, at 8:56 PM, Entropi ntrp wrote:
Thanks for the response. I proivded the necessary details below,
and also
have a general question for how to deal with dates in R. Is there a
way to
make R read dates as numbers?
Here is the details of the R code:
egfr <- read.csv(file.ch
Thanks for the response. I provided the necessary details below, and also
have a general question about how to deal with dates in R. Is there a way to
make R read dates as numbers?
Here are the details of the R code:
egfr <- read.csv(file.choose(), header=TRUE, sep=",")  # egfr is a data frame read fr
> -Original Message-
> From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org]
> On Behalf Of Entropi ntrp
> Sent: Monday, December 27, 2010 3:05 PM
> To: r-help@r-project.org
> Subject: [R] linear regression with dates
>
> Hi,
> I am trying to
Hi,
I am trying to do simple linear regression using dates in R but receiving
error messages. With the data shown below, I would like to regress x on y.
x           y
11/12/1999  56.8
11/29/1999  17.9
01/04/2000  27.4
1/14/2000   96.8
1/31/2000   49.5
R gives the following erro
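A minimal sketch of one way around that error: convert the dates with as.Date(), after which they can be used as numbers in lm() (here treating y as the response):
d <- data.frame(
  x = as.Date(c("11/12/1999", "11/29/1999", "01/04/2000", "1/14/2000", "1/31/2000"),
              format = "%m/%d/%Y"),
  y = c(56.8, 17.9, 27.4, 96.8, 49.5)
)
fit <- lm(y ~ as.numeric(x), data = d)   # as.numeric(Date) is days since 1970-01-01
coef(fit)                                # slope is the change in y per day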
On Oct 12, 2010, at 9:01 AM, Vittorio Colagrande wrote:
Dear R-group,
We have begun to use it for teaching Statistics. In this context we
have run into a problem with linear regression
where we found the results of are confusing.
Specifically, considering the data:
x=c(4,5,6,3,7,8,10,14,
hn Tukey
> -Original Message-
> From: r-help-boun...@r-project.org
> [mailto:r-help-boun...@r-project.org] On Behalf Of Vittorio Colagrande
> Sent: Tuesday, 12 October 2010 15:01
> To: r-help@r-project.org
> Subject: [R] Linear Regression
>
> Dear R-group,
>
> We have b
Dear R-group,
We have begun to use R for teaching statistics. In this context we have run
into a problem with linear regression
where we found the results confusing.
Specifically, considering the data:
x=c(4,5,6,3,7,8,10,14,13,15,6,7,8,10,11,4,5,17,12,11)
y=c(rep(7,20))
and set
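For illustration, fitting these data as given (a sketch): since y never varies, the fitted line is flat and the usual summary quantities degenerate:
x <- c(4,5,6,3,7,8,10,14,13,15,6,7,8,10,11,4,5,17,12,11)
y <- rep(7, 20)
fit <- lm(y ~ x)
coef(fit)      # intercept 7, slope 0
summary(fit)   # residuals are all (numerically) zero; R may warn about an essentially perfect fit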
[mailto:r-help-boun...@r-
> project.org] On Behalf Of ashz
> Sent: Thursday, August 19, 2010 3:02 AM
> To: r-help@r-project.org
> Subject: Re: [R] Linear regression equation and coefficient matrix
>
>
> Dear Greg,
>
> Thanks for the tip. As I am new in R can you please provide me
Dear Greg,
Thanks for the tip. As I am new to R, can you please provide a script showing
how to do so? It will help my learning process.
Thanks,
Asher
hz
> Sent: Wednesday, August 18, 2010 8:43 AM
> To: r-help@r-project.org
> Subject: Re: [R] Linear regression equation and coefficient matrix
>
>
> Hi,
>
> Thanks, the cor() works.
>
> Regarding the simple linear regression equation (mainly, the slope
> parameter) an
Hi,
Thanks, the cor() works.
Regarding the simple linear regression equation (mainly, the slope
parameter) and r2: I think I did not phrase it well. I need to do it just
for the columns. If I have columns a, b, c, d, I wish to compute the relation
of their data, e.g., between a-b, a-c, a-d, b-a
Hmm, after reading one of your other posts, I am thinking you may
*just* want all pairwise combinations. This worked for me:
# Create a sample data frame with 60 named columns
x <- data.frame(matrix(rnorm(1200), ncol = 60,
                dimnames = list(NULL, paste("Col", 1:60, sep = ''))))
# calculate the corre
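Continuing that sketch, the per-pair slope and R-squared can be collected with combn() (a hedged example; pairs are taken in column order):
prs <- combn(ncol(x), 2)                 # all pairs of column indices
res <- apply(prs, 2, function(idx) {
  fit <- lm(x[[idx[2]]] ~ x[[idx[1]]])   # regress the second column of the pair on the first
  c(col1 = idx[1], col2 = idx[2],
    slope = unname(coef(fit)[2]),
    r.squared = summary(fit)$r.squared)
})
t(res)                                   # one row per column pair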
On Wed, Aug 18, 2010 at 6:09 AM, ashz wrote:
>
> Hi,
>
> I have 20*60 data matrix (with some NAs) and I wish to perfom a Pearson
> correlation coefficient matrix as well as simple linear regression equation
The correlation matrix can be readily obtained by calling cor() on the
entire matrix.
>
Hi,
I have a 20*60 data matrix (with some NAs) and I wish to compute a Pearson
correlation coefficient matrix as well as a simple linear regression equation
and coefficient of determination (R2) for every possible combination. Any
tip/idea/library/script on how to do so?
Thanks,
Ashz
Have a look at lmList() in the nlme package.
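A generic sketch of lmList() (hypothetical data frame dd with response y, predictor x and grouping factor g):
library(nlme)
fits <- lmList(y ~ x | g, data = dd)
coef(fits)      # one row of intercept and slope per group
summary(fits)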
> -Original Message-
> From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-
> project.org] On Behalf Of JesperHybel
> Sent: Friday, August 13, 2010 7:56 AM
> To: r-help@r-project.org
> Subject: Re: [R] Linear reg