On Oct 12, 2011, at 12:04 AM, Rolf Turner wrote:
> On 12/10/11 08:31, Timothy Bates wrote:
>> To do matrix multiplication m x n, the rows and columns of m must be equal
>> to the columns and rows of n, respectively.
> No. The number of columns of m must equal the number of rows of n,
> that's a
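A quick illustration of the conformability rule in the correction (the matrices here are made up for the example):
A <- matrix(1:6, nrow = 2, ncol = 3)    # 2 x 3
B <- matrix(1:12, nrow = 3, ncol = 4)   # 3 x 4
A %*% B    # works: ncol(A) == nrow(B); the result is 2 x 4
## B %*% A would fail, because ncol(B) is 4 but nrow(A) is 2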
Hi,
?p.adjust
Christoph
2011/10/12 Cristina Ramalho
> Hi all,
>
> This is probably a very simple question but I cannot figure out how to do
> it. I run the fourthcorner method with my data and would like to adjust the
> p values for multiple comparisons using Holm correction. When I run the
>
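A minimal sketch of the Holm adjustment that ?p.adjust documents (the p-values here are made up):
p <- c(0.003, 0.012, 0.04, 0.2)    # hypothetical raw p-values
p.adjust(p, method = "holm")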
Dear R People:
Here is a really goofy question.
I have some objects which have 2 classes: data.frame and ucr.
Also, the classes will always be in that order.
I have tried all sorts of things, but to no avail.
listucrModels <- function(envir=.GlobalEnv, ...) {
objects <- ls(envir=envir, ...
names(lapply(.GlobalEnv, function(x) inherits(x, "ucr")))
HTH,
Josh
On Wed, Oct 12, 2011 at 12:46 AM, Erin Hodgess wrote:
> Dear R People:
>
> Here is a really goofy question.
>
> I have some objects which have 2 classes: data.frame and ucr.
>
> Also, the classes will always be in that order.
>
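A self-contained sketch of one way to do this (the class name "ucr" and the function name are from the thread; the body is an assumption, not necessarily the full answer given there):
listucrModels <- function(envir = .GlobalEnv) {
  nms <- ls(envir = envir)
  is.ucr <- vapply(nms,
                   function(nm) inherits(get(nm, envir = envir), "ucr"),
                   logical(1))
  nms[is.ucr]   # names of the objects that inherit from class "ucr"
}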
Hi Sandeep,
still missing an answer? Perhaps cross-check your post against the rules of
the posting guide and see what is missing here.
Anyway, depending on your OS, the multicore or snow/snowfall packages
may fit your needs - but you have to re-formulate your loop using an
adequate multicore *apply function.
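A hedged sketch of that re-formulation using the parallel package that now ships with R (mclapply is essentially the multicore interface; it forks, so it needs Unix - on Windows use parLapply with a cluster). process_file() and the input directory are hypothetical stand-ins for the body of the original loop:
library(parallel)
files <- list.files("data", full.names = TRUE)    # hypothetical input directory
results <- mclapply(files,
                    function(f) process_file(f),  # hypothetical per-file work
                    mc.cores = 4)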
On Wed, Oct 12, 2011 at 8:46 AM, Erin Hodgess wrote:
> Dear R People:
>
> Here is a really goofy question.
>
> I have some objects which have 2 classes: data.frame and ucr.
>
> Also, the classes will always be in that order.
>
> I have tried all sorts of things, but to no avail.
>
> listucrModels
Hi Michael,
Thanks for the reply, but still the problem exists. When I list the objects
and return the result, they get printed on the console but the objects do
not get created. I am really baffled and clueless as to what the problem is.
Divya
I have 2 series of variables and I want to plot the probability density function
of these 2 variables (i.e. two curves in one graph); I just want to compare
the distributions of these two variables.
What should I do?
Can I use the ggplot2 package?
On 10/11/2011 12:13 PM, Sandeep Patil wrote:
> I have an R script that consists of a for loop
> that repeats a process for many different files.
>
>
> I want to process this in parallel on a machine with
> multiple cores; is there any package for it?
>
> Thanks
...I mostly use the foreach package...
Hi,
the function fourthcorner() returns a list. You can access the p-values
shown in the summary via four1[["tabGProb"]]. This is a matrix containing the
p-values from the summary.
I did some copy and paste from the summary.4thcorner() function (which is
called by using summary() on your four1 object).
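Putting that together with the Holm question (a sketch; the exact slot layout can differ between ade4 versions):
pvals <- four1[["tabGProb"]]                      # matrix of raw p-values
pvals.holm <- matrix(p.adjust(as.vector(pvals), method = "holm"),
                     nrow = nrow(pvals), dimnames = dimnames(pvals))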
Show the work you did. What is happening, probably, is that you are returning
the values but not assigning them.
Sent from my iPad
On Oct 12, 2011, at 3:47, Divyam wrote:
> Hi Michael,
>
> Thanks for the reply, but still the problem exists. When I list the objects
> and return the result, t
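A tiny illustration of that point (the function is hypothetical):
f <- function() data.frame(x = 1:3)
f()        # the result is printed on the console and then discarded
d <- f()   # the result is assigned, so the object d exists afterwards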
>
> Hi Michael,
>
> Thanks for the reply, but still the problem exists. When I list the objects
> and return the result, they get printed on the console but the objects do
> not get created. I am really baffled and clueless as to what the problem is.
I bet you do not understand basic operati
Dear R-community,
When doing this:
> test<-data.frame(a=c(1,2,3))
> rbind(test$a, 3)
I expect something like:
> 1
> 2
> 3
> 2
but get:
>      [,1] [,2] [,3]
> [1,]    1    2    3
> [2,]    2    2    2
the same for:
rbind(test[["a"]], 2)
or
rbind(as.vector(test[["a"]]), 2)
or
rbind(t(as.vector(
Hi Dom,
This is because "3" is recycled. It is necessary because the number of
columns has to be the same in every row.
Try this instead:
c(test$a, 2) ## you wrote "3" but meant "2" I guess
HTH,
Ivan
Le 10/12/2011 10:47, behave a écrit :
Dear R-community,
When doing this:
test<-data.f
what are you going to do with the data? If just for presentation, then keep as
character. If you are going to compute on the data, then keep as numeric.
Since you are using floating point, FAQ 7.31 reminds you that the data "is
kept" as input, to the best that can be done with 53 bits of precision.
Hi Christoph,
Thank you soo much, this is great!
All the very best,
Cristina
On Wed, Oct 12, 2011 at 4:52 PM, Christoph Molnar <
christoph.mol...@googlemail.com> wrote:
> Hi,
>
> the function fourthcorners() returns a list. You can access the pvalues
> shown in the summary by four1[["tabGPro
On Wed, Oct 12, 2011 at 1:20 AM, Erin Hodgess wrote:
> Dear R People:
>
> I have the following set of data
>> Block[1:5]
> [1] "5600-5699" "6100-6199" "9700-9799" "9400-9499" "8300-8399"
>
> and I want to split at the -
>
>> strsplit(Block[1:5],"-")
> [[1]]
> [1] "5600" "5699"
>
> [[2]]
> [1] "610
gheine wrote on 10/11/2011 02:31:46 PM:
>
> An organization has asked me to comment on the validity of their
> recent all-employee survey. Survey responses, by geographic region,
> compared
> with the total number of employees in each region, were as follows:
>
> > ByRegion
>All.Emp
The quantmod package might be a good start.
http://cran.r-project.org/web/packages/quantmod/index.html
Regards,
Wolfgang Wu
----- Original Message -----
From: Yves S. Garret
To: r-help@r-project.org
Cc:
Sent: Wednesday, 12 October 2011, 2:29
Subject: [R] R and Forex
Hi all,
I am having the following problem. I want to calculate the maximum of each row
in a matrix. If I pass in the matrix split up by each column then this is no
problem and works great. However I don't know how many columns I have in
advance. In the example below I have 3 columns, but the number of c
In addition to Ivan's point: test$a is not a data.frame anymore but a numeric vector.
> class(test$a)
[1] "numeric"
> test$a
[1] 1 2 3
So adding a row to your data.frame would be
> rbind(test, 2)
a
1 1
2 2
3 3
4 2
Wolfgang Wu
----- Original Message -----
From: Ivan Calandra
To: r-help@r-
ok. I tested it in two ways. I want to externalise my odbcConnection details
(dsn, uid, and pwd), hence I created a csv file to hold this information.
Like I showed in the sample function initially, the order of the steps was
1) loading of the packages,
2) fetching the csv file,
3) assigning the dsn
Sorry, a correction to the first code and the result received:
ok. I tested it in two ways. I want to externalise my odbcConnection details
(dsn, uid, and pwd), hence I created a csv file to hold this information.
Like I showed in the sample function initially, the order of the steps was
1) loading
x=rnorm(100,1,0.8) # A series.
y=rnorm(100,0,0.5) # Another series with different mean and variance.
plot(density(x),ylim=c(0,1))
lines(density(y),col="red")
Remember that density() is a nonparametric estimator. You should properly
choose the bandwidth.
Hello all,
I have an ordered factor that I would like to include in the linear
predictor of a binomial glm, where the estimated coefficients are
constrained to be monotonic. Does anyone know how to do this? I've tried
using an ordered factor but this does not have the desired effect, an
(artifi
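One way to set the problem up (a sketch with made-up data, not a full solution): recode the ordered factor into cumulative "step" dummies, so each coefficient is the increment over the previous level. Plain glm() will not force the increments to be nonnegative; enforcing that still needs a constrained fit (e.g. a shape-constrained model or a constrained optimizer).
set.seed(1)
f <- ordered(sample(letters[1:4], 200, replace = TRUE))
y <- rbinom(200, 1, 0.5)
X <- sapply(levels(f)[-1], function(l) as.numeric(f >= l))  # step dummies
fit <- glm(y ~ X, family = binomial)   # each coefficient = step from previous level
coef(fit)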
You wrote:
"It may be best to either write to the package maintainer (me, as you
did) or post to the group but not both."
This is just a note that I disagree wrt my own packages:
I go on vacation or trips, or have other projects so won't always
answer
Other folks on the list often have g
Dennis Fisher wrote on 10/11/2011 07:20:35 PM:
>
> Colleagues,
>
> I am fitting an Emax model using nls. The code is:
>START <- list(EMAX=INITEMAX, EFFECT=INITEFFECT, C50=INITC50)
>CONTROL <- list(maxiter=1000, warnOnly=T)
>#FORMULA <- as.formula(YVAR ~ EMAX - EFFEC
Dear R-listers,
I have a little problem with a boxplot and I hope you can help me figure
it out.
I'll try to make up some data to illustrate the issue. Sorry, if my
procedures look naive, but these are my first steps in R. Any comments
and/or suggestions are very welcome.
let's create a vector
Terry I would just add that if someone contacts the maintainer and does
not follow the advice of the maintainer, still has problems, and posts a
message to the list, then some time is wasted.
Frank
On 10/12/2011 07:38 AM, Terry Therneau wrote:
You wrote:
"It may be best to either write to
Hi Wolfgang,
how about a loop?
matRandom <- matrix(runif(n=60), ncol=6)
## variant 1
system.time(test1 <- pmax(matRandom[,1], matRandom[,2], matRandom[,3],
matRandom[,4], matRandom[,5], matRandom[,6]))
   user  system elapsed
   0.01    0.00    0.01
#
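Two column-count-agnostic alternatives for the row maxima (a sketch; both are base R, and not necessarily the truncated second variant above):
test2 <- do.call(pmax, as.data.frame(matRandom))  # pmax across all columns at once
test3 <- apply(matRandom, 1, max)                 # simplest, usually slower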
Thank you, Greg. This indeed works well for this purpose.
> -Original Message-
> From: Greg Snow [mailto:greg.s...@imail.org]
> Sent: Tuesday, October 11, 2011 4:27 PM
> To: Doran, Harold; r-help@r-project.org
> Subject: RE: stop()
>
> Replace "stop()" with "break" to see if that does wha
My data frame consists of character variables, factors, and proportions,
something like
c1 <- c("A", "B", "C", "C")
c2 <- factor(c(1, 1, 2, 2), labels = c("Y","N"))
x <- c(0.5234, 0.6919, 0.2307, 0.1160)
y <- c(0.9251, 0.7616, 0.3624, 0.4462)
df <- data.frame(c1, c2, x, y)
pct <- function(x) roun
Comments inline below.
On 12/10/2011 7:47 AM, Divyam wrote:
sorry correction in the first and result hence received code:
ok. I tested it in two ways. I want to externalise my odbcConnection details
dsn, uid, and pwd. Hence I created a csv file to have these information.
Like I showed in the sa
George,
Perhaps the site of the RISQ project (Representativity indicators for
Survey Quality) might be of use: http://www.risq-project.eu/ . They
also provide R-code to calculate their indicators.
HTH,
Jan
Quoting ghe...@mathnmaps.com:
An organization has asked me to comment on the val
Hi,
if the columns in your data.frame are numeric, this solution will work.
(numeric.index <- unlist(lapply(df, is.numeric)))
df[, numeric.index] <- apply(df[,numeric.index], 2, pct)
This does not work for the example you gave, unless you coerce the columns
containing your numerics to numeric:
c1 <- c
Hi
>
> Yes thank you Gu…
> I am just trying to do this as a rough step and will try other
> imputation methods which are more appropriate later.
> I am just learning R, and was trying to do the for loop and
> if-statement by hand but something is going wrong…
>
> This is what I have until now:
>
plyr isn't necessary in this case. You can use the following:
cols <- sapply(df, is.numeric)
df[, cols] <- pct(df[,cols])
round (and therefore pct) accepts a data.frame and returns a
data.frame with the same dimensions. If that hadn't been the case
colwise might have been of help:
library(plyr)
>
> Dear R-listers,
>
> I have a little problem with a boxplot and I hope you can help me figuring
> it out.
> I'll try to make up some data to illustrate the issue. Sorry, if my
> procedures look naive, but these are my first steps in R. Any comments
> and/or suggestions are very welcome.
>
>
Thanks a lot Andrés.
It was easier than I expected.
f.
2011/10/12 Andrés Aragón
> Francesco,
>
> Try cex.axis=0.6
>
> Regards,
>
> Andrés AM
>
> 2011/10/12, Francesco Sarracino :
> > Dear R-listers,
> >
> > I have a little problem with a boxplot and I hope you can help me figuring
> > it out.
Hi Petr,
thanks a lot for your reply. Unfortunately, your suggestion does not work
for me.
I even tried larger boxes, such as 15,15, but the result does not change.
Is there some setting that I am missing?
However, once more thanks a lot for your help.
f.
On 12 October 2011 15:58, Petr PIKAL wrot
On Oct 11, 2011, at 2:28 AM, Steve Powell wrote:
> Dear all,
> I can't get the labels slot in ICLUST to accept a character vector.
> library(psych)
> test.data <- Harman74.cor$cov
> ic.out <- ICLUST(test.data,nclusters
> =4,labels=letters[1:ncol(test.data)]) ## Error in !labels : invalid
> argume
However, if I have an Excel file with 6 variables (a, b, c, d, e, f),
how do I plot the probability density function of a and d in one graph, and b and
e in another graph?
Hello;
Does anybody know whether R has a function for a Generalized Negative Binomial
model, something like "gnbreg" in Stata, where the dispersion parameter itself
is a function of covariates?
Thanks;
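One possibility (a sketch, not an answer taken from the thread): the gamlss package can model the dispersion (sigma) of a negative binomial as a function of covariates through its sigma.formula argument. The data frame and variable names below are made up:
library(gamlss)
fit <- gamlss(y ~ x1 + x2, sigma.formula = ~ x1, family = NBI, data = mydata)
summary(fit)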
This doesn't work because the rollapply is non-overlapping. My rolling step is
1 day and my rolling window is 1 year, so there are 364 days of overlap.
No, that's not what I meant. I was curious if anyone has ever done this
before and how well it worked. Any tips for a novice?
On Wed, Oct 12, 2011 at 12:19 AM, Liviu Andronic wrote:
> On Wed, Oct 12, 2011 at 3:29 AM, Yves S. Garret
> wrote:
> > Hi all,
> >
> > I recently started learning abo
Hi,
I'm working on a loop function for a large dataset which contains 1000
different groups. I would like to reconstruct the order of events within
each group by using a loop function in R. (Currently the order of events is
based on the ascending order of prev_event within the group)
A demo dat
Hi, everyone,
I am just trying to use Rcpp on my computer, and I would like to try a
simple example from the website, but R keeps reporting an error. I am using
Windows XP, and have installed Rtools and GSI.
Here is the response:
> src = '
+ Rcpp::NumericVector xa(a);
+ Rcpp::NumericVector xb(b);
+
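For reference, a complete version of the classic inline/Rcpp vector-addition sketch that this snippet resembles (it assumes the inline package and a working Rtools on the PATH, which is the usual stumbling block on Windows):
library(inline)
src <- '
  Rcpp::NumericVector xa(a);
  Rcpp::NumericVector xb(b);
  int n = xa.size();
  Rcpp::NumericVector out(n);
  for (int i = 0; i < n; i++) out[i] = xa[i] + xb[i];
  return out;
'
fun <- cxxfunction(signature(a = "numeric", b = "numeric"), src, plugin = "Rcpp")
fun(c(1, 2, 3), c(4, 5, 6))   # 5 7 9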
Thanks a lot Duncan! It works fine now.
Divya
When can I send stuff to the mailing list without having moderator approval?
Is that possible?
On Wed, Oct 12, 2011 at 12:19 AM, Liviu Andronic wrote:
> On Wed, Oct 12, 2011 at 3:29 AM, Yves S. Garret
> wrote:
> > Hi all,
> >
> > I recently started learning about Forex and found this O'Reilly
Francesco,
Try cex.axis=0.6
Regards,
Andrés AM
2011/10/12, Francesco Sarracino :
> Dear R-listers,
>
> I have a little problem with a boxplot and I hope you can help me figuring
> it out.
> I'll try to make up some data to illustrate the issue. Sorry, if my
> procedures look naive, but these ar
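A self-contained illustration of the suggestion, using a built-in data set:
boxplot(count ~ spray, data = InsectSprays, cex.axis = 0.6)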
I've also tried to make the function work for one particular group, then
apply the same function to the whole data frame with all groups using by()
or lapply() as follow. But I'm still receiving error messages. Could someone
please explain what is happening here?
dfdfdf <- function(localdata){
ord
Hey,
I need some help.
I want to obtain a cross validation for a regression model (binary response)
but I got an error with CVbinary. Well I did this:
fit <- lm(resp ~ PC1 + PC2 + PC3 + PC4 + PC5 + PC6 + PC7 + PC8 +
PC9+PC10+PC11+PC12+PC13+PC14+PC15+PC16+PC17+PC18+PC19+PC20+PC21+PC22+PC23+PC24
I want to plot a probability density function; predictvalue has missing
values and observevalue has no missing values. I tried:
attach(test) #test is the name of the data file
names(test)
plot(density(predictvalue,na.rm=TRUE))
lines(density(observevalue))
is it correct?
Assuming you mean you want them on the same device:
layout(1:2)
plot(density(a))
lines(density(d),col=2)
plot(density(b))
lines(density(e),col=2)
Getting your data into R is more of a challenge, but if you want my
unsolicited advice, you can do far worse than saving as CSV and using
read.csv()
M
Hi all,
I'm working on a loop function for a large dataset which contains 1000
different groups. I would like to reconstruct the order of events within
each group by using a loop function in R. (Currently the order of events is
based on the ascending order of prev_event within the group)
A demo
x11()
plot(density(a))
lines(density(d))
x11()
plot(density(b))
lines(density(e))
2011/10/12 pigpigmeow [via R] :
> however, if i have an excel file, but there have 6 variables, a,b,c,d,e,f.
>
>
> how to plot the probability density function of a and d in one graph, b and
> e in another graph?
>
>
Did you read the posting guide (it says R-help is not for questions
about compiled code)?
Or the rw-FAQ (it warned you that lots of contributed packages did not
take account of spaces in path names, so suggested you install R in a
path without one)?
This is a bug in one of the packages you a
Hi Gabor,
I'm looking for minimum cutsets in the igraph manual but I didn't
find the functions you mentioned above. Also, how can I see their source
code.
Thanks,
Mohammed
x11()
what does it mean?
If my data has missing values, can I plot the graph?
Hi everyone,
I have a large data set with about 3'000 columns and I would like to exclude
all columns which include three or more consecutive zeros (see below
example). A further issue is that it should simply skip any NA values. How
can I do this?
In the below example R should exclude col
Does anyone know if there is a method available to read in cluster files (cdt,
atr and gtr)? I found one method
(http://bioinformatics.holstegelab.nl/manuals/R/library/integromicsMethods/html/iMethods.read.tv.html),
but it doesn't seem to create an object that can be used by "heatmap."
The reas
"This" being what exactly?
Trading in FX using R? Yes, it's done every day, even as I type.
Michael
On Wed, Oct 12, 2011 at 8:10 AM, Yves S. Garret
wrote:
> No, that's not what I meant. I was curious if anyone has ever done this
> before and how well it worked. Any tips for a novice?
>
> On W
Hello,
I am running an ARMA model to forecast changes in S&P 500 prices.
My ARMA calculations look as follows
armacal <- arma( spdata, order = c(0,4), lag = list(ma = c(1,2,4)) )
Output:
Call:
arma(x = spdata, order = c(0, 4), lag = list(ma = c(1, 2, 4)) )
Coefficient(s):
ma1
Hello everybody,
is there any way to treat NA's as zero when they are summed up with numbers,
but to treat them as NA's when summed up only with other NA's? Specifically, I
want: 5+NA=5, but NA+NA=NA (and not zero).
Any ideas?
Best, S.B.
Please -- this is not your personal help advisor. Use R's Help facilities
before posting.
-- Bert
On Wed, Oct 12, 2011 at 8:20 AM, pigpigmeow wrote:
> x11()
> what does it mean?
>
?x11
>
> if my data has missing value, can I plot the graph?
>
Try it and see.
>
I am currently trying to make a forecast based on past observations of the
dependent variable AND external variables at the same time.
I know that ARIMAX allows you to do this, however when I use this function
it fits the model using the last k lags. What I actually want is to decide
on the best mo
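A minimal sketch of fitting with external regressors in base R (simulated data; stats::arima() takes them through xreg, and predict() then needs the future regressor values in newxreg):
set.seed(1)
x <- rnorm(120)                                   # external variable
y <- arima.sim(list(ar = 0.5), n = 120) + 0.8 * x # made-up dependent series
fit <- arima(y, order = c(1, 0, 0), xreg = x)
predict(fit, n.ahead = 5, newxreg = rnorm(5))     # needs future x values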
Maybe it was caused by your modeling a binary variable using lm rather than glm.
Weidong Gu
On Wed, Oct 12, 2011 at 9:59 AM, anamiguita wrote:
> Hey,
>
> I need some help.
>
> I want to obtain a cross validation for a regression model (binary response)
> but I got an error with CVbinary. Well I di
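A minimal sketch of that suggestion (CVbinary() from the DAAG package expects a glm fit with a binomial family; the data frame name and the shortened predictor set below are placeholders):
library(DAAG)
fit <- glm(resp ~ PC1 + PC2 + PC3, family = binomial, data = mydata)  # mydata is hypothetical
CVbinary(fit)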
Hi,
On Wed, Oct 12, 2011 at 11:23 AM, Akram Khaleghei Ghosheh balagh
wrote:
> Hello;
>
> Does anybody knows that R have a function for Generelized Negative Binomial
> model, something like "gnbreg" in "STATA" where dispersion parameter itself
> is a function of covaraites ?
Take a look at the ed
Dennis Fisher wrote on 10/12/2011 08:06:12 AM:
>
> jean
>
> initial values:
> INITEMAX <- -25
> INITEFFECT <- 25
> INITC50 <- 14
> GAMMA <- INITGAMMA <- 500
>
> see below for other issues.
>
> dennis
>
> Dennis Fisher MD
> P < (The "P Less
Hi Mohammed,
http://igraph.sourceforge.net/doc/R/graph.maxflow.html
For directed graphs, and s-t cuts you need the development version,
from igraph.sf.net. The source code is either here:
http://cran.r-project.org/web/packages/igraph/index.html
or here:
http://code.google.com/p/igraph/downloads/l
It's better to avoid loops in this situation. If you want to reorder
subsets of the data based on event, the following works:
df<-read.table('clipboard',header=TRUE)
sp.or<-lapply(split(df,df$group),function(ldf) ldf[order(ldf$event),])
new.df<-do.call('rbind',sp.or)
Weidong Gu
On Wed, Oct 12, 2011 a
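A self-contained version of the same idea (made-up demo data, since read.table('clipboard') is not reproducible):
demo <- data.frame(group = rep(c("g1", "g2"), each = 3),
                   event = c(3, 1, 2, 2, 3, 1))
sp.or  <- lapply(split(demo, demo$group), function(d) d[order(d$event), ])
new.df <- do.call(rbind, sp.or)
new.df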
First define a function that returns TRUE if a column
should be dropped. E.g.,
has3Zeros.1 <- function(x)
{
x <- x[!is.na(x)] == 0 # drop NA's, convert 0's to TRUE, others to FALSE
if (length(x) < 3) {
FALSE # you may want to further test short vectors
} else {
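An alternative sketch of the same test (not the truncated function above), using rle() after dropping NAs; "dat" stands for the user's data.frame:
has3Zeros <- function(x) {
  x <- x[!is.na(x)]              # skip NA values
  r <- rle(x == 0)
  any(r$values & r$lengths >= 3) # TRUE if some run of zeros has length >= 3
}
dat2 <- dat[, !vapply(dat, has3Zeros, logical(1)), drop = FALSE]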
On Oct 12, 2011, at 12:45 PM, Samir Benzerfa wrote:
Hello everybody,
is there any way to treat NA's as zero when they are summed up with numbers,
but to treat them as NA's when summed up only with NA's. Specifically want
that: 5+NA=5, but NA+NA=NA (and not zero).
sum(x , na.rm=TRUE)
Hello,
This is my solution. This is pretty fast (tested with a larger data set)! If
you have a more elegant way to do it (of similar speed), please reply.
Thanks for the help!
## get highest and lowest values and names of a matrix
# create sample data
x <- swiss$Education[1:25]
da
On Oct 12, 2011, at 10:55 AM, Sally Zhen wrote:
Hi all,
I'm working on a loop function for a large dataset which contains 1000
different groups. I would like to reconstruct the order of events within
each group by using a loop function in R.
Not generally a good first strategy in R.
(Cu
On Oct 12, 2011, at 1:33 PM, David Winsemius wrote:
On Oct 12, 2011, at 12:45 PM, Samir Benzerfa wrote:
Hello everybody,
is there any way to treat NA's as zero when they are summed up with numbers,
but to treat them as NA's when summed up only with NA's. Specifically want
that: 5+NA=
Hi,
I hope someone can help me with the following issue.
I need to find the minimum beta that satisfies the following:
inf{beta>0 | f(x+beta*f(x))*f(x)<=0}
where f() is a function and x is a sample statistic.
Functions such as "nlminb" and "constrOptim" minimize a function and output
the paramete
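A sketch of one way to locate it numerically, with a made-up f() and x: the smallest beta > 0 with f(x + beta*f(x)) * f(x) <= 0 is where g(beta) = f(x + beta*f(x)) * f(x) first changes sign, which uniroot() can find on an interval known to bracket that sign change:
f <- function(z) z^2 - 2              # hypothetical function
x <- 1                                # hypothetical statistic; here f(x) = -1
g <- function(beta) f(x + beta * f(x)) * f(x)
uniroot(g, lower = 1e-8, upper = 10)$root   # requires g to change sign on the interval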
Hi Gabor,
Thanks for the reply. I'm actually working on directed graphs and using
Windows. Please send me the Windows version of the source code.
Regards,
Mohammed
I have hunted around but cannot find the command which allows me to specify
parameters of a model.
For example,
model.m1 <- nls(y ~ alpha * x1/(beta + x1), data = data, start = list(beta =
20, alpha = 120), trace = TRUE)
This will estimate the parameters, which allows to investigate the
residual
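A sketch of the usual workaround (the variable names are from the question; the fixed parameter values are hypothetical): with the parameters specified in advance there is nothing for nls() to estimate, so evaluate the model formula directly and examine the residuals yourself:
alpha <- 120; beta <- 20                      # pre-specified values
fitted_y <- alpha * data$x1 / (beta + data$x1)
resid_y  <- data$y - fitted_y
plot(fitted_y, resid_y); abline(h = 0, lty = 2)
sum(resid_y^2)                                # residual sum of squares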
As the posting guide tells you, the first thing to try is to install the
latest version of R.
-Original Message-
From: roche...@free.fr
Sent: Tuesday, October 11, 2011 5:44 AM
To: r-help@r-project.org
Subject: [R] How to run Rcmdr with OS 10.4?
I've installed Rcmdr package and it do
On Oct 12, 2011, at 2:25 PM, BvZ wrote:
I have hunted around but cannot find the command which allows me to specify
parameters of a model.
For example,
model.m1 <- nls(y ~ alpha * x1/(beta + x1), data = data, start = list(beta =
20, alpha = 120), trace = TRUE)
This will estimate the par
Hello,
Does anyone know how to change the Tinn-R editor background color? White is
rough on the eyes...
Thanks,
Ben
Dear R People:
I am using R-2.13.2 on a Windows 7 machine.
I compiled from source on 32 bit a couple of weeks ago and the R.dll
got removed by my anti-virus software.
Same thing on 64 bit today.
Is anyone having this problem, please?
I'm using Norton AV.
Thanks,
Erin
--
Erin Hodgess
Associ
Never mind: option > color preference
Sorry...overlooked that 10 times I guess.
regards
On Wed, Oct 12, 2011 at 12:54 PM, Ben qant wrote:
> Hello,
>
> Does anyone know how to change the Tinn-R editor background color? White is
> rough on the eyes...
>
> Thanks,
> Ben
>
As was pointed out to you before, this is really more of an
R-SIG-Finance question, but I wouldn't expect too much explanation
there either, just people pointing you to the standard R finance tools
(quantmod, zoo/xts, TTR, RBloomberg, and the Rmetrics suite; there's
also some fantastic tools in dev
On 12/10/2011 3:03 PM, Erin Hodgess wrote:
Dear R People:
I am using R-2.13.2 on a Windows 7 machine.
I compiled from source on 32 bit a couple of weeks ago and the R.dll
got removed by my anti-virus software.
Same thing on 64 bit today.
Is anyone having this problem, please?
I'm using Norto
I tried this, but it does not seem to work.
What I am trying to do is really simple: I have a pre-specified best-fit
line, and wish to run some diagnostic tests for goodness of fit.
I will play around with the predict function, thanks a lot David!
Hi,
When I import an Excel "CSV" file, large numbers such as "43988014.3" are
imported as "43988014", leaving out the decimal ".3". How do I import them
keeping the fraction?
Thanks.
Chetty
--
Professor of Family Medicine
Boston University
Tel: 617-414-6221, Fax:617-414-3345
emails: chett...@gmail.com,v
Are you sure it's being imported into R without the decimal and that
it's not just a print option? I can't off the cuff think of a reason
why that would happen...
Try print(valueFromImport - 43988014) and see what you get
Michael
On Wed, Oct 12, 2011 at 3:16 PM, Veerappa Chetty wrote:
> Hi,
> Wh
Hi,
This happens when I read in large numbers;
> as.numeric(4398801.3)
[1] 4398801
> as.numeric(439880.3)
[1] 439880.3
Please help to read in numbers with more than 8 characters!
Thanks.
Chetty
--
Professor of Family Medic
Well that's different. Can you restart R and get the same error
message? If so, change print() to print.default() to get around the
class error for now.
And I just want to make sure of one thing: you did actually change
"valueFromImport" to whatever you named the output of the read.csv()
call, rig
I believe you were already answered.
Nothing is happening to your numbers. The default digits
used to *display* your numbers is too small to show all
the decimal places.
There's nothing to worry about; full precision is being used
for all calculations.
But if for some reason you'd like to see th
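Completing that thought with the standard display options (nothing package-specific):
x <- 43988014.3
print(x)               # the default getOption("digits") of 7 shows 43988014
print(x, digits = 12)  # 43988014.3
format(x, nsmall = 1)  # "43988014.3"
options(digits = 12)   # or raise the session-wide default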
Like I said on the other thread you started on this same issue, this
is just a print setting, not something with the data.
Consider this:
R> as.numeric(4398801.3) == 4398801.3
TRUE
If you want to change this, try
options(digits = 14) # or however many you want to see
And don't jump ship on t
Is this the result you are after, where the event numbers
(within a group) are sorted according to the event/prev_event
pairs (prev_event in a row matches event of the previous row)?
> ave(d, d$group, FUN=function(z) z[ match(tsort(z$prev_event, z$event)[-1],
> z$event), ])
event prev_event gr
Steve Lianoglou gmail.com> writes:
>
> Hi,
>
> On Wed, Oct 12, 2011 at 11:23 AM, Akram Khaleghei Ghosheh balagh
> gmail.com> wrote:
> > Hello;
> >
> > Does anybody knows that R have a function for Generelized Negative Binomial
> > model, something like "gnbreg" in "STATA" where dispersion para
Before coding this in C, I wanted to test the idea out in R.
But I'm unsure if the theory is well-founded.
I have a (user-supplied) black-box function which takes R^n -> R^3
and a defined domain for each of the input reals.
I want to send some samples through the box to determine an
approximatio
How about, more simply:
> mysum <- function(x)sum(x,na.rm = any(!is.na(x)))
> mysum(c(1,NA))
[1] 1
> mysum(c(NA,NA))
[1] NA
-- Bert
On Wed, Oct 12, 2011 at 11:03 AM, David Winsemius wrote:
>
> On Oct 12, 2011, at 1:33 PM, David Winsemius wrote:
>
>
>> On Oct 12, 2011, at 12:45 PM, Samir Benzerf
I think Enrico's solution is probably better overall and doesn't
require as much ugly behind-the-scenes trickery, but here's another
fun way that seems to run ever-so-marginally faster on my machine.
The vapply call is messy, but it seems to get the job done -- if it's
not clear, the point is to b
If you know nothing about the black box except that its domain is
bounded, then I would sample uniformly at random from the domain. If the
function is monotonically increasing in all variables, then you only
need to test the two extreme points. If you know other things, you may
be able to use th
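A minimal sketch of that uniform sampling (the black-box function and the domain bounds below are hypothetical stand-ins):
blackbox <- function(x) c(sum(x), prod(x), max(x))   # stand-in for the R^n -> R^3 box
lower <- c(0, 0, -1); upper <- c(1, 2, 1)            # hypothetical domain bounds
n <- 1000
X <- mapply(runif, n = n, min = lower, max = upper)  # n x length(lower) sample matrix
Y <- t(apply(X, 1, blackbox))                        # n x 3 matrix of outputs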