Hi Dave,
I ran into this problem with Windows 7 and 8. I used the latest
versions of R and ncdf as taken from CRAN.
I found my way around this problem by breaking up the files, but it is a
pity that ncdf has this bug, as it is quite a useful package.
Thanks,
Camilo
Camilo Mora, Ph.D.
Departmen
Dear R users,
I have written a couple of R functions, some of them with the help of the
R group members. However, running them takes days instead of minutes or
a few hours. I am wondering whether there is a quicker way of doing this.
Here are all my R functions. The last one calls almost all of
That worked great!
Best regards,
Chris
At 10:28 19/08/2013, Laz wrote:
Dear R users,
I have written a couple of R functions, some of them with the help of
the R group members. However, running them takes days instead of
minutes or a few hours. I am wondering whether there is a quicker way
of doing this.
Your example code is rather
... and read the "R Language Definition" manual. I noticed unnecessary
constructs
(e.g., z <- f(something); return(z)) that suggest you have more basics
to learn to write efficient, well-structured R code.
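For instance, a minimal illustration of that construct (a hypothetical
function, not code from the original post):

  f_verbose <- function(x) {
    z <- sqrt(x)   # unnecessary intermediate assignment
    return(z)      # unnecessary explicit return()
  }

  f_lean <- function(x) sqrt(x)   # the last evaluated expression is returned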
-- Bert
On Mon, Aug 19, 2013 at 3:55 AM, Michael Dewey wrote:
> At 10:28 19/08/2013, Laz w
Yes Bert, I am a beginner at writing R functions. I just don't know what
to avoid or what to use in order to make my R functions faster.
When I run the individual functions, they run quite well.
However, when I call all of them from the final function, it becomes too slow.
So I don't know how to m
Greetings,
I am a newbie too. I will share what I normally do to speed up code.
1. Avoid defining too many variables (global or local).
2. Use the apply functions (apply, sapply, lapply, tapply, etc.) whenever
feasible (see the sketch below).
3. Having many separate user-defined functions doesn't help. Try to compact
everyt
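A minimal sketch of point 2, using a made-up data frame 'd' (an assumption
for illustration only):

  d <- data.frame(a = rnorm(100), b = rnorm(100), c = rnorm(100))

  # loop version
  means <- numeric(ncol(d))
  for (i in seq_len(ncol(d))) means[i] <- mean(d[[i]])

  # apply-family version: shorter, and usually at least as fast
  means2 <- sapply(d, mean)

  # for this particular task, colMeans(d) is faster still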
On 13-08-17 7:05 PM, Xiao He wrote:
Hi dear R-users,
I encountered an interesting pattern. Take, for example, the function
combn(): I copied and pasted the function definition and saved it as a new
function named combn2() (see the end of this email). As it turned out,
combn2() seems to be substant
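One common cause of this kind of gap, offered here only as a guess, is that
base functions ship byte-compiled while a pasted copy is not;
compiler::cmpfun() makes that easy to test. A sketch (combn2 assumed to be
the verbatim copy described above):

  library(compiler)
  combn2_cmp <- cmpfun(combn2)

  system.time(combn(25, 5))       # base function, byte-compiled
  system.time(combn2(25, 5))      # plain copy, interpreted
  system.time(combn2_cmp(25, 5))  # byte-compiled copy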
On 13-08-18 7:57 AM, Jim Lemon wrote:
Hi all,
In trying to write an Rd file for a new package, I was stumped by
something that is probably quite simple. I have % characters in the
examples of one Rd file. Both my previous experience and some searching
agreed that one can escape % with \. This wor
1. Keeping the number of variables down encourages you to structure your data,
which allows you to re-use code more efficiently. However, I don't believe that
the number of variables intrinsically slows down your code significantly; e.g.,
storing a computed value in a local variable is almost al
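A tiny illustration of that last point (a made-up computation, for
illustration only):

  f_recompute <- function(x) (sum(x) / length(x)) * (sum(x) / length(x))

  f_local <- function(x) {
    m <- sum(x) / length(x)   # computed once, stored in a local variable
    m * m                     # the extra variable itself costs essentially nothing
  }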
Greetings,
Thanks Jeff. I appreciate your 'to the point' explanation. Will read into
it more.
Best,
Heramb Gadgil
2013/8/19 Jeff Newmiller
> 1. Keeping the number of variables down encourages you to structure your
> data, which allows you to re-use code more efficiently. However, I don't
> be
> I am a newbie too. I will share what I normally do to speed up code.
>
> 1. Avoid defining too many variables (global or local).
> 2. Use the apply functions (apply, sapply, lapply, tapply, etc.) whenever feasible.
> 3. Having many separate user-defined functions doesn't help. Try to compact
> everyt
Have used the suggested solution (the second method, as it seems a bit
leaner) on my real data, and it works perfectly. Thanks.
dear R experts---I was programming a fama-macbeth panel regression (a
fama-macbeth regression is essentially T cross-sectional regressions, with
statistics then obtained from the time-series of coefficients), partly
because I wanted faster speed than plm, partly because I wanted some
additional fea
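For readers less familiar with the procedure, a bare-bones sketch of the
two-pass idea (hypothetical column names date, y and x; this is not Ivo's
actual implementation):

  fm_sketch <- function(d) {
    # stage 1: one cross-sectional OLS per time period
    coefs <- t(sapply(split(d, d$date),
                      function(di) coef(lm(y ~ x, data = di))))
    # stage 2: summarise the time series of estimated coefficients
    list(estimate = colMeans(coefs),
         se       = apply(coefs, 2, sd) / sqrt(nrow(coefs)))
  }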
I tried to run a simple example with the function solnp() from Rsolnp: find
max(x+y) on the unit circle x*x+y*y=1. The answer should be x=y=1/sqrt(2)
with max=sqrt(2).
Here are my code and results
> # test Rsolnp
> library(Rsolnp)
> fn1 <- function(x)
Hi,
It is better to ?dput() your example dataset.
temp<-
list(structure(c(16.91569, 190.89517, 169.70239, 162.14503, 190.8952,
2573.2651, 1966.6202, 1829.3142, 169.7024, 1966.6202, 2335.3354,
1676.7749, 162.145, 1829.314, 1676.775, 2106.543), .Dim = c(4L,
4L), .Dimnames = list(NULL, NULL)), st
On 19-08-2013, at 18:24, William Dunlap wrote:
>> I am a newbie too. I will share what I normally do to speed up code.
>>
>> 1. Avoid defining too many variables (global or local).
>> 2. Use the apply functions (apply, sapply, lapply, tapply, etc.) whenever feasible.
>> 3. Having many separate user
Bad starting value.
Try
c(3/5,4/5)
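For concreteness, a sketch of what the complete call might look like with
that starting value (the original definition of fn1 was cut off above, so
this reconstruction only assumes the problem as stated):

  library(Rsolnp)

  fn1  <- function(x) -(x[1] + x[2])     # solnp() minimises, so negate x + y
  eqn1 <- function(x) x[1]^2 + x[2]^2    # equality constraint function

  res <- solnp(pars = c(3/5, 4/5), fun = fn1, eqfun = eqn1, eqB = 1)
  res$pars   # should be close to c(1, 1)/sqrt(2)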
On Aug 19, 2013, at 1:33 PM, wrote:
> I tried to run a simple example with the function solnp() from Rsolnp: find
> max(x+y) on the unit circle x*x+y*y=1. The answer should be x=y=1/sqrt(2)
> with max=sqrt(2).
>
> Here are my code and results
>
> #
Dear all,
I am using cv.glmnet in R and I have the following question: the default is
10-fold cross-validation, but it is not clear to me how many times this is
repeated. Is it 50 repeats?
I am sorry if my question turns out to be very easy for some people!
Many thanks in advance,
Samuel
Ivo:
I may not get your question, but you seem to be confusing the name of
an object, which is essentially a pointer into memory and a language
construct (correction requested if I have misstated!), with the
"names" attribute of (some) objects. You can, of course, attach a
"lab" or (whatever)
On Aug 19, 2013, at 9:45 AM, ivo welch wrote:
> dear R experts---I was programming a fama-macbeth panel regression (a
> fama-macbeth regression is essentially T cross-sectional regressions, with
> statistics then obtained from the time-series of coefficients), partly
> because I wanted faster spe
This is an R and operating-system related question. I am not able to submit an
R program to be executed in batch mode by Windows. I have confirmed the
location of the R.exe file. I am following the directions I found in Quick-R,
but neither of the following works from a DOS prompt ...
D:\chip\Pr
On Aug 19, 2013, at 12:48 PM, David Winsemius wrote:
>
> On Aug 19, 2013, at 9:45 AM, ivo welch wrote:
>
>> dear R experts---I was programming a fama-macbeth panel regression (a
>> fama-macbeth regression is essentially T cross-sectional regressions, with
>> statistics then obtained from the ti
Thank you. But ugh... sorry for my HTML post, and sorry again for
having been obscure in my attempt to be brief. Here is a working
program.
fama.macbeth <- function( formula, din ) {
fnames <- terms( formula )
dnames <- names( din )
stopifnot( all(dimnames(attr(fnames, "factors"))[[1]] %i
On Sat, 17-Aug-2013 at 05:09PM -0700, Jeff Newmiller wrote:
|> In most threaded multitasking environments it is not safe to
|> perform IO in multiple threads. In general you will have difficulty
|> performing IO in parallel processing so it is best to let the
|> master hand out data to worker tas
On Aug 19, 2013, at 16:05, ivo welch wrote:
> thank you. but uggh...sorry for my html post. and sorry again for
> having been obscure in my attempt to be brief. here is a working
> program.
>
> fama.macbeth <- function( formula, din ) {
I think most users would expect 'din' to be 'data' he
When I want to manipulate expressions, including formulae, I first
think of things like bquote() and substitute(). E.g.,
> for(nm in lapply(c("x","z"), as.name)) {
fmla <- formula( bquote( y ~ .(nm) ))
print(fama.macbeth(fmla, din=d))
}
(Intercept) x
-0.02384804 0.181
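The substitute() route mentioned above works much the same way; a sketch
(hypothetical helper name, not from the original message):

  make_fmla <- function(v) eval(substitute(y ~ V, list(V = as.name(v))))
  make_fmla("x")   # y ~ x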
Dear All,
My project requires the use of a specific R package. However, this R
package has been removed from CRAN, although an older version can still be
found. Unfortunately, the older version cannot be used. The thing is, after I
downloaded the older version and unzipped it into the library folder of R,
and
> -Original Message-
> From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org]
> On Behalf Of Shang Zuofeng
> Sent: Monday, August 19, 2013 1:26 PM
> To: r-help@r-project.org
> Subject: [R] How to rebuild an R package that has been removed from CRAN?
>
> Dear All,
>
> My
Thanks, Dan!
The package is "assist", which can be downloaded from the following link:
http://cran.r-project.org/src/contrib/Archive/assist/
The one I chose was assist_3.1.2.tar.gz.
I renamed this file to .zip and installed it from a local directory
through R. However, this method is still not
The file you had, assist_3.1.2.tar.gz, was not a Windows binary zip file. It
was a source tarball. That kind of file needs to be built and installed
differently. In order to do that, you need to have all the tools necessary for
building packages. This package does not have just pure R code i
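Assuming the build tools are in place (Rtools on Windows, plus any
compilers the package's non-R code needs), the usual in-R route for a
source tarball is:

  install.packages("assist_3.1.2.tar.gz", repos = NULL, type = "source")

The equivalent from a command prompt is R CMD INSTALL on the same file.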
On 20.08.2013 01:06, Daniel Nordlund wrote:
The file you had, assist_3.1.2.tar.gz, was not a Windows binary zip file. It
was a source tarball. That kind of file needs to be built and installed
differently. In order to do that, you need to have all the tools necessary for
building packages
I don't know... I suppose it depends on how it fails. I recommend that you
restrict yourself to using only the data that was passed as parameters to your
parallel function. You may be able to tackle parts of the task and return only
those partial results to confirm how far through the code you can
On Aug 19, 2013, at 4:12 PM, Uwe Ligges wrote:
>
> On 20.08.2013 01:06, Daniel Nordlund wrote:
>> The file you had, assist_3.1.2.tar.gz, was not a Windows binary zip file.
>> It was a source tarball. That kind of file needs to be built and installed
>> differently. In order to do that, you
The R high performance computing sig might be useful for some of these
questions.
https://stat.ethz.ch/mailman/listinfo/r-sig-hpc
Dan
Daniel Nordlund
Bothell, WA USA
> -Original Message-
> From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org]
> On Behalf Of Jeff
Yeah, I tried building the package and got essentially the same warnings and
decided that further assistance required someone above my pay grade. :-)
Daniel Nordlund
Bothell, WA USA
> -Original Message-
> From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org]
> On Beh
> further assistance required someone above my pay grade. :-)
Perhaps you just need someone above your age - that is FORTRAN IV syntax
(maybe I and II as well), which was finally removed from the standard in
Fortran 95. If I recall correctly from 1966, you can replace the
ASSIGN 30 TO NEX
Hello - my message was bounced, but I cannot tell why. Please let me know what
I need to do to post successfully / overcome the bounce.
-Original Message-
From: kevin.shaney [mailto:kevin.sha...@rosetta.com]
Sent: Monday, August 19, 2013 8:35 PM
To: r-help@r-project.org
Subject: arules LHS & RHS AND Cond
On Aug 19, 2013, at 5:07 PM, Shang Zuofeng wrote:
> So this is an alternative method: the package can be installed from source
> rather than rebuilt. Although the warnings exist, the package itself may
> still be useful. Can you let me know how to install it from source?
>
Quite a bit of effo
Hello all,
We would like to use R to run BEKK-GARCH on two-group to four-group financial
data sets. However, the package "rmgarch" seems to support only DCC-GARCH,
not BEKK-GARCH. Any suggestions on using R to run a BEKK-GARCH model?
Thanks a lot, and best,
Hong Yu
x=rchisq(100,1)
density(x)
The density plot will give density for the negative part as well. Of course I
can truncate the plot to view only the non-negative part.
I wonder if there is a function to compute the density over a user-specified
range, in this case only [0, infinity).
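One partial option worth knowing about: density() accepts 'from' and 'to'
arguments that restrict where the estimate is evaluated (this alone does not
correct boundary bias at 0). A sketch:

  x  <- rchisq(100, 1)
  d0 <- density(x, from = 0)   # evaluate the estimate only from 0 upwards
  plot(d0)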
Dear R-users,
I have a shapefile (.shp, .sbx, .sbn, .shx and .dbf).
Can I create the spatial weights inside R?
Thanks,
Sebastián.
Hello,
Before asking, did you search on the Internet using a web search engine?
Regards,
Pascal
2013/8/20 Sebastian Kruk
> Dear R-users,
>
> I have a shape file (.shp, sbx, .sbn, .shx and .dbf).
>
> Can I create the Spatial Weights inside R?
>
> Thanks,
>
> Sebastián.
>
Hi,
I am able to proceed like this.
Input :
4000 36 36 0 r-x-- java
40108000 8 8 8 rwx-- java
4010a000 12 0 0 -[ anon ]
4010d0001016 44 44 rwx--[ anon ]
4020b000 12
On 20/08/2013 03:35, Gallon Li wrote:
x=rchisq(100,1)
density(x)
The density plot will give density for the negative part as well. Of course I
can truncate the plot to view only the non-negative part.
I wonder if there is a function to compute the density over a user-specified
range, in this case only [0
On Aug 19, 2013, at 7:35 PM, Gallon Li wrote:
> x=rchisq(100,1)
> density(x)
>
> The density plot will give density for the negative part as well. Of course I
> can truncate the plot to view only the non-negative part.
>
> I wonder if there is a function to compute the density over a user-specified
> range,
So this is an alternative method: the package can be installed from
source rather than rebuilt. Although the warnings exist, the package
itself may still be useful. Can you let me know how to install it from
source?
Thanks a lot!
Zuofeng
2013/8/19 David Winsemius
>
> On Aug 19, 2013, at 4:12
I wrap functions to run via multicore with tryCatch() to gather stats on
the failure rate and to capture state.
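A minimal sketch of that wrapping pattern (the worker do_one and the input
list tasks are hypothetical placeholders):

  library(parallel)

  safe_one <- function(task) {
    tryCatch(
      list(ok = TRUE, value = do_one(task)),
      error = function(e) list(ok = FALSE, error = conditionMessage(e), task = task)
    )
  }

  res      <- mclapply(tasks, safe_one, mc.cores = 4)
  failures <- Filter(function(r) !r$ok, res)   # failure rate and captured state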
I'm still interested in how/whether the core functions were verified as being
thread-safe.
Bill Hopkins
Written using a virtual Android keyboard...
-- Original message --
From: Jef
I am running permutation regressions in package lmPerm using lmp(). I
am getting what I find to be confusing results and I would like help
understanding what is going on. To illustrate my problem, I created a
simple example and am running lmp() such that the results of the lmp()
models should be
On 20/08/2013 03:13, Hopkins, Bill wrote:
I wrap functions to run via multicore with tryCatch() to gather stats on
the failure rate and to capture state.
I'm still interested in how/whether the core functions were verified as being
thread-safe.
What does 'threads' have to do with this? Multicore forks