I routinely write graphics into multi-page PDFs, but some graphics (e.g.
plots of large spatial datasets using levelplot()) can result in enormous
files. I'm curious whether there is a better way. For example:
# First, make some data:
library(lattice)
d <- expand.grid(x = 1:1000, y = 1:1000)
d$z <- rnorm(nrow(d))
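One workaround, if your lattice version provides panel.levelplot.raster(), is to rasterise the image panel so the PDF stores a single bitmap instead of one vector rectangle per cell; another is simply to write that page to a bitmap device. A rough sketch, not the code from the original post:

library(lattice)
d <- expand.grid(x = 1:1000, y = 1:1000)   # same toy data as above
d$z <- rnorm(nrow(d))

# Rasterised panel: the PDF stays small because the panel is one image
# (requires a lattice version that ships panel.levelplot.raster).
pdf("levelplot_raster.pdf")
print(levelplot(z ~ x * y, data = d, panel = panel.levelplot.raster))
dev.off()

# Alternative: write this page as a bitmap instead of a PDF page.
png("levelplot_page.png", width = 1200, height = 1200, res = 150)
print(levelplot(z ~ x * y, data = d))
dev.off()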
Greetings all,
First of all, thanks to all of you for creating such a useful, powerful
program.
I regularly work with very large datasets (several GB) in R on 64-bit Fedora
8 (details below). I'm lucky to have 16 GB of RAM available. However, if I am
not careful and load too much into R's memory, I
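One way to keep an eye on this from within R is roughly the following (base R only, just a sketch; the matrix is a stand-in for a real dataset):

x <- matrix(rnorm(1e7), ncol = 100)       # stand-in for a large object
print(object.size(x), units = "MB")       # size of a single object
sort(sapply(ls(), function(nm) object.size(get(nm))), decreasing = TRUE)
gc()                                       # report memory use and collect garbage
rm(x); gc()                                # drop the object and reclaim the memory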
Greetings all,
This isn't a request for help, but I thought the following article from the
New York Times would be of interest to you all.
Enjoy!
Adam
*Data Analysts Captivated by R's Power*
By ASHLEE VANCE
Published ONLINE: January 6, 2009
URL:
http://www.nytimes.com/2009/01/07/technology/bu
packages are in
/tmp/RtmpaoUZiC/downloaded_packages
Updating HTML index of packages in '.Library'
Warning messages:
1: In install.packages("msm", lib = "/usr/lib64/R/library") :
installation of package 'msm' had non-zero exit status
2: In tools:::un
,
[INCDIR_NETCDF_H=TRUE], [INCDIR_NETCDF_H=FALSE])
if test "${INCDIR_NETCDF_H}" = TRUE; then
HAVE_NETCDF_H=TRUE
fi
fi
I've tried fiddling around with this, and then typing
#autoconf configure.ac > newconfigure
sh ./newconfigure
But it always ends the same:
checking for main
accab <- 0                        # acceptance count for the (a, b) update
accs  <- 0                        # acceptance count for the s update
A <- B <- S <- rep(NaN, n)        # storage for the three chains
for (i in 1:n) {
    z <- sampleab(x, y, a, b, s, da, db)   # update (a, b)
    q <- samples(x, y, a, b, s, ds)        # update s
    A[i] <- a <- z$a
    B[i] <- b <- z$b
    S[i] <- s <- q$s
    accab <- accab + z$acc
    accs  <- accs + q$accs
}
invisible(list(a = A, b = B, s = S, accab = accab / n, accs = accs / n))
}
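A self-contained toy version of the same accept-and-count bookkeeping, using a plain random-walk Metropolis step for a single parameter (sampleab() and samples() above are not shown here, so this sketch does not use them):

set.seed(1)
x  <- rnorm(50, mean = 2)                    # toy data
n  <- 5000
mu <- 0                                      # current state of the chain
acc <- 0                                     # acceptance count
MU  <- rep(NaN, n)                           # storage, as for A, B, S above
loglik <- function(m) sum(dnorm(x, mean = m, sd = 1, log = TRUE))
for (i in 1:n) {
    prop <- mu + rnorm(1, sd = 0.5)          # random-walk proposal
    if (log(runif(1)) < loglik(prop) - loglik(mu)) {   # flat prior
        mu  <- prop
        acc <- acc + 1
    }
    MU[i] <- mu
}
acc / n                                      # acceptance rate, like accab/n above
mean(MU[-(1:500)])                           # posterior mean after burn-in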
Cheers,
Ted
Dept. of Biology,
Greetings all,
I am running R 2.5.1, RMySQL 0.6, and DBI 0.2-3 on Windows XP.
Like others, I am having trouble with NA/NULL value conversions between R
and a MySQL database via DBI, but I could not find my exact problem in the
archives. Most of the time NA values in R get transferred correctly t
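A stripped-down sketch of the kind of round trip involved (the connection details are placeholders and the test table name is made up):

library(DBI)
library(RMySQL)

con <- dbConnect(MySQL(), dbname = "test", user = "user",
                 password = "password", host = "localhost")

df <- data.frame(id = 1:3, val = c(1.5, NA, 2.5))     # one NA to round-trip
dbWriteTable(con, "na_test", df, overwrite = TRUE, row.names = FALSE)

back <- dbGetQuery(con, "SELECT * FROM na_test")      # NA should come back as NA
print(back)                                            # if it was stored as NULL

dbDisconnect(con)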