If I understand correctly, this query is really about how to organize your data, not how to use R code to do it, though that may come later. Of course, the first question is why mess with Excel at all? But I shall assume you have good reason to get your data into Excel and do what you want with it there rather than in R. That being the case, the next question is why mess with your data in R at all -- I assume Excel has tools to extract data from files and organize it for analysis; that, presumably, is its purpose!
However, as the kids say, whatever... I assume the tables you describe come one per location. If I were doing this in R, I would organize the data ("tidy" style, to use Hadley's phrase) for each location into a data frame of three columns: Year, Month, Value. I would then combine all of the per-location data frames into one data frame with columns Location, Year, Month, Value, which could then be exported as a CSV if you like (a short sketch of that layout follows at the end of this message). But this likely depends on what you wish/need to do with the data in Excel, and possibly also on your Excel skills in creating/converting the data structures you need there, about which, of course, you have given no information. So you may need to provide further detail to get useful help. Or wait for someone smarter to reply.

Cheers,
Bert

Bert Gunter
"The trouble with having an open mind is that people keep coming along and sticking things into it." -- Opus (aka Berkeley Breathed in his "Bloom County" comic strip)

On Wed, Mar 28, 2018 at 9:32 AM, orlin mitov via R-help <r-help@r-project.org> wrote:
> Hello,
> I have no previous experience with R, but had to learn on the fly in the
> past couple of weeks. Basically, what I am trying to do is read a certain
> variable from a series of files and save it as a CSV table. The variable has
> an hourly value for each month in a year for the past 20 years and has to be
> read for different geographical locations. So there will be 12 files per year
> (one for each month), and the values for the variable from each file will
> number 696 to 744 (depending on how many days x 24 hours there were in the
> month). What I achieved is to read the values from all 12 files stored in a
> directory with a function and add them as vectors to a lapply list:
>
> library(ncdf4)  # for nc_open(), ncvar_get(), nc_close()
>
> Myfunction <- function(filename) {
>     nc  <- nc_open(filename)
>     lon <- ncvar_get(nc, "lon")
>     lat <- ncvar_get(nc, "lat")
>     RW  <- ncvar_get(nc, "X")   # projection coordinates, read but not used below
>     HW  <- ncvar_get(nc, "Y")
>     pt.geo <- c(8.6810, 50.1143)   # target point (longitude, latitude)
>     # index of the grid cell closest to the target point
>     dist <- sqrt((lon - pt.geo[1])^2 + (lat - pt.geo[2])^2)
>     ind  <- which(dist == min(dist, na.rm = TRUE), arr.ind = TRUE)
>     # full time series of "SIS" at that cell
>     sis <- ncvar_get(nc, "SIS", start = c(ind[1], ind[2], 1), count = c(1, 1, -1))
>     nc_close(nc)
>     as.vector(sis)   # return the hourly values as a plain vector
> }
>
> filenames <- list.files(path = "C:/Users/Desktop/VD/Solardaten/NC",
>                         pattern = "nc", full.names = TRUE)
> output <- lapply(filenames, Myfunction)
>
> And here start my problems with saving "output" as a CSV table. Output would
> contain 12 vectors of different lengths. I want to have them as 12 columns
> (one per month) in Excel, and each column should have as many row entries as
> there are values for that month. Whatever I tried with write.table, I was not
> able to achieve this (I tried converting the output to a matrix, also with no
> success). Please help! Or should I be trying to have the 12 elements as data
> frames and not vectors?
> This is how I want the table for each year to look - 12 columns and all the
> respective values in the rows (column names I can add by myself):
>
> Best regards,
> Orlin
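
To make the suggested layout concrete, here is a minimal sketch of the long ("tidy") format described in the reply above, using made-up location names and dummy values; the helper function, column names, and the write.csv call are illustrative assumptions, not something from the original posts.

make_location_df <- function(location, year) {
    # hypothetical helper: one row per month with a dummy value
    data.frame(Location = location,
               Year     = year,
               Month    = 1:12,
               Value    = rnorm(12))
}

locations <- c("LocationA", "LocationB")   # made-up names
all_data  <- do.call(rbind, lapply(locations, make_location_df, year = 2017))

head(all_data)
# write.csv(all_data, "all_locations.csv", row.names = FALSE)

With this shape, one location-year pair occupies twelve rows, and Excel's filters or pivot tables can reshape it however needed.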
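
As for the question quoted above -- writing 12 vectors of different lengths as 12 side-by-side CSV columns -- one common approach (a sketch, not something taken from the thread) is to pad the shorter vectors with NA up to the length of the longest one, bind them into a matrix, and write that with write.csv. The 'output' list below is simulated to stand in for the real lapply(filenames, Myfunction) result.

# simulated stand-in for the real 'output' (4 months shown for brevity)
set.seed(1)
output <- lapply(c(744, 672, 744, 720), runif)

n_max  <- max(lengths(output))                         # longest month
padded <- lapply(output, function(x) c(x, rep(NA, n_max - length(x))))
mat    <- do.call(cbind, padded)                       # one column per month
colnames(mat) <- paste0("Month_", seq_along(output))   # placeholder names

write.csv(mat, "year_values.csv", row.names = FALSE, na = "")

The na = "" argument leaves the trailing cells of the shorter months empty in Excel, so each column shows only as many values as that month actually has.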