Dear Sir/Madam:
Thank you for your attention. I am using getNodeSet from the XML package to crawl data
from web pages, and I need to save the result to a file. However, I have tried
write.table, write, and cat, but none of them could save the data. The error
messages are as follows:
> ac <- getNodeSet(article, "//div[@class='entry-content']") # note the case sensitivity
> write.table(ac[[1]], file="E:/学术资源网络分析/article.txt")
Error in as.data.frame.default(x[[i]], optional = TRUE) :
  cannot coerce class "c("XMLInternalElementNode", "XMLInternalNode", "XMLAbstractNode")" to a data.frame
> write(ac, file="E:/学术资源网络分析/article.txt")
Error in cat(list(...), file, sep, fill, labels, append) :
  argument 1 (type 'list') cannot be handled by 'cat'
I have also tried to coerce the result of getNodeSet into a data frame, but that does
not work either:
> tmp <- data.frame(ac)
Error in as.data.frame.default(x[[i]], optional = TRUE, stringsAsFactors = stringsAsFactors) :
  cannot coerce class "XMLNodeSet" to a data frame
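In case it helps, here is a minimal, self-contained sketch of what I am trying to achieve, per the posting guide. The toy HTML string and output path are placeholders of my own, and I am assuming that saveXML from the XML package can serialize a single node to a character string:

```r
library(XML)

# placeholder HTML standing in for the real page I am crawling
html <- '<html><body><div class="entry-content"><p>Hello</p></div></body></html>'
article <- htmlParse(html)

ac <- getNodeSet(article, "//div[@class='entry-content']")

# assumption: saveXML() turns an XMLInternalNode into a character string,
# which cat()/writeLines() can then write to a file
txt <- saveXML(ac[[1]])

# placeholder output path; xmlValue(ac[[1]]) would give the text content
# only, without the markup
out <- tempfile(fileext = ".txt")
writeLines(txt, out)
```

Is something along these lines the intended way to persist nodes from getNodeSet, rather than write.table or data.frame?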
I am really confused about how to save the data crawled with getNodeSet to a file, and
I wonder if you could give me some advice on solving this problem in R. I would be
most grateful if you could reply at your earliest convenience. Looking forward to
hearing from you. Thank you very much.
Sincerely yours,
Humphrey Zhao
______________________________________________
[email protected] mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.