Thanks Josh and Dan! I did figure it had something to do with the machine epsilon...
But so what do I do now? I'm calculating the total absolute error over thousands of tables, e.g.:

    tae <- sum(abs(obs - exp))

Is there an easy way to keep these ignorable errors from showing up?

And furthermore, why does this happen only sometimes? The two (2D) tables I attached are actually just one 'layer' of a 3D table, and only 2 out of about 400 layers had this happen; all the other layers are perfectly identical. And out of 2000 3D tables, about 60 of which should have no error at all, only 10 actually show an error of zero; in the rest the same thing happens in a few layers.

OK, this is a bit messy for a real question. I mean, I can just round down all the errors that are under 1e-8 or something, but I'd much rather this not happen in the first place.

Thanks again to the two posters for bothering with me!

Maja
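P.S. In case it makes the question clearer, this is roughly what I mean by "rounding down" (just a sketch, with obs and exp as above, and using sqrt(.Machine$double.eps), about 1.5e-8, as the cutoff since that is the default tolerance of all.equal()):

    tol <- sqrt(.Machine$double.eps)  # ~1.5e-8
    d <- abs(obs - exp)
    d[d < tol] <- 0                   # treat differences below the tolerance as exactly zero
    tae <- sum(d)

An alternative check would be isTRUE(all.equal(obs, exp)), which reports whether the two tables are equal up to a small (relative) tolerance without computing the error explicitly.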