Hi Jessica,

Talking may help you, but it's of little use to the thousands of readers of R-help unless you provide a reproducible example and full context. We'd like to help, but we can't without adequate information.
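For instance, something self-contained along these lines would let readers reproduce what you're seeing. This is only a sketch using kernlab's built-in spam data as a stand-in, since we can't see your data, your splits object, or your isExplosive() helper:

library(kernlab)

data(spam)                    # example data shipped with kernlab
set.seed(42)                  # prob.model = TRUE involves random CV folds
train <- spam[sample(nrow(spam), 300), ]

fit <- ksvm(type ~ ., data = train, scaled = TRUE,
            kernel = "polydot", type = "C-svc", C = 1,
            kpar = list(degree = 5, scale = 1, offset = 1),
            prob.model = TRUE)

fit                           # training error reported by the ksvm object
prob <- predict(fit, train, type = "probabilities")
head(prob)                    # compare against train$type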
Sarah

On Thu, Nov 15, 2012 at 9:48 AM, Jessica Streicher
<j.streic...@micromata.de> wrote:
> Now I let it run for one specific set and got the same bad result, then I
> deactivated the probabilities and got a good result, then I activated the
> probabilities again and got a good result .. huh???
>
> On 15.11.2012, at 15:32, Jessica Streicher wrote:
>
>> It's not scaling.. so..
>>
>> I guess I'll stay severely frustrated, and yes, I know this is probably
>> not enough information for anyone to help.
>> Still, talking helps ;)
>>
>> On 15.11.2012, at 15:15, Jessica Streicher wrote:
>>
>>> With
>>>
>>> pred.pca <- predict(splits[[i]]$pca, trainingData@samples)[, 1:nPCs]
>>> dframe <- as.data.frame(cbind(pred.pca,
>>>                               class = isExplosive(trainingData, 2)))
>>> results[[i]]$classifier <- ksvm(class ~ ., data = dframe, scaled = TRUE,
>>>                                 kernel = "polydot", type = "C-svc", C = C,
>>>                                 kpar = list(degree = degree, scale = scale,
>>>                                             offset = offset),
>>>                                 prob.model = TRUE)
>>>
>>> and a degree of 5, I get an error of 0 reported by the ksvm object. But
>>> when doing
>>>
>>> pred.pca <- predict(splits[[i]]$pca, trainingData@samples)[, 1:nPCs]
>>> pred.svm <- kernlab::predict(results[[i]]$classifier, pred.pca,
>>>                              type = "probabilities")
>>> results[[i]]$trainResults$predicted <- pred.svm[, 2]
>>>
>>> the results vary widely from the class vector. Nearly all predictions are
>>> somewhere around 0.29. It's just strange, and I have no idea where things
>>> go wrong. They're in the same loop over i, so it's probably not an
>>> indexing issue.
>>>
>>> Maybe kernlab's predict doesn't scale the data or something?
>>>

--
Sarah Goslee
http://www.functionaldiversity.org
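P.S. One thing worth checking for the "bad result, then suddenly good result" behaviour: with prob.model = TRUE, ksvm() fits its class-probability model via a 3-fold cross-validation on the training data, and those folds are drawn at random, so repeated fits on identical data can produce different probability estimates unless you fix the seed first. A rough sketch of how to test this, again with the spam data standing in for yours:

library(kernlab)

data(spam)
train <- spam[1:200, ]
kp <- list(degree = 2, scale = 1, offset = 1)   # fixed kernel parameters

## Two fits on identical data and settings:
fit1 <- ksvm(type ~ ., data = train, kernel = "polydot", type = "C-svc",
             C = 1, kpar = kp, prob.model = TRUE)
fit2 <- ksvm(type ~ ., data = train, kernel = "polydot", type = "C-svc",
             C = 1, kpar = kp, prob.model = TRUE)

## The probability estimates usually differ between the two fits,
## because the internal cross-validation folds differed:
p1 <- predict(fit1, train, type = "probabilities")
p2 <- predict(fit2, train, type = "probabilities")
max(abs(p1 - p2))

## Fixing the seed before each fit should make them reproducible:
set.seed(1)
fit3 <- ksvm(type ~ ., data = train, kernel = "polydot", type = "C-svc",
             C = 1, kpar = kp, prob.model = TRUE)
set.seed(1)
fit4 <- ksvm(type ~ ., data = train, kernel = "polydot", type = "C-svc",
             C = 1, kpar = kp, prob.model = TRUE)
identical(predict(fit3, train, type = "probabilities"),
          predict(fit4, train, type = "probabilities"))

If the probabilities stabilize once the seed is fixed, the internal cross-validation, not scaling, is the likely culprit.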