Hi everybody,

I am new to e1071 and to SVMs. I am trying to understand the performance
of SVMs, but I have run into a result that does not seem meaningful to me.

I have included the R code below so you can see what I did.

library(e1071)

set.seed(1234)
# 300 rows, 10 columns: 150 rows drawn around mean 10, 150 around mean 5
data <- data.frame(rbind(matrix(rnorm(1500, mean = 10, sd = 5), ncol = 10),
                         matrix(rnorm(1500, mean = 5,  sd = 5), ncol = 10)))
class <- as.factor(rep(1:2, each = 150))
data <- cbind(data, class)

# cross = 300 on 300 rows is leave-one-out cross-validation
tuned <- best.svm(class ~ ., data = data, kernel = "linear",
                  cost = seq(0.24, 0.44, by = 0.01),
                  tunecontrol = tune.control(cross = 300))
tuned

# test with the training data
predicts <- predict(tuned, data, probability = TRUE, decision.values = TRUE)
tab <- table(predicts, data$class)
tab

This is the output I get:
Parameters:
   SVM-Type:  C-classification 
 SVM-Kernel:  linear 
       cost:  0.26 
      gamma:  0.1 

Number of Support Vectors:  61

But when I refit the model with cost = 0.31, I get a lower misclassification
rate on the training data than I do with the selected cost = 0.26.
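To make the comparison concrete, here is a self-contained sketch of what I mean (the helper `train_err()` is my own name, not an e1071 function, and I use 10-fold rather than leave-one-out CV here just for speed). It refits the model at each cost and computes the training misclassification rate, then uses tune() to look at the cross-validation error that best.svm() actually minimizes:

```r
library(e1071)

# Same simulated data as above
set.seed(1234)
data <- data.frame(rbind(matrix(rnorm(1500, mean = 10, sd = 5), ncol = 10),
                         matrix(rnorm(1500, mean = 5,  sd = 5), ncol = 10)))
data$class <- as.factor(rep(1:2, each = 150))

# Training misclassification rate for a given cost
# (train_err is my own helper, not part of e1071)
train_err <- function(cost_value) {
  fit <- svm(class ~ ., data = data, kernel = "linear", cost = cost_value)
  mean(predict(fit, data) != data$class)
}

train_err(0.26)
train_err(0.31)

# tune() records the cross-validation error for every cost tried,
# which is the criterion best.svm() minimizes
tuned2 <- tune(svm, class ~ ., data = data, kernel = "linear",
               ranges = list(cost = seq(0.24, 0.44, by = 0.01)),
               tunecontrol = tune.control(cross = 10))
tuned2$performances   # columns: cost, error (CV error), dispersion
```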

Is this difference because the error minimized during tuning (the
cross-validation error) is not the same thing as the training
misclassification rate?

 Thanks in advance.



--
View this message in context: 
http://r.789695.n4.nabble.com/e1071-tuning-is-not-giving-the-best-within-the-range-tp4640747.html
Sent from the R help mailing list archive at Nabble.com.

______________________________________________
R-help@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
