Re: [Question] Implementing a linear SVM

作者: empireisme (empireisme)   2020-03-31 00:36:17
※ Quoting empireisme (empireisme):
: https://imgur.com/uImTpvF
: Hi everyone,
: I have recently been implementing a linear SVM,
: using the gradient descent method taught by Prof. Hung-yi Lee at NTU to find the weights.
: The details are as follows:
: https://imgur.com/uImTpvF
: But my result seems to be wrong.
: https://imgur.com/j2uME6R
: As shown in the figure, the support vectors should pass through points A and B. I don't know where I went wrong QQ
: Hoping someone can help me find the problem.
: Here is my code:
: https://ideone.com/qg0zVC
: I wondered whether, in W[i+1,j] <- W[i,j]+eta*sum(((y*(X%*%W[i,]))<1)*1 * y * X[,j] ),
: the sign in front of eta should be a minus, but switching to a minus made it even worse.
: I have checked for quite a while now, so I am asking the experts here.
Found the bug.
The key is that the INITIAL GUESS must be all zeros:
w_intial <- rep(0,p)
Also, these two lines of mine:
abline((1-w_answer[1])/w_answer[3],-w_answer[2]/w_answer[3],lty=2)
abline((-1-w_answer[1])/w_answer[3],-w_answer[2]/w_answer[3],lty=2)
needed the parentheses added.
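For reference, a minimal sketch of why those parentheses matter: in R, `/` binds more tightly than binary `-`, so `1 - w_answer[1]/w_answer[3]` is parsed as `1 - (w_answer[1]/w_answer[3])`, not as the intended margin intercept `(1 - w_answer[1])/w_answer[3]`. Toy numbers below, not the real weights:

```r
# R operator precedence: "/" evaluates before binary "-"
w1 <- 2
w3 <- 4
1 - w1 / w3     # parsed as 1 - (w1/w3): gives 0.5, the buggy intercept
(1 - w1) / w3   # the intended form: gives -0.25
```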
svm_gradient <- function(x, eta = 0.001, R = 10000){
  X <- cbind(1, x)        # design matrix with a bias column
  n <- nrow(X)            # number of samples
  p <- ncol(X)            # number of features + 1 (bias)
  w_intial <- rep(0, p)   # the initial guess must be all zeros
  W <- matrix(w_intial, nrow = R + 1, ncol = p, byrow = T)  # row i holds the i-th iterate
  # gradient descent on the hinge loss; note that y is read from the global environment
  for (i in 1:R){
    for (j in 1:p){
      W[i + 1, j] <- W[i, j] + eta * sum(((y * (X %*% W[i, ])) < 1) * 1 * y * X[, j])
    }
  }
  return(W)
}
getsvm <- function(x){
  W <- svm_gradient(x)       # train once instead of calling svm_gradient twice
  w_answer <- W[nrow(W), ]   # the last iterate is the answer
  return(w_answer)
}
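For what it's worth, the inner j-loop above is a subgradient descent step on the hinge loss L(w) = sum_i max(0, 1 - y_i * x_i'w): points with margin below 1 contribute -y_i * x_i to the gradient, so the descent step adds eta * y_i * x_i for each violator. A vectorized sketch of the same update (helper names here are my own, not from the original code):

```r
# Hinge loss that the j-loop is descending
hinge_loss <- function(X, y, w) sum(pmax(0, 1 - y * (X %*% w)))

# One vectorized subgradient step, equivalent to one pass of the inner j-loop
hinge_step <- function(X, y, w, eta = 0.001) {
  viol <- as.numeric(y * (X %*% w) < 1)      # indicator of margin violators
  w + eta * as.vector(t(X) %*% (viol * y))   # add eta * y_i * x_i over violators
}
```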
set.seed(3)
n <- 5
a1 <- rnorm(n)
a2 <- 1 - a1 + 2 * runif(n)   # positive class
b1 <- rnorm(n)
b2 <- -1 - b1 - 2 * runif(n)  # negative class
x <- rbind(matrix(cbind(a1, a2), , 2), matrix(cbind(b1, b2), , 2))
y <- matrix(c(rep(1, n), rep(-1, n)))
plot(x, col = ifelse(y > 0, 4, 2), pch = ".", cex = 3, xlab = "x1", ylab = "x2")
w_answer <- getsvm(x)
abline(-w_answer[1] / w_answer[3], -w_answer[2] / w_answer[3])                  # decision boundary
abline((1 - w_answer[1]) / w_answer[3], -w_answer[2] / w_answer[3], lty = 2)    # margin w'x = 1
abline((-1 - w_answer[1]) / w_answer[3], -w_answer[2] / w_answer[3], lty = 2)   # margin w'x = -1
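As a sanity check, here is a self-contained sketch (same seed, same eta and R, but a vectorized version of the update; the variable names are my own): the functional margins y_i * (w'x_i) should all end up at or above 1 once training has converged, with the support vectors sitting closest to 1.

```r
set.seed(3)                                  # same data as the script above
n <- 5
a1 <- rnorm(n); a2 <- 1 - a1 + 2 * runif(n)
b1 <- rnorm(n); b2 <- -1 - b1 - 2 * runif(n)
X <- cbind(1, rbind(cbind(a1, a2), cbind(b1, b2)))   # design matrix with bias column
y <- c(rep(1, n), rep(-1, n))

w <- rep(0, 3)                               # all-zero initial guess
for (i in 1:10000) {                         # eta = 0.001, R = 10000 as in svm_gradient
  viol <- as.numeric(y * (X %*% w) < 1)      # margin violators
  w <- w + 0.001 * as.vector(t(X) %*% (viol * y))
}

margins <- as.vector(y * (X %*% w))          # functional margins of all 10 points
min(margins)                                 # support vectors should sit near 1
```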
