Entry | Gradient Descent Algorithm |
Definition | A MATLAB demonstration of gradient-based optimization (the original entry mislabels it "VB"; the code is MATLAB). Note that the routine actually performs gradient ascent, stepping in the direction of the gradient to climb toward a maximum; negating the step turns it into gradient descent. The gradient is estimated by forward finite differences. The original used MATLAB's built-in eps (about 2.2e-16) as the difference step, which is too small for double precision and usually yields a zero gradient; a step of 1e-6 is used instead.

function grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
% Gradient ascent on the surface z = func(x,y), starting from (xstart, ystart).
% Plots the contour of z, the gradient field (px,py), and the ascent path.
delta = 1e-6;                       % finite-difference step (replaces eps)
xga(1) = xstart;
yga(1) = ystart;
zga(1) = func(xga(1), yga(1));
for i = 1:N
    gradx = ( func(xga(i)+delta, yga(i)) - func(xga(i), yga(i)) ) / delta;
    grady = ( func(xga(i), yga(i)+delta) - func(xga(i), yga(i)) ) / delta;
    xga(i+1) = xga(i) + mu*gradx;   % '+' climbs toward a maximum;
    yga(i+1) = yga(i) + mu*grady;   % use '-' here for gradient descent
    zga(i+1) = func(xga(i+1), yga(i+1));
end
hold off
contour(x, y, z, 10)
hold on
quiver(x, y, px, py)
plot(xga, yga)
S = sprintf('Gradient Ascent: N = %d, Step Size = %f', N, mu);
title(S)
xlabel('x axis')
ylabel('y axis')

% Demo script: plot the surface, then run gradient ascent with three step
% sizes. Requires an objective func(x,y) defined separately (e.g. in func.m).
clear
print_flag = 1;
width = 1.5;
xord = -width:.15:width;
yord = -width:.15:width;
[x, y] = meshgrid(xord, yord);
z = func(x, y);
hold off
surfl(x, y, z)
xlabel('x axis')
ylabel('y axis')
if print_flag, print, else, input('Continue?'), end
[px, py] = gradient(z, .2, .2);    % numerical gradient field for the quiver plot
xstart = 0.9*width;
ystart = -0.3*width;
N = 100; mu = 0.02;
grad_ascent(x, y, z, px, py, N, mu, xstart, ystart)
if print_flag, print, else, input('Continue?'), end
N = 100; mu = 0.06;
grad_ascent(x, y, z, px, py, N, mu, xstart, ystart)
if print_flag, print, else, input('Continue?'), end
N = 100; mu = 0.18;
grad_ascent(x, y, z, px, py, N, mu, xstart, ystart)
if print_flag, print, else, input('Continue?'), end
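For readers without MATLAB, the same finite-difference ascent step can be sketched in Python. This is a minimal translation, not the entry's original code: the plotting is omitted, and the objective f below (a Gaussian bump with its maximum at the origin) is an assumed stand-in for func(x, y), which the entry never defines.

```python
import math

def f(x, y):
    # Assumed example objective with a single maximum at (0, 0);
    # stands in for the entry's undefined func(x, y).
    return math.exp(-(x**2 + y**2))

def grad_ascent(f, x, y, n_steps=100, mu=0.02, delta=1e-6):
    """Gradient ascent via forward finite differences, mirroring the
    MATLAB grad_ascent routine above (plotting omitted)."""
    path = [(x, y)]
    for _ in range(n_steps):
        gx = (f(x + delta, y) - f(x, y)) / delta
        gy = (f(x, y + delta) - f(x, y)) / delta
        x += mu * gx   # '+' climbs toward a maximum; use '-' for descent
        y += mu * gy
        path.append((x, y))
    return path

# Same starting point as the MATLAB demo: (0.9*width, -0.3*width), width = 1.5
path = grad_ascent(f, 0.9 * 1.5, -0.3 * 1.5)
print("start:", path[0], "end:", path[-1])
```

With the small step size mu = 0.02 the iterate moves slowly but steadily toward the maximum; the demo's larger values (0.06, 0.18) trade stability for speed, which is exactly what the three MATLAB runs are meant to illustrate.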