Matlab Implementation of Linear Regression and Logistic Regression

function PlotFunc( xstart, xend )
%PLOTFUNC Summary of this function goes here
%   Draw the original data and the fitted line
%===================cost function 2==== linear regression

% original data
x1 = [1; 2; 3; 4];
y1 = [1.1; 2.2; 2.7; 3.8];
%plot(x1, y1, 'ro-', 'MarkerSize', 10);
plot(x1, y1, 'rx', 'MarkerSize', 10);
hold on;

% fitted line
x_co = xstart:0.1:xend;
y_co = 0.3 + 0.86 * x_co;
%plot(x_co, y_co, 'g');
plot(x_co, y_co);
hold off;
end
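
The fitted line y = 0.3 + 0.86*x is the least-squares fit to the four data points above. The cost function that produces these coefficients is not listed in this section; the following is a minimal sketch of what it could look like, mirroring the structure of costFunction3 further down (the name costFunction2 and its comments are assumptions, not the original code):

function [ jVal, gradient ] = costFunction2( theta )
%COSTFUNCTION2 Squared-error cost for the linear model y = theta(1) + theta(2)*x
%   (assumed implementation, patterned after costFunction3 below)
x = [1; 2; 3; 4];
y = [1.1; 2.2; 2.7; 3.8];
m = size(x, 1);

hypothesis = theta(1) + theta(2) * x;           % m*1 predictions

jVal = sum((hypothesis - y).^2) / (2 * m);      % halved mean squared error

gradient = zeros(2, 1);
gradient(1) = sum(hypothesis - y) / m;          % derivative w.r.t. theta(1)
gradient(2) = sum((hypothesis - y) .* x) / m;   % derivative w.r.t. theta(2)
end

Minimizing this with the same fminunc call shown below should return roughly (theta1, theta2) = (0.3, 0.86), which is the line drawn in PlotFunc.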

function [ jVal, gradient ] = costFunction( theta )
%COSTFUNCTION Summary of this function goes here
%   Detailed explanation goes here
jVal = (theta(1)-5)^2 + (theta(2)-5)^2;

gradient = zeros(2,1);
% code to compute the derivative with respect to theta
gradient(1) = 2 * (theta(1)-5);
gradient(2) = 2 * (theta(2)-5);
end

Gradient_descent then performs the parameter optimization:

function [optTheta, functionVal, exitFlag] = Gradient_descent( )
%GRADIENT_DESCENT Summary of this function goes here
%   Detailed explanation goes here
    options = optimset('GradObj', 'on', 'MaxIter', 100);
    initialTheta = zeros(2,1)
    [optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options);
end

Calling it from the MATLAB command window yields the optimized parameters (θ1, θ2) = (5, 5):

[optTheta, functionVal, exitFlag] = Gradient_descent()

initialTheta =
     0
     0

Local minimum found.

Optimization completed because the size of the gradient is less than
the default value of the function tolerance.

<stopping criteria details>

optTheta =
     5
     5

functionVal =
     0

exitFlag =
     1
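
As a sanity check, evaluating the cost function at the returned optimum should give zero cost and a zero gradient:

[jVal, gradient] = costFunction([5; 5])
% expected: jVal = 0 and gradient = [0; 0], since (5,5) minimizes (theta1-5)^2 + (theta2-5)^2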

Part 4: the Y = 1/(1+e^(-X)) model, i.e. logistic regression (sigmoid function fitting)

hypothesis function:

function [res] = h_func(inputx, theta)
%cost function 3
tmp = theta(1) + theta(2) * inputx;  % m*1
res = 1 ./ (1 + exp(-tmp));          % m*1
end
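
A quick spot check of the hypothesis output (the theta values here are arbitrary, chosen only to show the sigmoid shape):

theta = [0; 1];               % tmp then equals inputx
h_func([-2; 0; 2], theta)     % roughly [0.1192; 0.5000; 0.8808]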

cost function:

function [ jVal, gradient ] = costFunction3( theta )
%COSTFUNCTION3 Summary of this function goes here
%   Logistic Regression
x = [-3; -2; -1; 0; 1; 2; 3];
y = [0.01; 0.05; 0.3; 0.45; 0.8; 1.1; 0.99];
m = size(x, 1);

% hypothesis values
hypothesis = h_func(x, theta);

% jVal - cost function & gradient updating
% (+0.01 inside the logs keeps them finite when the hypothesis reaches 0 or 1)
jVal = -sum(log(hypothesis + 0.01) .* y + (1-y) .* log(1 - hypothesis + 0.01)) / m;
gradient(1) = sum(hypothesis - y) / m;         % gradient with respect to theta(1)
gradient(2) = sum((hypothesis - y) .* x) / m;  % gradient with respect to theta(2)
end

Gradient_descent:

function [optTheta, functionVal, exitFlag] = Gradient_descent( )
    options = optimset('GradObj', 'on', 'MaxIter', 100);
    initialTheta = [0; 0];
    [optTheta, functionVal, exitFlag] = fminunc(@costFunction3, initialTheta, options);
end

Output:

[optTheta, functionVal, exitFlag] = Gradient_descent()

Local minimum found.

Optimization completed because the size of the gradient is less than
the default value of the function tolerance.

<stopping criteria details>

optTheta =
    0.3526
    1.7573

functionVal =
    0.2498

exitFlag =
    1

Plot the result to verify the fit:

function PlotFunc( xstart, xend )
%PLOTFUNC Summary of this function goes here
%   Draw the original data and the fitted curve
%===================cost function 3===== logistic regression

% original data
x = [-3; -2; -1; 0; 1; 2; 3];
y = [0.01; 0.05; 0.3; 0.45; 0.8; 1.1; 0.99];
plot(x, y, 'rx', 'MarkerSize', 10);
hold on

% fitted curve
x_co = xstart:0.1:xend;
theta = [0.3526, 1.7573];
y_co = h_func(x_co, theta);
plot(x_co, y_co);
hold off
end
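
To reproduce the figure, a call such as the following should work (the range matches the training inputs; the printed values are approximate):

PlotFunc(-3, 3);

% spot-check the fitted hypothesis at a few training inputs
theta = [0.3526; 1.7573];
h_func([-3; 0; 3], theta)   % roughly [0.0073; 0.5873; 0.9964]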

