Programming Exercise 2: Logistic Regression

Stanford's open Machine Learning course, Programming Exercise 2: Logistic Regression, implemented in MATLAB.

1 Logistic Regression

In this part of the exercise, I will build a logistic regression model to predict whether a student gets admitted into a university.

The goal is to determine each applicant's chance of admission based on their scores on two exams.

1.1 Visualizing the data

function plotData(X, y)
%PLOTDATA Plots the data points X and y into a new figure
%   PLOTDATA(x,y) plots the data points with + for the positive examples
%   and o for the negative examples. X is assumed to be an Mx2 matrix.

% Create New Figure
figure; hold on;

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
%               2D plot, using the option 'k+' for the positive
%               examples and 'ko' for the negative examples.

% Find indices of positive and negative examples
pos = find(y == 1); neg = find(y == 0);   % index vectors of the 1/0 labels

% Plot examples
plot(X(pos, 1), X(pos, 2), 'k+', 'LineWidth', 2, ...
     'MarkerSize', 7);
plot(X(neg, 1), X(neg, 2), 'ko', 'MarkerFaceColor', 'y', ...
     'MarkerSize', 7);

% =========================================================================

hold off;

end
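For readers following along outside MATLAB, the same positive/negative split can be sketched in NumPy. The exam-score values below are made up for illustration, and the commented matplotlib calls only roughly mirror the MATLAB `plot` options:

```python
import numpy as np

# Hypothetical exam scores (two features per row) and 0/1 admission labels.
X = np.array([[34.6, 78.0],
              [30.3, 43.9],
              [60.2, 86.3],
              [79.0, 75.3]])
y = np.array([0, 0, 1, 1])

# Boolean masks play the role of MATLAB's find(y == 1) / find(y == 0).
pos = y == 1
neg = y == 0

# With matplotlib, the MATLAB plot calls would translate roughly to:
#   plt.plot(X[pos, 0], X[pos, 1], 'k+', markersize=7)
#   plt.plot(X[neg, 0], X[neg, 1], 'ko', markerfacecolor='y', markersize=7)
```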

1.1.1 Plotting the decision boundary

plotDecisionBoundary.m uses the θ value to plot the decision boundary.

function plotDecisionBoundary(theta, X, y)
%PLOTDECISIONBOUNDARY Plots the data points X and y into a new figure with
%   the decision boundary defined by theta
%   PLOTDECISIONBOUNDARY(theta, X, y) plots the data points with + for the
%   positive examples and o for the negative examples. X is assumed to be
%   either
%   1) an Mx3 matrix, where the first column is an all-ones column for the
%      intercept.
%   2) an MxN matrix, N > 3, where the first column is all-ones

% Plot Data
plotData(X(:,2:3), y);
hold on

if size(X, 2) <= 3
    % Only need 2 points to define a line, so choose two endpoints:
    % the min/max of the first feature, padded by -2/+2.
    plot_x = [min(X(:,2))-2, max(X(:,2))+2];

    % Calculate the decision boundary line. The boundary is where
    % theta'*x = 0 (i.e. the predicted probability is p = 0.5), so
    % x2 = -(theta(1) + theta(2)*x1) / theta(3).
    plot_y = (-1./theta(3)).*(theta(2).*plot_x + theta(1));

    % Plot, and adjust axes for better viewing
    plot(plot_x, plot_y)

    % Legend, specific for the exercise
    legend('Admitted', 'Not admitted', 'Decision Boundary')
    axis([30, 100, 30, 100])
else
    % Here is the grid range: 50 evenly spaced points on [-1, 1.5]
    u = linspace(-1, 1.5, 50);
    v = linspace(-1, 1.5, 50);

    z = zeros(length(u), length(v));
    % Evaluate z = theta*x over the grid
    for i = 1:length(u)
        for j = 1:length(v)
            z(i,j) = mapFeature(u(i), v(j))*theta;
        end
    end
    z = z'; % important to transpose z before calling contour

    % Plot z = 0
    % Notice you need to specify the level [0, 0] to draw only the
    % boundary contour
    contour(u, v, z, [0, 0], 'LineWidth', 2)
end
hold off

end

% MAPFEATURE Feature mapping function to polynomial features
%   MAPFEATURE(X1, X2) maps the two input features to quadratic features
%   used in the regularization exercise. Returns a new feature array with
%   more features, comprising X1, X2, X1.^2, X2.^2, X1*X2, X1*X2.^2, etc.
%   Inputs X1, X2 must be the same size.
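The linear branch relies on the boundary being the set of points where θᵀx = 0. A small NumPy sketch (the θ values below are illustrative, not the exercise's fitted parameters) confirms that every point on the drawn line satisfies that equation:

```python
import numpy as np

# Illustrative parameters [theta0, theta1, theta2]; not the fitted values.
theta = np.array([-25.0, 0.21, 0.20])

def boundary_x2(x1, theta):
    """Solve theta0 + theta1*x1 + theta2*x2 = 0 for x2 (theta2 must be nonzero)."""
    return -(theta[0] + theta[1] * x1) / theta[2]

plot_x = np.array([30.0, 100.0])      # two endpoints are enough for a line
plot_y = boundary_x2(plot_x, theta)

# Every (x1, x2) on this line satisfies theta . [1, x1, x2] = 0, i.e. the
# predicted probability there is exactly sigmoid(0) = 0.5.
```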

1.2 Implementation

1.2.1 Warmup exercise: sigmoid function

The logistic regression hypothesis is h_theta(x) = g(theta' * x), where g is the sigmoid function:

    g(z) = 1 / (1 + e^(-z))

For a matrix, the SIGMOID function applies the sigmoid to every element.

function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1 ./ (1 + exp(-z));   % exp works element-wise; note MATLAB does not
                          % define a constant e, so use exp(-z), not e.^(-z)

% =============================================================

end
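As a quick cross-check outside MATLAB, the same element-wise sigmoid in NumPy (my translation, not part of the exercise code):

```python
import numpy as np

def sigmoid(z):
    """g(z) = 1 / (1 + e^(-z)). np.exp is element-wise, so z may be a
    scalar, vector, or matrix."""
    return 1.0 / (1.0 + np.exp(-z))

# sigmoid(0) = 0.5; large positive z approaches 1, large negative z approaches 0.
g = sigmoid(np.array([-10.0, 0.0, 10.0]))
```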

1.2.2 Cost function and gradient

Cost:

    J(theta) = (1/m) * sum_{i=1..m} [ -y(i)*log(h_theta(x(i))) - (1 - y(i))*log(1 - h_theta(x(i))) ]

Gradient (for each parameter theta_j):

    dJ/d(theta_j) = (1/m) * sum_{i=1..m} (h_theta(x(i)) - y(i)) * x_j(i)
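The two formulas above vectorize into a few lines. A NumPy sketch (the data below is hypothetical; the exercise's costFunction.m returns the same J/gradient pair):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cost_function(theta, X, y):
    """Unregularized logistic-regression cost and gradient.
    X is m x (n+1) with a leading column of ones; y holds 0/1 labels."""
    m = len(y)
    h = sigmoid(X @ theta)                    # h_theta(x) for every example
    J = -(y @ np.log(h) + (1 - y) @ np.log(1 - h)) / m
    grad = X.T @ (h - y) / m                  # one partial derivative per parameter
    return J, grad

# Hypothetical data with an intercept column of ones.
X = np.array([[1.0, 1.0, 1.0],
              [1.0, 2.0, 0.0],
              [1.0, 0.0, 2.0],
              [1.0, 3.0, 3.0]])
y = np.array([1.0, 0.0, 1.0, 1.0])

J, grad = cost_function(np.zeros(3), X, y)
# With theta = 0, every h_theta(x) is 0.5, so J = log(2) ≈ 0.693.
```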
