# Summary

This week covers linear regression with multiple variables. It shows how linear regression can be extended to accommodate multiple input features, and it also discusses best practices for implementing linear regression.

## Exercises

## plotData.m

### coding

```matlab
function plotData(X, y)

% Create New Figure
figure; hold on;

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the positive and negative examples on a
%               2D plot, using the option 'k+' for the positive
%               examples and 'ko' for the negative examples.
%
pos = find(y == 1); neg = find(y == 0);
plot(X(pos,1), X(pos,2), 'k+', 'LineWidth', 2, 'MarkerSize', 7);
plot(X(neg,1), X(neg,2), 'ko', 'MarkerFaceColor', 'y', 'MarkerSize', 7);

end
```

## 1) Sigmoid function

### formula

`h(x) = g(theta' * x)`, where `g(z) = 1 / (1 + exp(-z))`

### coding

```matlab
function g = sigmoid(z)
```
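One possible completion of this stub (a minimal sketch, not the official solution), assuming `z` may be a scalar, vector, or matrix:

```matlab
function g = sigmoid(z)
% SIGMOID Compute the sigmoid function element-wise.
%   Using ./ and exp keeps this valid for scalars, vectors, and matrices.
g = 1 ./ (1 + exp(-z));
end
```

Because `exp` and `./` operate element-wise, the same code serves the vectorized cost computations below without modification.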

## 2) Compute cost and Gradient for logistic regression

### formula

`J(theta) = (1/m) * sum[ -y * log(h(x)) - (1 - y) * log(1 - h(x)) ]`

`grad_j = (1/m) * sum[ (h(x) - y) * x_j ]`

### coding

```matlab
function [J, grad] = costFunction(theta, X, y)
```
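A vectorized sketch of how this stub can be completed, assuming `X` is the m-by-(n+1) design matrix with a leading column of ones and `sigmoid` is defined as above:

```matlab
function [J, grad] = costFunction(theta, X, y)
% COSTFUNCTION Cost and gradient for (unregularized) logistic regression.
m = length(y);             % number of training examples
h = sigmoid(X * theta);    % hypothesis for all examples at once

% Cross-entropy cost, averaged over the training set
J = (1/m) * sum(-y .* log(h) - (1 - y) .* log(1 - h));

% Gradient: X' * (h - y) sums (h(x) - y) * x_j over all examples
grad = (1/m) * (X' * (h - y));
end
```

Returning both `J` and `grad` lets an optimizer such as `fminunc` drive the minimization without a hand-written gradient-descent loop.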

### result

## 3) Predict function

### coding

```matlab
function p = predict(theta, X)
```
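A minimal sketch of the prediction step, assuming the usual 0.5 decision threshold on the sigmoid output:

```matlab
function p = predict(theta, X)
% PREDICT Label each example 1 when h(x) >= 0.5, else 0.
%   The comparison yields a logical vector of 0s and 1s directly.
p = sigmoid(X * theta) >= 0.5;
end
```

Since `g(z) >= 0.5` exactly when `z >= 0`, the comparison `X * theta >= 0` would give the same labels without evaluating the sigmoid.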

## 4) Compute cost and Gradient for regularized LR

### formula

`J(theta) = (1/m) * sum[ -y * log(h(x)) - (1 - y) * log(1 - h(x)) ] + (lambda / (2m)) * sum_{j=1..n} theta_j^2`

Note that the regularization sum starts at `j = 1`, so the bias term `theta_0` is not penalized.

### coding

```matlab
function [J, grad] = costFunctionReg(theta, X, y, lambda)
```
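A sketch of the regularized version, building on the unregularized cost above; zeroing out the first entry of `theta` is one common way to leave `theta_0` unpenalized:

```matlab
function [J, grad] = costFunctionReg(theta, X, y, lambda)
% COSTFUNCTIONREG Regularized logistic regression cost and gradient.
m = length(y);
h = sigmoid(X * theta);

% Copy of theta with the bias entry zeroed, so theta_0 is not regularized
theta_reg = [0; theta(2:end)];

J = (1/m) * sum(-y .* log(h) - (1 - y) .* log(1 - h)) ...
    + (lambda / (2*m)) * sum(theta_reg .^ 2);

grad = (1/m) * (X' * (h - y)) + (lambda/m) * theta_reg;
end
```

With `lambda = 0` this reduces exactly to `costFunction`, which is a quick sanity check when testing.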