Stanford Machine Learning, Week 2 Assignment: Linear Regression

Plotting the Data

data = load('ex1data1.txt');       % read comma separated data
X = data(:, 1); y = data(:, 2);
m = length(y);                     % number of training examples
plot(X, y, 'rx', 'MarkerSize', 10); % plot the data: 'rx' draws red crosses, 'MarkerSize' = 10 sets their size

ylabel('Profit in $10,000s');            % Set the y-axis label
xlabel('Population of City in 10,000s'); % Set the x-axis label

Computing the cost J(θ)

m = length(y);                     % number of training examples
T = (X * theta - y) .^ 2;          % per-example squared error
J = 1 / (2 * m) * sum(T);          % J(theta) = 1/(2m) * sum of squared errors
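As a quick sanity check, this code can be wrapped in a computeCost(X, y, theta) function (the name used by the ex1 starter files) and evaluated at theta = [0; 0]; the hand-out quotes a cost of roughly 32.07 on ex1data1.txt. A minimal sketch, assuming that function is on the path:

data = load('ex1data1.txt');
y = data(:, 2);
X = [ones(length(y), 1), data(:, 1)];  % prepend the intercept column of ones
J = computeCost(X, y, zeros(2, 1))     % the ex1 hand-out reports J of about 32.07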

Gradient descent

t1 = theta(1) - alpha / m * sum( X * theta - y);            % partial derivative w.r.t. theta(1); X(:,1) is all ones
t2 = theta(2) - alpha / m * sum((X * theta - y) .* X(:,2)); % partial derivative w.r.t. theta(2)
theta = [t1; t2];                                           % simultaneous update via temporaries
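In the assignment this update sits inside gradientDescent(X, y, theta, alpha, num_iters). A minimal driver loop, assuming X, y, and m from above and using the learning rate and iteration count from the ex1 script:

theta = zeros(2, 1);               % start from the zero vector
alpha = 0.01;                      % learning rate used by ex1.m
num_iters = 1500;                  % iteration count used by ex1.m
for iter = 1:num_iters
    t1 = theta(1) - alpha / m * sum( X * theta - y);
    t2 = theta(2) - alpha / m * sum((X * theta - y) .* X(:,2));
    theta = [t1; t2];
end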

Feature Normalization

n = size(X_norm, 2);                    % number of features (X_norm starts as a copy of X)
for i = 1:n
    mu(i) = mean(X_norm(:, i));         % per-feature mean
    sigma(i) = std(X_norm(:, i));       % per-feature standard deviation
    X_norm(:, i) = (X_norm(:, i) - mu(i)) / sigma(i);
end
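The same normalization can be written without the loop. A sketch using implicit broadcasting (Octave, or MATLAB R2016b and later):

mu = mean(X_norm);                      % 1-by-n row vector of column means
sigma = std(X_norm);                    % 1-by-n row vector of column standard deviations
X_norm = (X_norm - mu) ./ sigma;        % broadcasts the row vectors across all rows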

Gradient Descent (multiple variables)

m = length(y);                   % number of training examples
J_history = zeros(num_iters, 1); % cost recorded after each iteration
n = size(X, 2);                  % number of features, including the intercept column
tmp = zeros(n, 1);               % temporaries so all parameters update simultaneously

for iter = 1:num_iters
    for i = 1:n
        tmp(i) = theta(i) - alpha / m * (X * theta - y)' * X(:,i);
    end
    theta = tmp;
    J_history(iter) = computeCostMulti(X, y, theta);
end
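The inner loop over features can be collapsed into one vectorized update, since the gradient of J is (1/m) * X' * (X*theta - y); this form also covers the single-variable case:

for iter = 1:num_iters
    theta = theta - alpha / m * X' * (X * theta - y);  % update every parameter at once
    J_history(iter) = computeCostMulti(X, y, theta);
end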

Normal Equations

theta = pinv(X' * X) * X' * y   % closed-form solution: theta = (X'X)^(-1) * X'y
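Unlike gradient descent, the normal equation needs no learning rate, no iterations, and no feature scaling, though inverting X'X gets expensive as the number of features grows. A quick end-to-end sketch on the multi-variable data (the file name follows the ex1 materials):

data = load('ex1data2.txt');
X = [ones(size(data, 1), 1), data(:, 1:2)];  % intercept column plus the two raw features
y = data(:, 3);
theta = pinv(X' * X) * X' * y                % exact solution in one step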