Machine Learning Week 3: Classification

17/11/03 MWhite's learning notes

1. Classification

Linear regression: y ∈ R
Classification (logistic regression): y ∈ {0, 1, ...}

1.1 Hypothesis Representation
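For reference, the logistic regression hypothesis applies the sigmoid (logistic) function g to the linear score θᵀx, so the output lies between 0 and 1 and can be read as P(y = 1 | x; θ):

h_\theta(x) = g(\theta^T x), \qquad g(z) = \frac{1}{1 + e^{-z}}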

1.2 Logistic regression's Cost Function
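With the sigmoid hypothesis, the squared-error cost would make J(θ) non-convex, so a log-based cost is used for each training example instead:

\mathrm{Cost}(h_\theta(x), y) =
\begin{cases}
-\log(h_\theta(x)) & \text{if } y = 1 \\
-\log(1 - h_\theta(x)) & \text{if } y = 0
\end{cases}

J(\theta) = \frac{1}{m} \sum_{i=1}^{m} \mathrm{Cost}\left(h_\theta(x^{(i)}), y^{(i)}\right)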


Simplified Cost Function
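Because y is always either 0 or 1, the two cases collapse into a single expression:

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + \left(1 - y^{(i)}\right) \log\left(1 - h_\theta(x^{(i)})\right) \right]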



1.3 Logistic regression's Gradient Descent
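Repeat, updating all θ_j simultaneously; the rule has the same form as for linear regression, but h_θ is now the sigmoid hypothesis:

\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}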


Vectorized implementation:
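In matrix form, with X the m×(n+1) design matrix and \vec{y} the label vector:

\theta := \theta - \frac{\alpha}{m} X^T \left( g(X\theta) - \vec{y} \right)

A minimal Octave sketch of one update step, assuming X, y, theta, alpha, and m are already defined in the workspace:

g = @(z) 1 ./ (1 + exp(-z));                            % sigmoid, applied elementwise
theta = theta - (alpha / m) * X' * (g(X * theta) - y);  % one simultaneous update of all theta_j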


1.4 Advanced Optimization

Octave/MATLAB library function: fminunc() (unconstrained minimization). Instead of hand-coding gradient descent, we supply a function that returns J(θ) and its gradient, and let fminunc choose the steps:

% costFunction returns both the cost J(theta) and its gradient,
% which fminunc uses when 'GradObj' is set to 'on'
function [jVal, gradient] = costFunction(theta)
  jVal = [...code to compute J(theta)...];
  gradient = [...code to compute derivative of J(theta)...];
end

% supply the gradient ourselves and run at most 100 iterations
options = optimset('GradObj', 'on', 'MaxIter', 100);
initialTheta = zeros(2,1);
[optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options);
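As a concrete toy check of this calling convention (not part of the logistic regression model), a cost J(θ) = (θ₁ − 5)² + (θ₂ − 5)² could be plugged in; fminunc should then drive optTheta toward [5; 5]:

function [jVal, gradient] = costFunction(theta)
  % toy cost with its minimum at theta = [5; 5]
  jVal = (theta(1) - 5)^2 + (theta(2) - 5)^2;
  gradient = zeros(2, 1);
  gradient(1) = 2 * (theta(1) - 5);
  gradient(2) = 2 * (theta(2) - 5);
end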

2. Multiclass Classification

One-vs-all
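Train one binary logistic regression classifier h_\theta^{(i)}(x) = P(y = i \mid x; \theta) for each class i, treating class i as positive and all other classes as negative; to classify a new x, pick the class whose classifier is most confident:

y = \arg\max_i \, h_\theta^{(i)}(x)

A minimal Octave sketch of the prediction step, assuming all_theta stores one row of learned parameters per class and X already includes the bias column of ones:

probs = 1 ./ (1 + exp(-X * all_theta'));   % m x K matrix: column i holds h_theta_i(x)
[~, predictions] = max(probs, [], 2);      % index of the most confident classifier per example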



3. Overfitting

A hypothesis with too many features can fit the training set very closely yet fail to generalize to new examples (overfitting / high variance). Regularization addresses this by penalizing large parameter values. Note that the regularization term skips θ0: the penalty sum runs over j = 1, …, n, so the intercept term is not shrunk.

3.1 Regularized Linear Regression

  • Cost Function
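The squared-error cost plus a penalty on the parameter magnitudes (λ is the regularization parameter; the penalty sum starts at j = 1, leaving θ0 unregularized):

J(\theta) = \frac{1}{2m} \left[ \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2 + \lambda \sum_{j=1}^{n} \theta_j^2 \right]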


  • Gradient descent
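θ0 keeps the unregularized update; every other θ_j gains an extra (λ/m)θ_j term, which is the same as shrinking θ_j by the factor (1 − αλ/m) before the usual update:

\theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_0^{(i)}

\theta_j := \theta_j \left( 1 - \alpha \frac{\lambda}{m} \right) - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}, \qquad j = 1, \dots, n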


  • Normal Equation
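The closed-form solution gains a λL term, where L is the (n+1)×(n+1) identity matrix with its top-left entry set to 0 (again leaving θ0 unpenalized); this also addresses the non-invertibility that can arise when m ≤ n:

\theta = \left( X^T X + \lambda L \right)^{-1} X^T y, \qquad
L = \begin{bmatrix} 0 & & & \\ & 1 & & \\ & & \ddots & \\ & & & 1 \end{bmatrix}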


3.2 Regularized Logistic Regression

  • Cost Function
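The simplified logistic cost from section 1.2 plus the same penalty term (again skipping θ0):

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + \left(1 - y^{(i)}\right) \log\left(1 - h_\theta(x^{(i)})\right) \right] + \frac{\lambda}{2m} \sum_{j=1}^{n} \theta_j^2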


  • Gradient descent
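The update rule looks identical to regularized linear regression, except that h_\theta(x) = g(\theta^T x) is now the sigmoid hypothesis:

\theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_0^{(i)}

\theta_j := \theta_j - \alpha \left[ \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)} + \frac{\lambda}{m} \theta_j \right], \qquad j = 1, \dots, n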

