Table of Contents
Regression (Output a scalar)
The Three Steps of Machine Learning
Step 1: Model
Step 2: Goodness of Function
Step 3: Best Function
Gradient Descent
Tip 1: Tuning your learning rates
Adaptive Learning Rates
- Adagrad
Tip 2: Stochastic Gradient Descent (make training faster)
Tip 3: Feature Scaling
Overfitting
Overfitting means performing well on the training set but poorly on the test set.
If the model performs poorly even on the training set, that is underfitting.
Get the training set right first!
Regularization
Bias and Variance
Cross Validation
N-fold Cross Validation
Classification
Probabilistic Generative Model
Bayes' Theorem
Sigmoid Activation Function
Logistic Regression
Step 1: Function Set
Step 2: Goodness of a Function
Cross Entropy (a type of loss function)
Step 3: Find the best function
Cross Entropy vs. Square Error
Multi-class Classification
Softmax
Limitation of Logistic Regression