Abstract

  1. Define a loss function that quantifies our unhappiness with the scores across the training data.
  2. Come up with a way of efficiently finding the parameters that minimize the loss function (optimization).

1 Loss Function

  • Some thoughts on loss functions, summarized from CS231n 2017 Spring Lecture 3: Loss Function and Optimization.

1.1 SVM Loss

A characteristic of the SVM loss: it only cares whether the correct class score exceeds each incorrect class score by a fixed margin; once that margin is met, the exact score values do not matter. The SVM loss takes the form

Li = Σ_{j ≠ yi} max(0, sj − syi + 1)

where s = f(xi, W) are the class scores and syi is the score of the correct class.

* Compute the loss on each example, then take the mean over the training set to get the full training loss.
A question arises here: what if the sum were over all classes (including j = yi)?
In fact nothing would change: the j = yi term contributes max(0, syi − syi + 1) = 1, so every Li just increases by a constant 1. The minimizing W is the same; including the term is only extra work.

Code example:
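The slide's code example can be sketched in NumPy roughly as follows (a half-vectorized per-example loss; the function name and shapes are illustrative, not the course's exact code):

```python
import numpy as np

def svm_loss_i(x, y, W):
    """Multiclass SVM (hinge) loss for one example.

    x: (D,) feature vector, y: correct class index, W: (C, D) weight matrix.
    """
    scores = W.dot(x)                                # class scores s = Wx, shape (C,)
    margins = np.maximum(0, scores - scores[y] + 1)  # hinge margin per class
    margins[y] = 0                                   # skip the j == yi term
    return margins.sum()
```

Note that dropping the `margins[y] = 0` line is exactly the "sum over all classes" variant above: the result is always larger by 1.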

Another point to note: a W that achieves zero loss is not unique (for example, 2W also achieves zero loss), so we need a way to express a preference among weights that fit the training data equally well.

Occam's razor: "Among competing hypotheses, the simplest is the best" (a useful intuition for model selection).

1.2 Regularization

The full loss adds a regularization term that penalizes model complexity: L = (1/N) Σi Li + λ R(W), where λ is the regularization strength. Common choices for R(W) are L2 regularization (Σ W²) and L1 regularization (Σ |W|).
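A minimal NumPy sketch of the full regularized loss, assuming the SVM data loss and an L2 penalty (names and shapes are illustrative):

```python
import numpy as np

def full_loss(X, y, W, lam=1e-3):
    """Mean SVM loss over N examples plus L2 regularization.

    X: (N, D) data, y: (N,) correct labels, W: (C, D) weights, lam: strength.
    """
    N = X.shape[0]
    scores = X.dot(W.T)                               # (N, C) class scores
    correct = scores[np.arange(N), y][:, None]        # (N, 1) correct-class scores
    margins = np.maximum(0, scores - correct + 1)     # hinge margins
    margins[np.arange(N), y] = 0                      # skip j == yi terms
    data_loss = margins.sum() / N                     # (1/N) Σi Li
    reg_loss = lam * np.sum(W * W)                    # λ R(W), R = L2
    return data_loss + reg_loss
```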

1.3 Softmax Classifier Loss

  • Unlike the SVM, the softmax classifier computes a probability for each class and classifies based on those probabilities: P(Y = k | X = xi) = e^{sk} / Σj e^{sj}.

  • Computing the loss: Li = −log P(Y = yi | X = xi), the negative log probability assigned to the correct class.

    Comparing the SVM and softmax losses: the SVM loss is zero once every margin is satisfied and stops caring, while the softmax loss keeps pushing the correct-class probability toward 1 and is never exactly zero in practice.

  • Summary of the process so far: a dataset of (x, y) pairs, a score function s = f(x; W), and a loss function over the scores (SVM or softmax, plus regularization), which we now need to minimize over W.
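The softmax loss for one example can be sketched in NumPy as below (illustrative names; the score shift by the maximum is a standard trick to avoid overflow in the exponentials and does not change the probabilities):

```python
import numpy as np

def softmax_loss_i(x, y, W):
    """Softmax (cross-entropy) loss for one example: Li = -log P(yi | xi)."""
    scores = W.dot(x)               # unnormalized log-probabilities, shape (C,)
    scores = scores - scores.max()  # shift for numerical stability
    p = np.exp(scores) / np.exp(scores).sum()  # class probabilities
    return -np.log(p[y])
```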

2 Optimization

  1. First, one naive way to find F (also called H, the function implied by the data distribution) is random search over W. This is essentially unusable, especially when the data is high-dimensional.
  2. Follow the slope (gradient descent)
    The gradient can be estimated numerically with finite differences, but that is approximate and slow; in practice the gradient is computed directly with calculus (the analytic gradient).
    The advantage of the numerical gradient is that, while programming, it lets you verify that your analytic gradient is correct.
    In practice: Always use analytic gradient, but check implementation with numerical gradient. This is called a gradient check.

    • SGD example: rather than computing the loss and gradient over the full training set at every step, sample a minibatch of examples and update W using the gradient estimated on that minibatch.
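The gradient check and descent loop above can be sketched on a toy quadratic loss (an illustrative example, not the course code; for f(w) = ||w||², the analytic gradient is 2w):

```python
import numpy as np

def num_grad(f, w, h=1e-5):
    """Centered-difference numerical gradient of f at w."""
    g = np.zeros_like(w)
    for i in range(w.size):
        e = np.zeros_like(w)
        e.flat[i] = h
        g.flat[i] = (f(w + e) - f(w - e)) / (2 * h)  # (f(w+h) - f(w-h)) / 2h
    return g

f = lambda w: np.sum(w ** 2)   # toy loss
grad = lambda w: 2 * w         # its analytic gradient

# Gradient check: numerical and analytic gradients should agree closely.
w = np.array([1.0, -2.0])
assert np.allclose(num_grad(f, w), grad(w), atol=1e-4)

# Vanilla gradient descent: repeatedly step opposite the gradient.
lr = 0.1
for _ in range(100):
    w -= lr * grad(w)
# w approaches the minimizer at 0
```

SGD replaces `grad(w)` with a gradient computed on a random minibatch of the data at each iteration.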

Deal With Image Features

  • Feeding raw RGB pixel values directly into a linear classifier works poorly, since classes are rarely linearly separable in pixel space.
    Some methods for extracting features before classification: color histograms, histograms of oriented gradients (HOG), and bag-of-words representations built from image patches. The resulting feature vectors are then fed to a linear classifier.
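The simplest of these, a color histogram, can be sketched as follows (a minimal illustration; bin count and normalization are arbitrary choices here):

```python
import numpy as np

def color_histogram(img, bins=8):
    """Concatenate per-channel intensity histograms into one feature vector.

    img: (H, W, 3) uint8 RGB image -> (3 * bins,) normalized feature vector.
    """
    feats = [np.histogram(img[..., c], bins=bins, range=(0, 256))[0]
             for c in range(3)]               # one histogram per R, G, B channel
    f = np.concatenate(feats).astype(float)
    return f / f.sum()                        # normalize so the features sum to 1
```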

Finally, compare this pipeline with ConvNets: instead of extracting hand-designed features first and training a classifier on top, a ConvNet learns the features and the classifier jointly from the raw pixels, training the whole system end to end.
