1. Deriving the LR loss function

Chang-Xiao Li Machine Learning 2019 Task5


Loss function:

Given a set of parameters $w$ and $b$, with $f_{w,b}(x)=\sigma(w\cdot x+b)$, the maximum likelihood estimate (the original text's "最大自然估计" is a typo for 最大似然估计) maximizes:

$$L(w,b)=\prod_{n=1}^{N} f_{w,b}(x^n)^{\hat{y}^n}\left(1-f_{w,b}(x^n)\right)^{1-\hat{y}^n}$$

Taking the log (and negating, since maximizing $L$ is equivalent to minimizing $-\ln L$):

$$-\ln L(w,b)=-\sum_{n=1}^{N}\ln\left[f_{w,b}(x^n)^{\hat{y}^n}\left(1-f_{w,b}(x^n)\right)^{1-\hat{y}^n}\right]$$

Rearranging:

$$-\ln L(w,b)=\sum_{n=1}^{N}-\left[\hat{y}^n\ln f_{w,b}(x^n)+(1-\hat{y}^n)\ln\left(1-f_{w,b}(x^n)\right)\right]$$

Each summand is exactly the cross-entropy between two Bernoulli distributions: a target distribution $p$ with $p(1)=\hat{y}^n$, $p(0)=1-\hat{y}^n$, and a model distribution $q$ with $q(1)=f_{w,b}(x^n)$, $q(0)=1-f_{w,b}(x^n)$:

$$H(p,q)=-\sum_{x}p(x)\ln q(x)$$

The LR loss function is therefore:

$$L(f)=\sum_{n=1}^{N} C\left(f(x^n),\hat{y}^n\right),\qquad C\left(f(x^n),\hat{y}^n\right)=-\left[\hat{y}^n\ln f(x^n)+(1-\hat{y}^n)\ln\left(1-f(x^n)\right)\right]$$
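The loss above can be sketched directly in NumPy. This is a minimal illustration, not part of the original notes; the function names `sigmoid` and `lr_loss` are my own:

```python
import numpy as np

def sigmoid(z):
    # f_{w,b}(x) = sigma(w.x + b); clip z to avoid overflow in exp
    return 1.0 / (1.0 + np.exp(-np.clip(z, -500, 500)))

def lr_loss(w, b, X, y, eps=1e-12):
    """Negative log-likelihood (sum of Bernoulli cross-entropies).

    X: (N, d) feature matrix, y: (N,) labels in {0, 1}.
    """
    f = sigmoid(X @ w + b)
    f = np.clip(f, eps, 1 - eps)  # keep log() finite
    return -np.sum(y * np.log(f) + (1 - y) * np.log(1 - f))
```

With $w=0$, $b=0$ every prediction is $0.5$, so the loss is $N\ln 2$ regardless of the labels, which is a convenient sanity check.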

2. LR gradient descent

Differentiating $-\ln L$ with respect to $w_i$, using $\frac{\partial \ln f_{w,b}(x)}{\partial w_i}=\left(1-f_{w,b}(x)\right)x_i$ and $\frac{\partial \ln \left(1-f_{w,b}(x)\right)}{\partial w_i}=-f_{w,b}(x)\,x_i$:

$$\frac{\partial(-\ln L)}{\partial w_i}=\sum_{n=1}^{N}-\left[\hat{y}^n\left(1-f_{w,b}(x^n)\right)-(1-\hat{y}^n)f_{w,b}(x^n)\right]x_i^n=\sum_{n=1}^{N}-\left(\hat{y}^n-f_{w,b}(x^n)\right)x_i^n$$

The LR gradient descent update is therefore:

$$w_i \leftarrow w_i - \eta \sum_{n=1}^{N}-\left(\hat{y}^n-f_{w,b}(x^n)\right)x_i^n$$

with the analogous update for $b$ (replace $x_i^n$ by $1$).

3. Describing gradient descent in code
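The update rule from section 2 can be sketched as a batch gradient descent loop. This is an illustrative implementation under my own naming (`lr_gradient_descent`), not code from the original notes:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -500, 500)))

def lr_gradient_descent(X, y, eta=0.1, steps=5000):
    """Batch gradient descent for logistic regression.

    Implements w_i <- w_i - eta * sum_n -(y^n - f(x^n)) x_i^n
    and the matching bias update.
    """
    N, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(steps):
        f = sigmoid(X @ w + b)          # predictions f_{w,b}(x^n)
        grad_w = -(y - f) @ X           # shape (d,): sum over samples
        grad_b = -np.sum(y - f)
        w -= eta * grad_w
        b -= eta * grad_b
    return w, b
```

On a small linearly separable dataset the learned weights drive the decision boundary between the two classes, so thresholding the sigmoid output at 0.5 recovers the labels.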

4. Softmax principle

For a multi-class problem with $K$ classes, compute a score $z_k = w^k \cdot x + b_k$ for each class $k$, then normalize the scores with softmax:

$$y_k=\frac{e^{z_k}}{\sum_{j=1}^{K}e^{z_j}},\qquad 0<y_k<1,\quad \sum_{k=1}^{K}y_k=1$$
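A minimal sketch of the normalization above; subtracting the maximum score before exponentiating is a standard numerical-stability trick that leaves the result unchanged:

```python
import numpy as np

def softmax(z):
    # shift by max(z) for numerical stability; the output is identical
    # because the shift cancels in numerator and denominator
    e = np.exp(z - np.max(z))
    return e / np.sum(e)
```

The outputs are positive, sum to 1, and preserve the ordering of the input scores, so they can be read as class probabilities.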

5. Softmax loss function

The softmax loss is again the cross-entropy, now between the one-hot target $\hat{y}$ and the softmax output $y$:

$$L=-\sum_{k=1}^{K}\hat{y}_k\ln y_k$$
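Combining the two pieces gives a softmax cross-entropy sketch (my own naming, assuming a one-hot target vector):

```python
import numpy as np

def softmax_cross_entropy(z, y_true, eps=1e-12):
    """Cross-entropy between one-hot target y_true and softmax(z)."""
    e = np.exp(z - np.max(z))   # stable softmax
    y = e / np.sum(e)
    return -np.sum(y_true * np.log(y + eps))
```

With uniform scores over $K$ classes the predicted probability of the true class is $1/K$, so the loss is $\ln K$, another easy sanity check.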

6. Softmax gradient descent

Differentiating the cross-entropy with respect to the scores gives a remarkably simple form:

$$\frac{\partial L}{\partial z_k}=y_k-\hat{y}_k$$

so the gradient descent update for each class weight vector is $w^k \leftarrow w^k - \eta\,(y_k-\hat{y}_k)\,x$, summed over the training examples.
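The simple form $\partial L/\partial z_k = y_k - \hat{y}_k$ can be checked against a central-difference numerical gradient; this verification sketch (all names my own) is one way to convince yourself the derivation is right:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / np.sum(e)

def analytic_grad(z, y_true):
    # dL/dz_k = y_k - yhat_k  for  L = -sum_k yhat_k ln softmax(z)_k
    return softmax(z) - y_true

def numeric_grad(z, y_true, h=1e-6):
    # central differences on the cross-entropy loss, one coordinate at a time
    loss = lambda v: -np.sum(y_true * np.log(softmax(v)))
    g = np.zeros_like(z)
    for k in range(len(z)):
        zp, zm = z.copy(), z.copy()
        zp[k] += h
        zm[k] -= h
        g[k] = (loss(zp) - loss(zm)) / (2 * h)
    return g
```

The two gradients agree to within the finite-difference error, which is the practical payoff of the closed form: no per-coordinate loss evaluations are needed during training.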
