• Loss Source 1: cross-entropy loss, applied at the classifier of every stage
  • Loss Source 2: KL loss, with the deepest classifier acting as teacher for the shallower classifiers
  • Loss Source 3: L2 loss from hints, computed between the features of the deepest classifier and those of each shallow classifier; a bottleneck (feature adaptation) layer resizes the student's features to match the teacher's
    Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation
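The three loss sources above can be sketched as a single training objective for one shallow classifier. This is a minimal, framework-free illustration, not the paper's implementation: the weights `alpha`, `beta` and temperature `T` are hypothetical hyper-parameters, and the features are assumed to already be shape-matched by the bottleneck adaptation layer.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax over a list of logits."""
    scaled = [z / T for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    s = sum(exps)
    return [e / s for e in exps]

def self_distillation_loss(shallow_logits, deep_logits,
                           shallow_feat, deep_feat,
                           label, alpha=0.3, beta=0.03, T=3.0):
    """Combine the three loss sources for one shallow classifier.
    alpha, beta and T are illustrative values, not the paper's settings."""
    # Loss 1: cross entropy of the shallow classifier vs. the hard label
    ce = -math.log(softmax(shallow_logits)[label])
    # Loss 2: KL(deep || shallow) on temperature-softened distributions,
    # the deepest classifier acting as teacher
    pt = softmax(deep_logits, T)
    ps = softmax(shallow_logits, T)
    kl = sum(p * (math.log(p) - math.log(q)) for p, q in zip(pt, ps))
    # Loss 3: L2 "hint" loss between features, assumed already matched
    # in shape by the bottleneck (feature adaptation) layer
    l2 = sum((a - b) ** 2 for a, b in zip(shallow_feat, deep_feat)) / len(shallow_feat)
    return (1 - alpha) * ce + alpha * kl + beta * l2
```

When the shallow branch exactly matches the deepest one, the KL and hint terms vanish and only the weighted cross-entropy term remains, which is a quick sanity check on the formulation.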