Loss Source 1: cross-entropy loss — the classifier at every stage is trained against the ground-truth labels.
Loss Source 2: KL loss — the deepest classifier acts as the teacher for the shallower classifiers.
Loss Source 3: L2 loss from hints — an L2 loss between the deep classifier's features and each shallow classifier's features; a bottleneck (i.e. feature adaptation) layer maps the student's features to the teacher's dimensionality so the two match.
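The three loss sources can be sketched as one training objective in PyTorch. This is a minimal illustration, not the paper's exact implementation: the function name, the loss weights `alpha`/`beta`, and the temperature `T` are all assumptions, and the feature-adaptation bottleneck is assumed to have already been applied so all exits' features share the teacher's shape.

```python
import torch
import torch.nn.functional as F

def self_distillation_loss(logits_list, feats_list, labels,
                           T=3.0, alpha=0.3, beta=0.03):
    """Combine the three loss sources for multi-exit self-distillation.

    logits_list: per-exit logits, ordered shallow -> deep.
    feats_list:  per-exit features after the adaptation bottleneck,
                 all with the same shape as the deepest exit's features.
    T, alpha, beta are illustrative hyperparameters (assumptions).
    """
    teacher_logits = logits_list[-1].detach()  # deepest exit = teacher
    teacher_feat = feats_list[-1].detach()
    loss = 0.0
    for logits, feat in zip(logits_list[:-1], feats_list[:-1]):
        # Source 1: cross entropy against the ground-truth labels
        ce = F.cross_entropy(logits, labels)
        # Source 2: KL divergence to the teacher's temperature-softened output
        kl = F.kl_div(F.log_softmax(logits / T, dim=1),
                      F.softmax(teacher_logits / T, dim=1),
                      reduction="batchmean") * T * T
        # Source 3: L2 hint loss between adapted student and teacher features
        l2 = F.mse_loss(feat, teacher_feat)
        loss = loss + (1 - alpha) * ce + alpha * kl + beta * l2
    # The deepest classifier itself only sees the cross-entropy loss
    return loss + F.cross_entropy(logits_list[-1], labels)
```

Detaching the teacher's logits and features stops gradients from the shallow exits flowing back into the deep branch through the distillation terms, so the teacher is shaped only by its own cross-entropy loss.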