Work on Multi-task Learning:
Contributions
- Proposed:
- exploiting richer unlabeled data
- using multi-task learning to provide complementary information, so that predictions become more accurate
- to address, respectively, the problems of prior work:
- they all require a sufficient amount of annotated data, but annotated data are all captured in limited scenes.
- in experiments, the authors found that previous algorithms ne-
Methods
- Multi-task CNN
A multi-task model that performs three tasks at once: shadow edge, shadow region, and shadow count. This effectively adds extra supervision (from both global and detail views), making predictions more accurate.
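A minimal numpy sketch of the idea of one shared trunk feeding three task heads, each adding its own supervision signal. This is not the paper's architecture: the layer sizes, loss weights, and head names here are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny multi-task net: shared trunk + three heads
# (edge map, region map, scalar count). Sizes are illustrative.
D_IN, D_HID = 8, 16

W_trunk = rng.normal(size=(D_IN, D_HID)) * 0.1
W_edge = rng.normal(size=(D_HID, D_IN)) * 0.1    # "per-pixel" edge scores
W_region = rng.normal(size=(D_HID, D_IN)) * 0.1  # "per-pixel" region scores
W_count = rng.normal(size=(D_HID, 1)) * 0.1      # scalar shadow count

def forward(x):
    h = np.maximum(x @ W_trunk, 0.0)  # shared features (ReLU)
    return h @ W_edge, h @ W_region, h @ W_count

def multitask_loss(x, y_edge, y_region, y_count):
    # Each head contributes its own term; the total is a weighted sum
    # (the 0.1 weight on the count term is a made-up value).
    e, r, c = forward(x)
    l_edge = np.mean((e - y_edge) ** 2)
    l_region = np.mean((r - y_region) ** 2)
    l_count = np.mean((c - y_count) ** 2)
    return l_edge + l_region + 0.1 * l_count

x = rng.normal(size=(4, D_IN))
loss = multitask_loss(x, np.zeros((4, D_IN)), np.zeros((4, D_IN)), np.zeros((4, 1)))
```

The point of the structure is that gradients from all three losses flow back into `W_trunk`, so the shared features are shaped by both global (count) and local (edge/region) supervision.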
- Multi-task mean teacher network
This is the overall architecture; the MT-CNN above is plugged into it.
Labelled data go into the student net and are compared against ground truth to compute a classification CE loss. Unlabelled data go into the student net, and a noised copy goes into the teacher net; the teacher net's output is then treated as the label. A consistency loss is computed to make the student's and teacher's outputs match, and adding noise acts as regularisation. The losses update the student's parameters; the teacher's parameters are then updated as an exponential moving average of the student's.
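The training step above can be sketched as follows on a toy linear classifier. This is only an illustration of the mechanics, not the paper's network: the learning rate, EMA decay, and noise scale are made-up values, and the consistency term here uses cross-entropy against the teacher's soft predictions (the original mean-teacher paper uses MSE between the two outputs).

```python
import numpy as np

rng = np.random.default_rng(1)

D, C = 5, 2  # toy input dim and number of classes
theta_student = rng.normal(size=(D, C)) * 0.1
theta_teacher = theta_student.copy()  # teacher starts as a copy of the student

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def train_step(x_lab, y_lab, x_unlab, lr=0.1, ema=0.99, noise=0.1):
    global theta_student, theta_teacher
    # Supervised CE loss on labelled data (student only);
    # the CE gradient w.r.t. logits is (p - y).
    p = softmax(x_lab @ theta_student)
    grad_sup = x_lab.T @ (p - y_lab) / len(x_lab)
    # Consistency: student sees the clean input, teacher sees a noised copy;
    # the teacher's soft output is used as the target.
    p_s = softmax(x_unlab @ theta_student)
    p_t = softmax((x_unlab + noise * rng.normal(size=x_unlab.shape)) @ theta_teacher)
    grad_con = x_unlab.T @ (p_s - p_t) / len(x_unlab)
    # Only the student is updated by gradients...
    theta_student = theta_student - lr * (grad_sup + grad_con)
    # ...then the teacher tracks the student via an exponential moving average.
    theta_teacher = ema * theta_teacher + (1 - ema) * theta_student

x_lab = rng.normal(size=(4, D))
y_lab = np.eye(C)[rng.integers(0, C, size=4)]  # one-hot labels
x_unlab = rng.normal(size=(8, D))
for _ in range(10):
    train_step(x_lab, y_lab, x_unlab)
```

Because of the EMA, the teacher is a smoothed, lagging average of student checkpoints, which is what makes its outputs stable enough to serve as pseudo-labels.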
This semi-supervised approach comes from the NeurIPS 2017 paper "Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results." (I haven't fully understood it yet; need to reread how the mean teacher works.)
Results
Evaluated on 3 datasets.