Multi-Sample Interpolation Training Method


Liang, Daojun, Feng Yang, and Xiuping Wang. "Multi-Sample Interpolation Training Method." Proceedings of the 2018 International Conference on Machine Learning Technologies. 2018. http://liangdaojun.github.io/files/p1-MSITM.pdf

Published in Proceedings of the 2018 International Conference on Machine Learning Technologies, 2018


Abstract: The mixup training method achieves better generalization than traditional training, but there has been no clear explanation of why it generalizes so well. In this paper, we first conduct a series of ablation experiments showing that mixup training is equivalent to a combination of regularization and data augmentation. We then propose several multi-sample training methods as variants of mixup, which achieve performance comparable to the original; this demonstrates that different mixing schemes can produce the same effect as the original mixup training method. Next, we propose a network architecture that can classify multiple samples simultaneously. This architecture shows that mixup does not merely learn a linear interpolation between two categories, but learns to separate the two categories more accurately. Finally, through fine-tuning of mixup, the training accuracy can be further improved.
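For reference, the sketch below shows the standard two-sample mixup update (Zhang et al., ICLR 2018) that this work builds on, together with one hypothetical k-sample generalization using Dirichlet-distributed mixing weights. The function names (`mixup_batch`, `multi_sample_mix`) and the Dirichlet construction are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def mixup_batch(x, y, alpha=1.0, rng=None):
    """Standard two-sample mixup (Zhang et al., ICLR 2018).

    x: input batch, shape (batch, ...); y: one-hot labels, shape (batch, classes).
    Draws lambda ~ Beta(alpha, alpha) and mixes each example with a random partner.
    """
    rng = rng or np.random.default_rng()
    lam = rng.beta(alpha, alpha)                 # mixing coefficient in [0, 1]
    perm = rng.permutation(len(x))               # random pairing within the batch
    x_mixed = lam * x + (1.0 - lam) * x[perm]    # interpolate inputs
    y_mixed = lam * y + (1.0 - lam) * y[perm]    # interpolate labels identically
    return x_mixed, y_mixed

def multi_sample_mix(x, y, k=3, alpha=1.0, rng=None):
    """Hypothetical k-sample generalization: a Dirichlet-weighted convex
    combination of k randomly paired examples. Illustrative only; the
    paper's multi-sample variants may differ in detail.
    """
    rng = rng or np.random.default_rng()
    weights = rng.dirichlet([alpha] * k)         # k nonnegative weights summing to 1
    x_mixed = np.zeros_like(x, dtype=float)
    y_mixed = np.zeros_like(y, dtype=float)
    for w in weights:
        perm = rng.permutation(len(x))           # independent pairing per component
        x_mixed += w * x[perm]
        y_mixed += w * y[perm]
    return x_mixed, y_mixed

# Example: mix a toy batch of 4 two-class examples.
x = np.random.rand(4, 8).astype(np.float32)
y = np.eye(2)[[0, 1, 0, 1]]
xm, ym = mixup_batch(x, y, alpha=0.2)
```

Note that for k = 2 the Dirichlet(alpha, alpha) weights reduce to a single Beta(alpha, alpha) coefficient, so the k-sample form contains standard mixup as a special case.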