ACML 2020 🇹🇭

Bridging Ordinary-Label Learning and Complementary-Label Learning

By Yasuhiro Katsura and Masato Uchida

Abstract

A supervised learning framework has been proposed for the situation where each training data point is provided with a complementary label, that is, a class to which the pattern does not belong. In the existing literature, complementary-label learning has been studied independently of ordinary-label learning, which assumes that each training data point is provided with a label representing the class to which the pattern belongs. However, providing a complementary label should be treated as equivalent to providing the rest of the labels as candidates for the one true class. In this paper, we focus on the fact that the loss functions for one-versus-all and pairwise classification corresponding to ordinary-label learning and complementary-label learning satisfy certain additivity and duality, and we provide a framework that directly bridges these existing supervised learning frameworks. Furthermore, we derive the classification risk and an error bound for any loss functions that satisfy additivity and duality.
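To make the candidate-set view concrete, here is a minimal sketch of how a complementary label can be treated as the remaining K-1 labels acting as candidates for the true class. The function names, the logistic surrogate loss, and the simple averaging scheme are illustrative assumptions for this sketch, not the paper's exact loss definitions or weighting.

```python
import numpy as np

def logistic(z):
    """Logistic surrogate loss (an illustrative choice of binary loss)."""
    return np.log(1.0 + np.exp(-z))

def ova_ordinary_loss(scores, y, ell=logistic):
    """One-versus-all style loss with an ordinary label y:
    push the score of class y up and the scores of all other classes down."""
    K = len(scores)
    others = [k for k in range(K) if k != y]
    return ell(scores[y]) + np.mean([ell(-scores[k]) for k in others])

def ova_complementary_loss(scores, y_bar, ell=logistic):
    """One-versus-all style loss with a complementary label y_bar:
    treat the remaining K-1 classes as candidates for the true class
    and average the ordinary loss over those candidates."""
    K = len(scores)
    candidates = [k for k in range(K) if k != y_bar]
    return np.mean([ova_ordinary_loss(scores, k, ell) for k in candidates])

# Example: 4 classes, classifier scores, complementary label "not class 2".
scores = np.array([1.2, -0.3, 0.5, -1.0])
print(ova_complementary_loss(scores, y_bar=2))
```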