

Divergence estimators based on direct approximation of density ratios, without going through separate approximation of the numerator and denominator densities, have been successfully applied to machine learning tasks that involve distribution comparison, such as outlier detection, transfer learning, and two-sample homogeneity testing. However, since density-ratio functions often fluctuate sharply, divergence estimation is a challenging task in practice. In this letter, we use relative divergences for distribution comparison, which involves approximation of relative density ratios. Since relative density ratios are always smoother than the corresponding ordinary density ratios, our proposed method is favorable in terms of nonparametric convergence speed. Furthermore, we show that the proposed divergence estimator has asymptotic variance independent of the model complexity under a parametric setup, implying that the proposed estimator hardly overfits even with complex models. Through experiments, we demonstrate the usefulness of the proposed approach.
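
For concreteness, the alpha-relative density ratio between densities p and q is commonly defined as r_alpha(x) = p(x) / (alpha * p(x) + (1 - alpha) * q(x)); for alpha > 0 it is bounded above by 1/alpha, which is the sense in which it is smoother than the ordinary ratio p(x)/q(x). The NumPy sketch below fits such a ratio by regularized least squares with a Gaussian kernel model, in the spirit of the direct-approximation approach the abstract describes. The model choice, hyperparameter values, and function names are illustrative assumptions, not necessarily the authors' exact formulation.

    import numpy as np

    def fit_relative_density_ratio(x_num, x_den, alpha=0.1, sigma=1.0, lam=0.1):
        # Fit r_alpha(x) = p(x) / (alpha*p(x) + (1-alpha)*q(x)) by regularized
        # least squares, using a linear model with Gaussian kernels centered
        # at the numerator samples. Hyperparameter values are illustrative.
        def phi(x, centers):
            # Gaussian kernel design matrix of shape (n_points, n_centers).
            d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            return np.exp(-d2 / (2.0 * sigma ** 2))

        Phi_num = phi(x_num, x_num)   # features of numerator samples
        Phi_den = phi(x_den, x_num)   # features of denominator samples

        # Second-moment matrix under the alpha-mixture of the two samples,
        # and mean feature vector under the numerator sample.
        H = (alpha * Phi_num.T @ Phi_num / len(x_num)
             + (1.0 - alpha) * Phi_den.T @ Phi_den / len(x_den))
        h = Phi_num.mean(axis=0)

        # Analytic solution of the regularized least-squares problem.
        theta = np.linalg.solve(H + lam * np.eye(len(h)), h)
        return lambda x: phi(x, x_num) @ theta

    # Usage: two one-dimensional Gaussian samples with shifted means.
    rng = np.random.default_rng(0)
    x_p = rng.normal(0.0, 1.0, size=(200, 1))
    x_q = rng.normal(0.5, 1.0, size=(200, 1))
    r = fit_relative_density_ratio(x_p, x_q, alpha=0.1)
    print(r(x_p[:5]))   # estimated alpha-relative ratio at a few points;
                        # the true r_alpha is bounded above by 1/alpha = 10.

A relative divergence estimate can then be assembled from the fitted ratio evaluated on the two samples; the precise divergence formula is part of the letter itself and is not reproduced here.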

Makoto Yamada
NTT Communication Science Laboratories, NTT Corporation, Seika-cho, Kyoto, 619-0237, Japan
Taiji Suzuki
Department of Mathematical Informatics, University of Tokyo, Bunkyo-ku, Tokyo, 113-8656, Japan
Takafumi Kanamori
Department of Computer Science and Mathematical Informatics, Nagoya University, Chikusa-ku, Nagoya, Aichi, 464-8601, Japan
Hirotaka Hachiya
Department of Computer Science, Tokyo Institute of Technology, Meguro-ku, Tokyo, 152-8552, Japan
Masashi Sugiyama
Department of Computer Science, Tokyo Institute of Technology, Meguro-ku, Tokyo, 152-8552, Japan
