ACML 2020

CCA-Flow: Deep Multi-view Subspace Learning with Inverse Autoregressive Flow

By Jia He, Feiyang Pan, Fuzhen Zhuang, and Qing He

Abstract

Multi-view subspace learning aims to learn a shared representation from multiple sources or views of an entity. The learned representation enables reconstruction of common patterns of multi-view data, which helps dimensionality reduction, exploratory data analysis, missing-view completion, and various downstream tasks. However, existing methods often use simple structured approximations of the posterior of the shared latent variables for the sake of computational efficiency. Such oversimplified models can significantly degrade inference quality and hurt representation power. To this end, we propose a new method for multi-view subspace learning that achieves efficient Bayesian inference with strong representation power. Our method, coined CCA-Flow, is based on variational Canonical Correlation Analysis and models the inference network as an Inverse Autoregressive Flow (IAF). With flow-based variational inference imposed on the latent variables, the posterior approximations can be arbitrarily complex and flexible, while the model can still be efficiently trained with stochastic gradient descent. Experiments on three benchmark multi-view datasets show that our model yields improved representations of the shared latent variables and outperforms previous methods.
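To make the flow-based posterior concrete, below is a minimal sketch of one Inverse Autoregressive Flow step in the style of Kingma et al. (2016), written in PyTorch. The class names (`MaskedLinear`, `IAFStep`), layer sizes, and the use of a single masked linear layer are illustrative assumptions for exposition, not the authors' implementation; the paper's actual inference network and its integration with variational CCA may differ.

```python
import torch
import torch.nn as nn


class MaskedLinear(nn.Linear):
    """Linear layer whose weight is masked so output i only sees inputs < i."""

    def __init__(self, in_features, out_features, mask):
        super().__init__(in_features, out_features)
        self.register_buffer("mask", mask)

    def forward(self, x):
        return nn.functional.linear(x, self.weight * self.mask, self.bias)


class IAFStep(nn.Module):
    """One IAF transformation: z' = sigma * z + (1 - sigma) * m,
    where (m, s) are produced autoregressively from z and a context vector h."""

    def __init__(self, dim, context_dim):
        super().__init__()
        # Strictly lower-triangular mask: element i of (m, s) depends only on z_{<i}.
        mask = torch.tril(torch.ones(dim, dim), diagonal=-1)
        mask = mask.repeat(2, 1)                        # rows for both m and s
        self.masked = MaskedLinear(dim, 2 * dim, mask)
        self.context = nn.Linear(context_dim, 2 * dim)  # context h enters unmasked

    def forward(self, z, h):
        m, s = (self.masked(z) + self.context(h)).chunk(2, dim=-1)
        sigma = torch.sigmoid(s + 1.0)                  # bias toward identity at init
        z_new = sigma * z + (1.0 - sigma) * m
        log_det = torch.log(sigma).sum(dim=-1)          # triangular Jacobian
        return z_new, log_det


# Usage: refine a diagonal-Gaussian sample with a few chained IAF steps.
dim, context_dim, batch = 8, 16, 4
flow = nn.ModuleList([IAFStep(dim, context_dim) for _ in range(3)])
z = torch.randn(batch, dim)          # z_0 ~ q(z|x), reparameterized sample
h = torch.randn(batch, context_dim)  # context produced by the encoder
log_q_correction = torch.zeros(batch)
for step in flow:
    z, log_det = step(z, h)
    log_q_correction = log_q_correction - log_det  # log q(z_T|x) = log q(z_0|x) - sum log|det J|
```

Because each step's Jacobian is triangular, the density of the transformed posterior stays cheap to evaluate, which is what keeps training with stochastic gradient descent efficient even though the posterior itself can be highly non-Gaussian.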