Unsupervised Correlation Analysis
by
Yedid Hoshen, Lior Wolf
2018
Abstract
Linking between two data sources is a basic building block in numerous
computer vision problems. In this paper, we set out to answer a fundamental
cognitive question: are prior correspondences necessary for linking between
different domains?
One of the most popular methods for linking between domains is Canonical
Correlation Analysis (CCA). All current CCA algorithms require correspondences
between the views. We introduce a new method, Unsupervised Correlation
Analysis (UCA), which requires no prior correspondences between the two
domains. The correlation-maximization term in CCA is replaced by a combination
of a reconstruction term (similar to autoencoders), a full cycle loss, an
orthogonality term, and multiple domain-confusion terms. Due to the lack of
supervision, the
optimization leads to multiple alternative solutions with similar scores and we
therefore introduce a consensus-based mechanism that is often able to recover
the desired solution. Remarkably, this suffices to link remote domains
such as text and images. We also present results on well-accepted CCA
benchmarks, showing that performance far exceeds other unsupervised baselines
and approaches supervised performance in some cases.
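The classical CCA baseline that UCA relaxes can be sketched in a few lines of NumPy: with paired rows (the correspondences that UCA does away with), the canonical directions come from an SVD of the whitened cross-covariance between the two views, and the singular values are the canonical correlations. This is a minimal illustrative sketch, not the paper's implementation; all names and the toy data are assumptions.

```python
import numpy as np

def inv_sqrt(C):
    """Inverse matrix square root via eigendecomposition (C symmetric PD)."""
    vals, vecs = np.linalg.eigh(C)
    return vecs @ np.diag(vals ** -0.5) @ vecs.T

def cca(X, Y, reg=1e-6):
    """Classical CCA; requires the rows of X and Y to be paired samples."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])  # regularized view covariances
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n                             # cross-covariance (needs pairing)
    # Canonical directions: SVD of the whitened cross-covariance.
    U, s, Vt = np.linalg.svd(inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy))
    Wx = inv_sqrt(Cxx) @ U       # projection for view X
    Wy = inv_sqrt(Cyy) @ Vt.T    # projection for view Y
    return Wx, Wy, s             # s: canonical correlations, sorted descending

# Toy data (hypothetical): two 3-D views sharing one noisy latent coordinate.
rng = np.random.default_rng(0)
z = rng.normal(size=(500, 1))
X = np.hstack([z, rng.normal(size=(500, 2))])
Y = np.hstack([z + 0.1 * rng.normal(size=(500, 1)), rng.normal(size=(500, 2))])
Wx, Wy, corrs = cca(X, Y)  # top canonical correlation is close to 1
```

The key point is the cross-covariance `Cxy`: it is only computable when the two views come pre-aligned. UCA drops that requirement, substituting reconstruction, cycle, orthogonality, and domain-confusion losses for the correlation-maximization term.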
Archived Files and Locations
application/pdf, 2.3 MB
arxiv.org (repository)
web.archive.org (webarchive)
arXiv: 1804.00347v1