Deep One-Class Classification Using Intra-Class Splitting release_jsin76ijy5gxdf5vaurkt7nyoy

by Patrick Schlachter, Yiwen Liao, Bin Yang

Released as an article.

2019  

Abstract

This paper introduces a generic method that enables conventional deep neural networks to be used as end-to-end one-class classifiers. The method is based on splitting the given data from one class into two subsets. In one-class classification, only samples of one normal class are available for training. During inference, a closed and tight decision boundary around the training samples is sought, which conventional binary or multi-class neural networks cannot provide. By splitting the data into typical and atypical normal subsets, the proposed method can use a binary loss and defines an auxiliary subnetwork that enforces distance constraints in the latent space. Various experiments on three well-known image datasets showed the effectiveness of the proposed method, which outperformed seven baselines and performed better than or comparably to the state of the art.
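The abstract describes the core idea of intra-class splitting: the normal training data are divided into a typical and an atypical subset so that a conventional binary loss can be applied. The sketch below illustrates only that splitting step, assuming a per-sample normality score (for example, an autoencoder's reconstruction similarity) and a user-chosen atypical ratio; both the scoring function and the ratio are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def intra_class_split(scores, atypical_ratio=0.1):
    """Split one-class training data into typical and atypical subsets.

    `scores` holds a per-sample normality score (assumed here to come from
    an autoencoder's reconstruction similarity; the abstract does not fix
    the concrete score). The samples with the lowest scores are treated as
    atypical and receive the negative proxy label, so a conventional binary
    loss can be used downstream.
    """
    scores = np.asarray(scores)
    n_atypical = int(round(atypical_ratio * len(scores)))
    order = np.argsort(scores)                 # ascending: least typical first
    labels = np.ones(len(scores), dtype=int)   # 1 = typical (normal)
    labels[order[:n_atypical]] = 0             # 0 = atypical (proxy "abnormal")
    return labels

# Usage: pseudo-scores for 10 training samples of the single normal class.
rng = np.random.default_rng(0)
print(intra_class_split(rng.random(10), atypical_ratio=0.2))
```

In the full method, these proxy labels would feed a binary classifier whose latent space is additionally regularized by the auxiliary distance-constraint subnetwork mentioned in the abstract.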

Archived Files and Locations

application/pdf  288.5 kB
file_kipl3tjh65fm3mkybvc3upen5a
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2019-03-12
Version   v2
Language   en
arXiv  1902.01194v2
Work Entity
access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: 26842c89-aef0-4277-b1f7-f48f70a55328