Deep Lesion Graphs in the Wild: Relationship Learning and Organization
of Significant Radiology Image Findings in a Diverse Large-scale Lesion
Database
by
Ke Yan, Xiaosong Wang, Le Lu, Ling Zhang, Adam Harrison, Mohammadhadi
Bagheri, Ronald Summers
2018
Abstract
Radiologists in their daily work routinely find and annotate significant
abnormalities on a large number of radiology images. Such abnormalities, or
lesions, have been collected over the years and stored in hospitals' picture
archiving and communication systems. However, they are largely unsorted and
lack semantic annotations such as type and location. In this paper, we aim to
organize and explore them by learning a deep feature representation for each
lesion. A large-scale and comprehensive dataset, DeepLesion, is introduced for
this task. DeepLesion contains bounding boxes and size measurements of over 32K
lesions. To model their similarity relationships, we leverage multiple sources
of supervision, including lesion types, self-supervised location coordinates,
and sizes. These cues require little manual annotation effort yet describe
useful attributes of the lesions. A triplet network is then used to learn
lesion embeddings, with a sequential sampling strategy to depict their
hierarchical similarity structure. Experiments show promising qualitative and
quantitative results on lesion retrieval, clustering, and classification. The
learned embeddings can further be employed to build a lesion graph for various
clinically useful applications. We propose algorithms for intra-patient lesion
matching and missing annotation mining, and experimental results validate
their effectiveness.
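The abstract mentions training a triplet network to learn lesion embeddings. As a minimal illustrative sketch only (not the authors' implementation; the function name, plain-list vectors, and margin value are assumptions), a standard triplet margin loss, which pulls an anchor embedding toward a similar "positive" lesion and away from a dissimilar "negative" one, can be written as:

```python
def triplet_loss(anchor, positive, negative, margin=1.0):
    """Standard triplet margin loss over embedding vectors (plain lists).

    The loss is zero once the anchor is closer to the positive than to
    the negative by at least `margin` (squared Euclidean distance).
    """
    d_pos = sum((a - p) ** 2 for a, p in zip(anchor, positive))
    d_neg = sum((a - n) ** 2 for a, n in zip(anchor, negative))
    return max(0.0, d_pos - d_neg + margin)

# Toy example: the anchor is already much closer to the positive,
# so the margin constraint is satisfied and the loss is zero.
a = [0.0, 0.0]
p = [0.1, 0.0]
n = [2.0, 0.0]
loss = triplet_loss(a, p, n)  # 0.01 - 4.0 + 1.0 < 0  →  0.0
```

In the paper's setting, the positive/negative choices would be driven by the supervision cues listed above (type, location, size), with the sequential sampling strategy selecting triplets at several levels of similarity; that selection logic is not shown here.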
Archived Files and Locations
application/pdf 6.2 MB
arxiv.org (repository); web.archive.org (webarchive)
1711.10535v2