Synthetic data augmentation using GAN for improved liver lesion classification

Maayan Frid-Adar, Eyal Klang, Michal Amitai, Jacob Goldberger, Hayit Greenspan

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this paper, we present a data augmentation method that generates synthetic medical images using Generative Adversarial Networks (GANs). We propose a training scheme that first uses classical data augmentation to enlarge the training set and then further increases the data size and its diversity by applying GAN techniques for synthetic data augmentation. Our method is demonstrated on a limited dataset of computed tomography (CT) images of 182 liver lesions (53 cysts, 64 metastases and 65 hemangiomas). The classification performance using only classic data augmentation yielded 78.6% sensitivity and 88.4% specificity. By adding the synthetic data augmentation, the results increased significantly to 85.7% sensitivity and 92.4% specificity.
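The two-stage scheme in the abstract — classic transforms first, then extra synthetic samples from a generative model — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, patch size, and the noise-based stand-in for a trained GAN generator are all assumptions.

```python
import numpy as np

def classic_augment(patches):
    """Enlarge a set of 2-D lesion patches with rotations and flips
    (a typical 'classic' augmentation set: 4 rotations x 2 flips = 8x)."""
    out = []
    for p in patches:
        for k in range(4):               # 0/90/180/270-degree rotations
            r = np.rot90(p, k)
            out.append(r)
            out.append(np.fliplr(r))     # plus a horizontal flip of each
    return out

def gan_augment(n_samples, patch_shape, rng):
    """Stand-in for sampling a trained GAN generator; here it just draws
    random noise so the pipeline runs end to end."""
    return [rng.standard_normal(patch_shape) for _ in range(n_samples)]

rng = np.random.default_rng(0)
# 182 lesion ROIs, matching the dataset size reported in the abstract
real = [rng.standard_normal((64, 64)) for _ in range(182)]
stage1 = classic_augment(real)                    # 182 * 8 = 1456 patches
stage2 = stage1 + gan_augment(500, (64, 64), rng) # add synthetic samples
print(len(stage1), len(stage2))
```

In the paper's setting, `gan_augment` would be replaced by sampling a GAN trained on the (already classically augmented) lesion patches; the classifier is then trained on the combined pool.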

Original language: English
Title of host publication: 2018 IEEE 15th International Symposium on Biomedical Imaging, ISBI 2018
Publisher: IEEE Computer Society
Pages: 289-293
Number of pages: 5
ISBN (Electronic): 9781538636367
DOIs
State: Published - 23 May 2018
Event: 15th IEEE International Symposium on Biomedical Imaging, ISBI 2018 - Washington, United States
Duration: 4 Apr 2018 - 7 Apr 2018

Publication series

Name: Proceedings - International Symposium on Biomedical Imaging
Volume: 2018-April

Conference

Conference: 15th IEEE International Symposium on Biomedical Imaging, ISBI 2018
Country/Territory: United States
City: Washington
Period: 4/04/18 - 7/04/18

Keywords

  • Data augmentation
  • Generative adversarial network
  • Image synthesis
  • Lesion classification
  • Liver lesions

All Science Journal Classification (ASJC) codes

  • Biomedical Engineering
  • Radiology, Nuclear Medicine and Imaging
