Deep learning for actinic keratosis classification

1 Information Engineering, University of Padua, Via Gradenigo 6, 35131 Padova, Italy
2 Faculty of Medicine and Health Technology, Tampere University, Tampere, Finland

Classification of biological images plays a crucial role in many biological problems, e.g. recognition of cell phenotypes and maturation levels, localization of cell organelles, and histopathological classification, and holds the potential to support early diagnosis, which is critical in disease prevention. In this paper, we tested different ensembles of canonical and deep classifiers to provide accurate identification of actinic keratosis (AK), one of the most common skin lesions, which can degenerate into lethal squamous cell carcinomas.
We used a clinical image dataset to build and test different ensembles of support vector machines trained on handcrafted descriptors and convolutional neural networks (CNNs), for which we experimented with different learning rates, augmentation techniques (e.g. warping), and topologies.
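Warping augmentation, mentioned above, perturbs pixel coordinates with a smooth displacement field so that each training image yields slightly deformed variants. The sketch below is a minimal illustrative version in pure Python (nearest-neighbour sampling, sinusoidal displacement with made-up `amp`/`freq` values); it is not the authors' MATLAB implementation:

```python
import math
import random

def warp_image(img, amp=1.5, freq=0.35, seed=0):
    """Warp a 2-D grayscale image (list of lists of pixel values) by a
    smooth sinusoidal displacement of source coordinates, sampled with
    nearest-neighbour lookup. amp and freq control the warp strength
    and spatial scale (illustrative values)."""
    rng = random.Random(seed)
    h, w = len(img), len(img[0])
    # Random phases make each seed produce a different deformation.
    px, py = rng.uniform(0, math.pi), rng.uniform(0, math.pi)
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Displaced source coordinates for this output pixel.
            sx = x + amp * math.sin(freq * y + px)
            sy = y + amp * math.cos(freq * x + py)
            # Clamp to the image bounds, then sample nearest neighbour.
            sx = min(max(int(round(sx)), 0), w - 1)
            sy = min(max(int(round(sy)), 0), h - 1)
            out[y][x] = img[sy][sx]
    return out

# Tiny synthetic 8x8 gradient image as a stand-in for a clinical photo.
img = [[(x + y) % 256 for x in range(8)] for y in range(8)]
warped = warp_image(img)
```

Because the displacement only rearranges existing pixels, the warped image keeps the original size and value range, which is convenient when feeding fixed-size crops to a CNN.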
Our results show that the proposed ensemble obtains performance comparable to the state of the art. To reproduce the experiments reported in this paper, the MATLAB code of all the descriptors is available at https://github.com/LorisNanni.
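Ensembles like those described above typically combine heterogeneous classifiers (SVMs on handcrafted descriptors, CNNs) at the score level. The sketch below shows a generic weighted sum-rule fusion; the scores, weights, and two-class setup are illustrative assumptions, not the paper's exact fusion scheme:

```python
def fuse(score_lists, weights=None):
    """Sum-rule fusion: normalize each classifier's class scores to a
    distribution, then accumulate a weighted sum per class."""
    weights = weights or [1.0] * len(score_lists)
    n_classes = len(score_lists[0])
    fused = [0.0] * n_classes
    for w, scores in zip(weights, score_lists):
        total = sum(scores)  # assumed positive scores
        for c in range(n_classes):
            fused[c] += w * scores[c] / total
    return fused

def predict(score_lists):
    """Return the index of the class with the highest fused score."""
    fused = fuse(score_lists)
    return max(range(len(fused)), key=fused.__getitem__)

# Hypothetical scores for classes [not-AK, AK] from two classifiers.
svm_scores = [0.6, 0.4]   # SVM on handcrafted descriptors
cnn_scores = [0.2, 0.8]   # fine-tuned CNN
print(predict([svm_scores, cnn_scores]))  # prints 1 (the AK class)
```

Normalizing per classifier before summing keeps one model's larger score magnitudes from dominating the vote, which matters when mixing SVM margins with CNN softmax outputs.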

© 2020 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
