Lepore, D., Ji, M. H., Pagliara, M. M., Lenkowicz, J., Capocchiano, N. D., Tagliaferri, L., Boldrini, L., Valentini, V., Damiani, A., Convolutional neural network based on fluorescein angiography images for retinopathy of prematurity management, Translational Vision Science & Technology, 2020; 9 (2): 1-8. [doi:10.1167/tvst.9.2.37] [http://hdl.handle.net/10807/207150]

Convolutional neural network based on fluorescein angiography images for retinopathy of prematurity management

Lepore D.; Ji M. H.; Pagliara M. M.; Lenkowicz J.; Capocchiano N. D.; Tagliaferri L.; Boldrini L.; Valentini V.; Damiani A.
2020

Abstract

Purpose: To explore the use of fluorescein angiography (FA) images in a convolutional neural network (CNN) for the management of retinopathy of prematurity (ROP).

Methods: The dataset comprised 835 FA images of 149 eyes (90 patients), where each eye was associated with a binary outcome (57 “untreated” eyes and 92 “treated”; 308 “untreated” images and 527 “treated”). Image resolution was 1600 × 1200 px in 20% of cases and 640 × 480 px in the remaining 80%. All images were resized to 640 × 480 px before training; no other preprocessing was applied. A CNN with four convolutional layers was trained on a randomly chosen 90% of the images (n = 752), and prediction accuracy was assessed on the remaining 10% (n = 83). Keras version 2.2.0 for R with TensorFlow backend version 1.11.0 was used for the analysis.

Results: After 100 epochs, validation accuracy was 0.88 and training accuracy was 0.97. The receiver operating characteristic (ROC) curve had an area under the curve (AUC) of 0.91.

Conclusions: Our study showed, we believe for the first time, the applicability of artificial intelligence (CNN) technology to FA-driven ROP management. Further studies are needed to explore other fields of application for this technology.

Translational Relevance: This algorithm is the basis for a system that could be applied both to ROP and to experimental oxygen-induced retinopathy.
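The random 90/10 train/validation split described in the Methods can be sketched as follows. This is a minimal illustration in Python (the authors actually used Keras for R), and the helper name and seed are illustrative assumptions, not details from the paper:

```python
import random

def split_dataset(n_images=835, train_frac=0.9, seed=42):
    """Randomly partition image indices into training and validation sets,
    mirroring the 90/10 split described in the abstract (752 / 83 images)."""
    idx = list(range(n_images))
    random.Random(seed).shuffle(idx)  # reproducible shuffle
    n_train = round(n_images * train_frac)
    return idx[:n_train], idx[n_train:]

train_idx, val_idx = split_dataset()
print(len(train_idx), len(val_idx))  # 752 83
```

Note that in the study the split was over images rather than eyes or patients, so images from the same eye could in principle fall on both sides of the split.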
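The reported AUC of 0.91 summarizes the ROC curve as the probability that a randomly chosen “treated” image receives a higher predicted score than a randomly chosen “untreated” one. A minimal sketch of that computation via the Mann-Whitney statistic, with invented toy labels and scores (not the study's data):

```python
def roc_auc(labels, scores):
    """AUC as the Mann-Whitney U statistic: the fraction of
    positive/negative pairs ranked correctly (ties count as half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: perfectly separated scores give AUC = 1.0
print(roc_auc([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9]))  # 1.0
```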
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: http://hdl.handle.net/10807/207150
Citations
  • PubMed Central 1
  • Scopus 4
  • Web of Science (ISI) 4