ALSE Repository of Iași University of Life Sciences, ROMANIA

Multi-Convolutional Neural Network-Based Diagnostic Software for the Presumptive Determination of Non-Dermatophyte Molds


dc.contributor.author Milanović, Mina
dc.contributor.author Otašević, Suzana
dc.contributor.author Ranđelović, Marina
dc.contributor.author Grassi, Andrea
dc.contributor.author Cafarchia, Claudia
dc.contributor.author Mareș, Mihai
dc.contributor.author Milosavljević, Aleksandar
dc.date.accessioned 2024-10-18T10:01:08Z
dc.date.available 2024-10-18T10:01:08Z
dc.date.issued 2024-01-31
dc.identifier.citation Milanović, Mina, Suzana Otašević, Marina Ranđelović, Andrea Grassi, Claudia Cafarchia, Mihai Mareș, and Aleksandar Milosavljević. 2024. "Multi-Convolutional Neural Network-Based Diagnostic Software for the Presumptive Determination of Non-Dermatophyte Molds" Electronics 13, no. 3: 594. https://doi.org/10.3390/electronics13030594 en_US
dc.identifier.uri https://www.mdpi.com/2079-9292/13/3/594
dc.identifier.uri https://repository.iuls.ro/xmlui/handle/20.500.12811/4703
dc.description.abstract According to literature data, the incidence of superficial and invasive non-dermatophyte mold infections (NDMIs) has increased. Many of these infections are undiagnosed or misdiagnosed, leading to inadequate treatment and, in some cases, critical conditions or even patient mortality. Accurate diagnosis of these infections requires complex mycological analyses and considerable operator skill, so simpler, faster, and more efficient mycological tests are still needed to overcome the limitations of conventional fungal diagnostic procedures. In this study, software has been developed to provide an efficient mycological diagnosis using a trained convolutional neural network (CNN) model as its core classifier. Using the EfficientNet-B2 architecture and permanent slides of NDMs isolated from patients' materials (personal archive of Prof. Otašević, Department of Microbiology and Immunology, Medical Faculty, University of Niš, Serbia), a multi-CNN model was trained and integrated into the diagnostic tool; the main model achieved 93.73% accuracy. Grad-CAM visualization was used to further validate the model's pattern recognition. The software, which makes the final diagnosis by majority rule, was tested with images provided by different European laboratories and showed almost faultless accuracy across the test images. en_US
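
The following is a minimal illustrative sketch, not the authors' code, of the two techniques named in the abstract: an EfficientNet-B2 backbone with a replaced classification head, and a majority-rule vote over several per-image predictions. The class labels, preprocessing parameters, and number of networks are assumptions made for illustration only.

    # Sketch of an EfficientNet-B2 classifier plus majority-rule aggregation,
    # assuming PyTorch/torchvision; not the published implementation.
    from collections import Counter

    import torch
    import torch.nn as nn
    from torchvision import models, transforms

    NDM_CLASSES = ["genus_A", "genus_B", "genus_C"]  # placeholder labels

    def build_classifier(num_classes: int) -> nn.Module:
        """EfficientNet-B2 backbone with a new classification head."""
        net = models.efficientnet_b2(weights=models.EfficientNet_B2_Weights.DEFAULT)
        in_features = net.classifier[1].in_features
        net.classifier[1] = nn.Linear(in_features, num_classes)
        return net

    preprocess = transforms.Compose([
        transforms.Resize((288, 288)),  # input size assumed from the B2 defaults
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    @torch.no_grad()
    def majority_vote(networks, images) -> str:
        """Run every network on every image and return the most frequent class."""
        votes = []
        for net in networks:
            net.eval()
            for img in images:  # img: PIL.Image of a microscopy slide
                logits = net(preprocess(img).unsqueeze(0))
                votes.append(NDM_CLASSES[int(logits.argmax(dim=1))])
        return Counter(votes).most_common(1)[0][0]

As the abstract describes, the diagnostic tool combines several such CNN predictions over microscopy images of an isolate and reports the most frequent class as the presumptive identification; Grad-CAM heatmaps are used separately to check which image regions drive each prediction.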
dc.language.iso en en_US
dc.publisher MDPI en_US
dc.rights CC BY 4.0
dc.rights.uri https://creativecommons.org/licenses/by/4.0/
dc.subject fungal infection en_US
dc.subject mold identification en_US
dc.subject deep learning en_US
dc.subject Grad-CAM en_US
dc.title Multi-Convolutional Neural Network-Based Diagnostic Software for the Presumptive Determination of Non-Dermatophyte Molds en_US
dc.type Article en_US
dc.author.affiliation Mina Milanović, Aleksandar Milosavljević, Faculty of Electronic Engineering, University of Niš, Aleksandra Medvedeva 14, 18000 Niš, Serbia
dc.author.affiliation Suzana Otašević, Marina Ranđelović, Department of Microbiology and Immunology, Faculty of Medicine, University of Niš, 18000 Niš, Serbia
dc.author.affiliation Suzana Otašević, Marina Ranđelović, Center of Microbiology and Parasitology, Public Health Institute Niš, 18000 Niš, Serbia
dc.author.affiliation Andrea Grassi, Istituto Zooprofilattico Sperimentale della Lombardia e dell’Emilia Romagna, 27100 Pavia, Italy
dc.author.affiliation Claudia Cafarchia, Department of Veterinary Medicine, University of Bari, Valenzano, 70010 Bari, Italy
dc.author.affiliation Mihai Mareș, Laboratory of Antimicrobial Chemotherapy, Iași University of Life Sciences, 700490 Iași, Romania
dc.publicationName Electronics
dc.volume 13
dc.issue 3
dc.publicationDate 2024
dc.identifier.eissn 2079-9292
dc.identifier.doi https://doi.org/10.3390/electronics13030594

