Comparative Study Between Vision Transformer and EfficientNet on Marsh Grass Classification
DOI: https://doi.org/10.32473/flairs.36.133132

Keywords: Deep Learning, Convolutional Neural Networks, Environmental Monitoring, Vision Transformers, Machine Learning

Abstract
Due to rapidly changing ecosystems, effective environmental protection often calls for monitoring vegetation for signs of environmental change. Vegetation monitoring is essential for assessing changes and impacts to environmentally valuable ecosystems such as marshlands. While monitoring marsh grasses is crucial to the maintenance and protection of marshlands, it is a tedious and time-consuming task that involves careful examination of individual pixels within high-resolution images. In this study we compare a Vision Transformer (ViT) and two different EfficientNet models on automated marsh grass identification using the GTMNERR Marsh Grass Species data set. Our results show that the ViT increased marsh grass identification accuracy. The Vision Transformer also better distinguished between the six classes in the data set and provided training time competitive with the smaller of the two EfficientNet models tested in this study.
License

Copyright 2023 Conrad Testagrose, Mehlam Shabbir, Braden Weaver, Xudong Liu

![Creative Commons License](http://i.creativecommons.org/l/by-nc/4.0/88x31.png)

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.