İnsansız Hava Araçlarında Bulunan Kameralar Kullanılarak Orman Yangınlarının Tespit Edilmesi (Detection of Forest Fires Using Cameras Mounted on Unmanned Aerial Vehicles)
Date: 2023
Author: Aral, Rahmi Arda
Open access

Abstract
Unmanned aerial vehicles (UAVs) are invaluable technologies thanks to their remote control and monitoring capabilities, and operational forces and firefighters use them in wildfire detection missions. Owing to their strong pattern recognition capabilities, convolutional neural networks (CNNs) are among the most prominent deep learning architectures, making them well suited to forest fire recognition from UAV imagery.
Deep convolutional neural networks perform effectively on hardware with high processing capability. While such networks can readily be run for UAVs managed from ground control stations with GPU-supported hardware, lightweight, compact, and efficient networks are required to run on a typical UAV's limited onboard computational resources.
To address these constraints, this thesis presents comprehensive research on forest fire detection from UAV vision data using both deep and lightweight convolutional neural networks.
In this thesis, experiments were carried out with well-known convolutional neural network architectures to identify the most successful approach, and transfer learning was also performed on several further models, following a setup along the lines of the sketch below.
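The following is a minimal transfer-learning sketch, assuming a Keras/TensorFlow workflow, a ResNet50 ImageNet backbone, a 224x224 input size, and binary fire/no-fire labels; these choices are illustrative and not taken from the thesis itself.

import tensorflow as tf
from tensorflow.keras import layers, models

def build_transfer_model(input_shape=(224, 224, 3)):
    # Pretrained ImageNet backbone with its classification head removed.
    # (Input preprocessing is assumed to happen in the data pipeline.)
    backbone = tf.keras.applications.ResNet50(
        include_top=False, weights="imagenet", input_shape=input_shape)
    backbone.trainable = False  # freeze convolutional features for transfer learning

    inputs = layers.Input(shape=input_shape)
    x = backbone(inputs, training=False)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dropout(0.2)(x)
    outputs = layers.Dense(1, activation="sigmoid")(x)  # fire vs. no-fire
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

Only the new classifier head is trained here; unfreezing part of the backbone for fine-tuning is a natural follow-up once the head has converged.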
In addition, the convolutional neural network architectures were modified by adding an attention mechanism to develop models with higher accuracy.
Among the evaluated models, the attention-based model with an EfficientNetB0 backbone emerged as the most successful architecture.
With a test accuracy of 92.02% on the FLAME dataset and 99.76% on the infrared dataset, the addition of an attention layer strongly reinforces the effectiveness of the EfficientNetB0-based model for forest fire detection; a sketch of such an attention-augmented backbone follows.
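Below is an illustrative sketch of adding a channel-attention (squeeze-and-excitation style) block on top of an EfficientNetB0 backbone, again assuming Keras/TensorFlow and binary fire/no-fire labels; the attention design, input resolution, and training configuration used in the thesis may differ.

import tensorflow as tf
from tensorflow.keras import layers, models

def se_attention(feature_map, reduction=16):
    # Squeeze: pool each channel to a single descriptor; excite: learn
    # per-channel weights in (0, 1) and rescale the feature maps.
    channels = feature_map.shape[-1]
    s = layers.GlobalAveragePooling2D()(feature_map)
    s = layers.Dense(channels // reduction, activation="relu")(s)
    s = layers.Dense(channels, activation="sigmoid")(s)
    s = layers.Reshape((1, 1, channels))(s)
    return layers.Multiply()([feature_map, s])

def build_attention_efficientnet(input_shape=(224, 224, 3)):
    backbone = tf.keras.applications.EfficientNetB0(
        include_top=False, weights="imagenet", input_shape=input_shape)
    inputs = layers.Input(shape=input_shape)
    features = backbone(inputs)
    attended = se_attention(features)  # attention-weighted feature maps
    x = layers.GlobalAveragePooling2D()(attended)
    outputs = layers.Dense(1, activation="sigmoid")(x)  # fire vs. no-fire
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

The attention block adds only a small number of parameters on top of the backbone, which is consistent with the thesis goal of keeping models light enough for UAV-oriented deployment.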