
Growing Science » Tags cloud » Forest fire detection

1.

Employing CNN mobileNetV2 and ensemble models in classifying drones forest fire detection images (Pages 297-316)

Authors: Dima Suleiman, Ruba Obiedat, Rizik Al-Sayyed, Shadi Saleh, Wolfram Hardt, Yazan Al-Zain

DOI: 10.5267/j.ijdns.2024.10.004

Keywords: Forest fire detection, Drone imagery, MobileNetV2, Ensemble learning, DeepFire dataset, Transfer learning

Abstract:
In recent years, the adoption of advanced machine learning techniques has revolutionized approaches to solving complex problems such as identifying forest fires. Among these techniques, Convolutional Neural Networks (CNNs) combined with ensemble methods are particularly promising. To investigate the feasibility of detecting fires in video streams from Unmanned Aerial Vehicles (UAVs), the lightweight CNN architecture MobileNetV2 was used for real-time detection. Several experiments were conducted on the DeepFire dataset, which comprises an equal number of images with and without fire, to evaluate MobileNetV2's performance. Notably, the architecture's linear bottlenecks and efficient use of inverted residuals ensure high accuracy without compromising feature extraction. For a comprehensive assessment, MobileNetV2 was benchmarked against other models, including DenseNet121, EfficientNetV2S, and VGG16. Accuracy was further enhanced by combining predictions across models, either by majority voting or by summing their scores. As documented in the literature, MobileNetV2 consistently outperforms other architectures in computational efficiency and offers an excellent balance between efficiency and the quality of features learned over multiple epochs. This study underscores the suitability of MobileNetV2 for real-time applications on drones, particularly for detecting forest fires in resource-constrained environments. The results show that MobileNetV2 achieves the highest accuracy (0.994), sensitivity (0.994), and specificity (0.998) among the tested models, with low standard deviations across all metrics. In contrast, EfficientNetV2S exhibited the lowest accuracy and sensitivity, both at 0.779, with a specificity of 0.829. The ensemble (Sum) method achieved an average accuracy of 0.989, sensitivity of 0.989, and specificity of approximately 0.988. MobileNetV2 therefore not only delivers the highest accuracy and stability; the results also show that the choice of ensemble method significantly affects performance.
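The two combination schemes mentioned in the abstract, summing model scores versus majority voting, can be sketched as follows. This is an illustrative interpretation using stdlib Python, not the authors' code; the function names, the 0.5 threshold, and the probability values are all made up for the example:

```python
# Each model emits a probability that the drone image shows fire.
# The "Sum" ensemble averages the probabilities before thresholding;
# voting thresholds each model first and takes the majority decision.

def ensemble_sum(probs, threshold=0.5):
    """Average per-model fire probabilities, then threshold once."""
    avg = sum(probs) / len(probs)
    return ("fire" if avg >= threshold else "no_fire"), avg

def ensemble_vote(probs, threshold=0.5):
    """Threshold each model's probability, then take a majority vote."""
    votes = sum(1 for p in probs if p >= threshold)
    return "fire" if votes > len(probs) / 2 else "no_fire"

# Three hypothetical model outputs for one image:
probs = [0.92, 0.40, 0.75]
label_sum, avg = ensemble_sum(probs)   # avg = 0.69 -> "fire"
label_vote = ensemble_vote(probs)      # 2 of 3 votes -> "fire"
```

The two schemes can disagree: a single very confident model can pull the averaged score over the threshold even when it is outvoted, which is one way the choice of ensemble method affects the results.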

Journal: IJDS | Year: 2025 | Volume: 9 | Issue: 2 | Views: 421 | Reviews: 0

 
2.

Nature inspired firefighter assistant by unmanned aerial vehicle (UAV) data (Pages 143-166)

Authors: Seyed Muhammad Hossein Mousavi, Atiye Ilanloo

DOI: 10.5267/j.jfs.2023.1.004

Keywords: Unmanned Aerial Vehicle (UAV), Forest Fire Detection, Nature Inspired Image Processing, Image Segmentation, Classification and regression tree

Abstract:
Wildfire, or bush fire, is one of the most hazardous phenomena in forests, and early detection is vital to preventing massive damage. Employing Unmanned Aerial Vehicles (UAVs) as visual and extinguishing tools against this tragedy, which has fatal effects on humans and wildlife, is highly important. In addition, aerial imagery can help firefighters gauge fire intensity and localize and route the fire in the forest, which reduces firefighter casualties. All of these benefits and more are possible with inexpensive UAVs. The proposed research uses nature-inspired image processing techniques to segment and classify fire in color and thermal images. Multiple nature-inspired and traditional computer vision techniques are employed to achieve high classification and segmentation accuracy, including Chicken Swarm Algorithm (CSA) intensity adjustment (contrast enhancement), Denoising Convolutional Neural Network (DnCNN), Local Phase Quantization (LPQ) feature extraction, Bees Image Segmentation, Biogeography-Based Optimization (BBO) feature selection, and Firefly Algorithm (FA) classification. The system is evaluated with nine performance metrics for the segmentation stage, including F-Score, Accuracy, and Jaccard, and four performance metrics for the classification stage. All experiments are conducted on the two most recent UAV fire datasets, FLAME (2021) and DeepFire (2022). Additionally, fire intensity, fire direction, and fire geometry are calculated, which assists firefighters even further. Because smoke reveals the location of a fire, a smoke detection workflow is also proposed. Compared with traditional and novel segmentation and classification methods, the proposed system yields satisfactory and promising results on almost all metrics. The trained model could be used in most current rescue UAVs for real-time applications.
For the FLAME dataset (color data), segmentation precision is 95.57% and classification accuracy is 91.33%. For the DeepFire dataset, segmentation precision is 91.74% and classification accuracy is 96.88%.
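Three of the segmentation metrics named in the abstract (Accuracy, F-Score, Jaccard) reduce to pixel counts over the predicted and ground-truth fire masks. The following is a minimal stdlib sketch of those standard definitions, not the paper's implementation; it assumes the masks are flattened 0/1 lists of equal length:

```python
# Per-pixel confusion counts between a predicted fire mask and the
# ground-truth mask (1 = fire pixel, 0 = background).

def _counts(pred, truth):
    tp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 1)
    fp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 0)
    fn = sum(1 for p, t in zip(pred, truth) if p == 0 and t == 1)
    tn = sum(1 for p, t in zip(pred, truth) if p == 0 and t == 0)
    return tp, fp, fn, tn

def accuracy(pred, truth):
    """Fraction of pixels labeled correctly."""
    tp, fp, fn, tn = _counts(pred, truth)
    return (tp + tn) / len(pred)

def f_score(pred, truth):
    """Harmonic mean of precision and recall on the fire class."""
    tp, fp, fn, _ = _counts(pred, truth)
    return 2 * tp / (2 * tp + fp + fn)

def jaccard(pred, truth):
    """Intersection over union of predicted and true fire regions."""
    tp, fp, fn, _ = _counts(pred, truth)
    return tp / (tp + fp + fn)

# Toy 2x2 masks, flattened row by row:
pred  = [1, 1, 0, 0]
truth = [1, 0, 1, 0]
# accuracy 0.5, f_score 0.5, jaccard 1/3
```

Note that Accuracy counts background pixels while F-Score and Jaccard do not, which is why the latter two are the more informative metrics when fire occupies only a small part of the frame.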

Journal: JFS | Year: 2023 | Volume: 3 | Issue: 3 | Views: 998 | Reviews: 0

 

© 2010-2026 GrowingScience.Com