DEEP LEARNING TECHNIQUES USING REMOTE SENSING DATA FOR ENHANCED DISASTER MANAGEMENT: A REVIEW
Keywords:
Disaster Management, Remote Sensing, Deep Learning, Multimodal Data, GPS, GNSS, SAR, LiDAR, Satellite Imagery
Abstract
Multimodal remote sensing data plays a crucial role in disaster management by providing comprehensive information for effective response and recovery. However, current disaster detection and prediction rely mainly on satellite images and sensors. This study investigates the use of data fusion techniques and deep learning models to process remote sensing data from different modalities in order to enhance disaster response strategies and decision-making. By synthesizing the existing literature and research findings, this study examines the current state-of-the-art approaches, challenges, and opportunities in leveraging diverse remote sensing data types, such as satellite imagery, LiDAR data, GPS, and GNSS data. It provides a comprehensive overview of the utilization of multimodal remote sensing data across the pre-disaster, during-disaster, and post-disaster phases. Natural disasters with catastrophic consequences, such as earthquakes, landslides, and floods, are the focus of this study. By critically evaluating the strengths and limitations of existing methodologies, this study aims to identify gaps in research and propose future directions for advancing the use of deep learning with multimodal remote sensing data.