Enhancing maritime situational awareness through multimodal fusion: insights from a real-world experiment
Wohlleben, K., Hubner, M., Markchom, T.
Abstract

Effective maritime border surveillance is crucial for addressing challenges such as irregular migration, smuggling, oil spills, and the need for rapid search and rescue. Various sensing technologies, including AIS, SAR, optical and infrared sensors, and UAV-mounted sensors, clearly enhance maritime awareness; however, integrating their diverse outputs remains complex. Feature-level multimodal sensor fusion is a well-established methodology for robust detection and behavior analysis, but most research relies on simulations or isolated sensors, which limits practical insights. This study presents a controlled real-world experiment combining synchronized data from coastal ground sensors and UAV-mounted visual and infrared sensors. The recorded dataset enables the evaluation of feature-level fusion under authentic conditions. We extend existing fusion frameworks with additional modules and assess them using operational metrics. This study contributes to understanding the efficacy of multimodal fusion in complex maritime environments, while also highlighting the significant challenges involved in transitioning from simulations to controlled real-world sensor data.
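As a rough illustration of the feature-level fusion the abstract refers to, the sketch below normalizes per-modality feature vectors and concatenates them into a single joint representation. This is a generic sketch, not the authors' pipeline; the modality names, feature values, and the `fuse_features` helper are all hypothetical.

```python
import numpy as np

def fuse_features(modality_features):
    """Feature-level fusion: L2-normalize each modality's feature
    vector, then concatenate into one joint feature vector."""
    fused = []
    for feats in modality_features:
        v = np.asarray(feats, dtype=float)
        norm = np.linalg.norm(v)
        fused.append(v / norm if norm > 0 else v)
    return np.concatenate(fused)

# Hypothetical per-modality features for one detected vessel.
visual_feats = [0.8, 0.1, 0.3]   # e.g. appearance descriptors (UAV camera)
infrared_feats = [0.5, 0.9]      # e.g. thermal signature
ais_feats = [12.5, 180.0]        # e.g. reported speed (kn), course (deg)

joint = fuse_features([visual_feats, infrared_feats, ais_feats])
print(joint.shape)  # (7,)
```

The joint vector would then feed a downstream detector or behavior classifier; per-modality normalization keeps one sensor's scale (e.g. AIS speed in knots) from dominating the fused representation.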