Data fusion, the integration of data from multiple sources, is increasingly valuable across many fields because it enhances information quality, accuracy, and reliability. By merging diverse datasets, it enables a more comprehensive understanding of complex phenomena and provides insights that are otherwise unattainable. In remote sensing, where precise data acquisition is critical, fusion techniques have become indispensable, benefiting applications such as object detection, classification, and change detection. While published studies have placed much emphasis on spatial sharpening techniques, there remains a notable gap in establishing robust workflows for both lab-based and UAS-based remote sensing data fusion, particularly in the visible and near-infrared (VNIR) and short-wave infrared (SWIR) regions. This study investigates VNIR-SWIR fusion using data acquired from a medieval manuscript in a controlled laboratory environment and from UAS-based sensors in a real-world setting, addressing differences in system parameters and processing workflows. Despite challenges such as image registration issues, our analysis yielded promising results, underscoring the importance of ongoing refinement of fusion methodologies to ensure comprehensive data interpretation and analysis across diverse datasets and environments.
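To make the VNIR-SWIR fusion concept concrete, the following is a minimal, hypothetical sketch, not the workflow used in this study: synthetic arrays stand in for real imagery, and scikit-image's resize is used as a placeholder for a proper co-registration and sharpening step before the bands are stacked into one cube.

import numpy as np
from skimage.transform import resize

def fuse_vnir_swir(vnir: np.ndarray, swir: np.ndarray) -> np.ndarray:
    """Resample SWIR bands onto the VNIR spatial grid and stack along the band axis.

    vnir: (rows, cols, n_vnir_bands); swir: (rows_s, cols_s, n_swir_bands),
    where the SWIR grid is typically coarser than the VNIR grid.
    """
    rows, cols = vnir.shape[:2]
    # Bilinear resampling of the SWIR cube to the VNIR grid (placeholder for a
    # full registration / sharpening step).
    swir_up = resize(swir, (rows, cols, swir.shape[2]),
                     order=1, preserve_range=True, anti_aliasing=False)
    return np.concatenate([vnir, swir_up.astype(vnir.dtype)], axis=2)

if __name__ == "__main__":
    vnir = np.random.rand(100, 100, 3).astype(np.float32)  # synthetic VNIR cube
    swir = np.random.rand(50, 50, 2).astype(np.float32)    # synthetic, coarser SWIR cube
    fused = fuse_vnir_swir(vnir, swir)
    print(fused.shape)  # (100, 100, 5)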