KEYWORDS: Image fusion, Multispectral imaging, RGB color model, Color imaging, Image quality, Light sources and illumination, Color difference, Signal to noise ratio, Feature extraction
In low-light conditions, color (RGB) images captured by cameras suffer from heavy noise and loss of detail and color. Multispectral images, however, provide spectral information that can be fused by neural network models. In this paper, an improved U-Net model is proposed for multispectral image fusion to achieve color imaging in low-illumination environments. The U-Net model is a symmetric convolutional neural network that extracts and combines features at multiple image scales. Our improved U-Net integrates residual blocks and attention mechanisms, exploiting multilevel feature extraction and contextual information fusion to significantly enhance imaging quality. To meet the demands of low-light conditions, the model design incorporates a multi-scale feature fusion strategy, bolstering robustness against weak light and noise. We conducted experiments at multiple light levels to validate the effectiveness of the model. The quality of the fused color images was evaluated with objective assessment metrics, including peak signal-to-noise ratio (PSNR), the structural similarity index (SSIM), and color difference (ΔE). The experimental results demonstrate that the proposed method can generate color images with high color reproduction and rich detail under low-light conditions. Compared with traditional methods, our approach shows substantial improvements in image clarity, noise suppression, and color fidelity, indicating significant practical value. In summary, this study combines deep learning with multispectral image fusion to propose an effective method for low-light color imaging, offering new insights and technical solutions for low-light imaging challenges in practical applications.
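Two of the evaluation metrics named in the abstract can be computed directly. The following is a minimal NumPy sketch of PSNR and the CIE76 color difference (ΔE); it is an illustrative implementation of the standard formulas, not the authors' evaluation code, and it assumes the ΔE inputs have already been converted to CIELAB.

```python
import numpy as np

def psnr(ref, test, max_val=255.0):
    """Peak signal-to-noise ratio between two images (higher is better).

    PSNR = 10 * log10(MAX^2 / MSE), where MSE is the mean squared error
    between the reference and test images.
    """
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

def delta_e76(lab_ref, lab_test):
    """Mean CIE76 color difference between two images in CIELAB space.

    CIE76 defines ΔE as the Euclidean distance between (L*, a*, b*)
    triples; here we average it over all pixels.
    """
    diff = lab_ref.astype(np.float64) - lab_test.astype(np.float64)
    return float(np.mean(np.linalg.norm(diff, axis=-1)))
```

SSIM is omitted here because it requires windowed local statistics; in practice it is usually taken from a library such as `skimage.metrics.structural_similarity` rather than reimplemented.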