21-06-2012, 04:30 PM
Analysis of Pixel Level Multi Sensor Medical Image Fusion
Analysis of Pixel Level Multi Sensor.pptx (Size: 1.58 MB / Downloads: 41)
Introduction
Image fusion is a technique for integrating a high-resolution panchromatic image with a low-resolution multispectral image to produce a high-resolution multispectral image. The fused result combines the fine spatial detail of the panchromatic image with the color (spectral) information of the multispectral image.
Image fusion can provide the following functions:
1. Sharpen images;
2. Improve geometric corrections;
3. Provide stereo-viewing capabilities for stereophotogrammetry;
4. Enhance features that are not visible in any single image alone;
5. Complement data sets for improved classification;
6. Detect changes using multitemporal data;
7. Substitute missing information in one image (e.g. cloud cover in VIR images, shadows in SAR images) with signals from another sensor image;
8. Replace defective data.
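Before the wavelet-based methods below, it may help to see what "pixel-level" fusion means in the simplest case: combining two registered images by a per-pixel rule. The averaging and maximum-selection rules shown here are common textbook examples, not the specific methods evaluated in this work; this is a minimal NumPy sketch on toy data.

```python
import numpy as np

def fuse_average(img_a, img_b):
    """Pixel-level fusion by simple averaging of two registered images."""
    return (img_a.astype(np.float64) + img_b.astype(np.float64)) / 2.0

def fuse_max(img_a, img_b):
    """Pixel-level fusion by taking the larger value at each pixel,
    a common rule for retaining the most salient detail."""
    return np.maximum(img_a, img_b)

# Two toy 2x2 arrays standing in for registered sensor images.
a = np.array([[10, 20], [30, 40]], dtype=np.float64)
b = np.array([[40, 10], [20, 50]], dtype=np.float64)

avg = fuse_average(a, b)  # element-wise mean of the two inputs
mx = fuse_max(a, b)       # element-wise maximum of the two inputs
```

Averaging suppresses noise but blurs features present in only one image; maximum selection keeps strong features but can amplify noise, which is one motivation for the transform-domain (wavelet) methods discussed next.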
Wavelet-based image fusion method
Additive wavelet-based image fusion method
The whole process can be divided into four steps:
1. Assuming that the panchromatic and multispectral images have been registered, apply histogram matching between the panchromatic image and each band of the multispectral image, obtaining three new panchromatic images (PANR, PANG, and PANB).
2. Use the wavelet transform to decompose each new panchromatic image and the corresponding band of the multispectral image twice, respectively.
3. Add the detail (wavelet) coefficients of each panchromatic decomposition to the corresponding coefficients of the matching multispectral band.
4. Apply the inverse wavelet transform to each band to obtain the fused high-resolution multispectral image.
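The pipeline can be sketched end to end for a single band. This is a simplified illustration, not the exact method evaluated here: it uses a rank-based histogram match, a single level of a decimated Haar transform (the text decomposes twice, and undecimated transforms are also discussed), and the function names are my own.

```python
import numpy as np

def match_histogram(source, reference):
    """Remap `source` so its value distribution matches `reference`
    (rank-based matching; both arrays must have the same pixel count)."""
    src_order = np.argsort(source.ravel())
    ref_sorted = np.sort(reference.ravel())
    matched = np.empty(source.size, dtype=np.float64)
    matched[src_order] = ref_sorted  # pixel of rank i gets i-th smallest ref value
    return matched.reshape(source.shape)

def haar_dwt2(x):
    """One-level 2-D Haar transform: returns (approximation, (LH, HL, HH))."""
    a = (x[:, 0::2] + x[:, 1::2]) / 2.0   # row averages
    d = (x[:, 0::2] - x[:, 1::2]) / 2.0   # row differences
    ll = (a[0::2, :] + a[1::2, :]) / 2.0  # approximation subband
    lh = (a[0::2, :] - a[1::2, :]) / 2.0
    hl = (d[0::2, :] + d[1::2, :]) / 2.0
    hh = (d[0::2, :] - d[1::2, :]) / 2.0
    return ll, (lh, hl, hh)

def haar_idwt2(ll, details):
    """Inverse of haar_dwt2."""
    lh, hl, hh = details
    a = np.empty((ll.shape[0] * 2, ll.shape[1]))
    a[0::2, :] = ll + lh
    a[1::2, :] = ll - lh
    d = np.empty_like(a)
    d[0::2, :] = hl + hh
    d[1::2, :] = hl - hh
    x = np.empty((a.shape[0], a.shape[1] * 2))
    x[:, 0::2] = a + d
    x[:, 1::2] = a - d
    return x

def additive_wavelet_fuse(ms_band, pan):
    """Additive wavelet fusion for one band: keep the multispectral
    approximation and inject the panchromatic detail subbands."""
    pan_m = match_histogram(pan, ms_band)            # step 1
    ms_ll, ms_det = haar_dwt2(ms_band)               # step 2 (one level here)
    _, pan_det = haar_dwt2(pan_m)
    fused_det = tuple(m + p for m, p in zip(ms_det, pan_det))  # step 3
    return haar_idwt2(ms_ll, fused_det)              # step 4
```

In practice the same routine is run once per band (R, G, B), and a library such as PyWavelets would replace the hand-rolled Haar transform so that other wavelet families and deeper decompositions can be tried.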
Conclusion
This work compared wavelet-based and wavelet-integrated fusion methods. All methods were tested on two data sets, IKONOS and QuickBird images. The study also examined the factors that can affect the final fusion result: the choice of wavelet (orthogonal, biorthogonal, or non-orthogonal), decimated versus undecimated transforms, and the number of wavelet decomposition levels. The wavelet selected for the decomposition influences the fusion quality, and combining the wavelet transform with PCA or with IHS produces the best results.