Pigment Classification Method of Mural Multi-Spectral Image Based on Multi-Scale Superpixel Segmentation
In pigment classification of mural multi-spectral images, traditional algorithms typically extract the spatial features of the image through a fixed window. As a result, the spatial relationship between different pigments is ignored, and the classification error for pigments in halo areas is large. Furthermore, single-scale feature extraction cannot effectively express the differences between pigment blocks. In this study, a pigment classification method for mural multi-spectral images based on multi-scale superpixel segmentation is proposed. First, the dimensionality of the mural multi-spectral data is reduced using an adaptive band optimization method, which effectively reduces the amount of data required for superpixel segmentation. Second, the pseudo-color image synthesized from the first three bands after band optimization and dimensionality reduction is segmented under a gradient constraint, which yields segmentation results closer to the actual contours and improves the accuracy of pigment classification. Third, the selected sample pixels are mapped into the superpixels to enhance the spatial information and features of the image. Finally, given that a single scale cannot be accurately applied to every pigment block, multi-scale superpixels are used to segment the false-color mural image, segmentation maps at different scales are obtained, mean filtering is performed within each superpixel label region of the segmentation maps, and a support vector machine (SVM) classifier classifies the multi-scale superpixel segmentation images. A fusion decision strategy based on majority voting is then adopted to obtain the final classification result. Experimental results show that the proposed method achieves an overall accuracy of 98.84% and an average accuracy of 97.75% on a simulated mural multi-spectral image dataset. Hence, the proposed method provides more accurate classification results than the control group.
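The multi-scale pipeline described above (superpixel segmentation at several scales, mean filtering within each superpixel, per-scale SVM classification, and majority-vote fusion) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes SLIC superpixels (the paper's gradient-constrained segmentation is not reproduced), a random stand-in for the band-optimized false-color image, and hypothetical training pixels and scale settings.

```python
import numpy as np
from skimage.segmentation import slic   # SLIC as a stand-in superpixel method
from sklearn.svm import SVC

# Hypothetical false-color image (H x W x 3) standing in for the
# band-optimized mural data; random values for illustration only.
rng = np.random.default_rng(0)
img = rng.random((40, 40, 3))

# Hypothetical training samples: a few labeled pixels per pigment class.
train_idx = np.array([[2, 2], [3, 3], [35, 35], [36, 36]])
train_lab = np.array([0, 0, 1, 1])

scales = [50, 100, 200]          # assumed superpixel counts (multi-scale)
per_scale_preds = []

for n_seg in scales:
    # Superpixel segmentation of the false-color image at this scale.
    segments = slic(img, n_segments=n_seg, compactness=10, start_label=0)

    # Mean filtering within each superpixel label region: every pixel is
    # replaced by the mean feature vector of its superpixel.
    feats = np.zeros_like(img)
    for lab in np.unique(segments):
        mask = segments == lab
        feats[mask] = img[mask].mean(axis=0)

    # Train an SVM on the enhanced features at the sample pixels and
    # classify every pixel at this scale.
    X_train = feats[train_idx[:, 0], train_idx[:, 1]]
    clf = SVC(kernel="rbf", gamma="scale").fit(X_train, train_lab)
    per_scale_preds.append(clf.predict(feats.reshape(-1, 3)))

# Fusion decision by majority voting across the scales.
votes = np.stack(per_scale_preds)                     # (n_scales, H*W)
final = np.apply_along_axis(
    lambda v: np.bincount(v).argmax(), 0, votes
).reshape(img.shape[:2])
print(final.shape)
```

Running the sketch yields one label map per scale and a fused `(40, 40)` label map; the adaptive band optimization and gradient-constrained segmentation steps would replace the random image and plain SLIC call in a faithful implementation.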