
A fabric material recognition method based on spatially partitioned attention

To address the low accuracy and slow detection speed of traditional neural networks in fabric material detection, a fabric material recognition algorithm incorporating spatially partitioned attention is proposed. First, videos of various fabrics being blown by the wind are split into frames to obtain fabric images. The data are then preprocessed and temporal information is collected: the Euclidean distance is used to compute the displacement of each pixel between successive frames, and the fabric image is divided into regions accordingly. The processed images are fed into the attention network for feature extraction, where depthwise separable convolution (DSC) replaces ordinary convolution to reduce the number of network parameters and the computational cost while strengthening the network's feature-extraction ability. A spatially partitioned attention module (SPAM) is then introduced after each convolutional layer to enhance salient features, prevent excessive loss of feature-map information, and improve network accuracy. Finally, fabric materials are recognized through a global average pooling layer and a softmax layer. The results show that the proposed algorithm classifies fabric materials quickly and effectively, reaching an accuracy of 93.9% with a detection time of 83.14 ms per image, thus achieving strong real-time performance while maintaining recognition precision.
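The region-division step described above — computing the Euclidean displacement of each pixel between successive frames and splitting the image into wrinkled and flat areas — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the per-pixel displacement field is assumed to come from an external estimator (e.g., optical flow), and the threshold value is illustrative.

```python
import numpy as np

def partition_regions(flow, thresh=1.5):
    """Split an image into wrinkled / flat regions by motion magnitude.

    flow   : (H, W, 2) array of per-pixel (dx, dy) displacements between
             consecutive frames (hypothetical input, e.g. from optical flow).
    thresh : illustrative displacement threshold, not from the paper.
    """
    # Euclidean distance moved by each pixel between the two frames
    mag = np.sqrt(flow[..., 0] ** 2 + flow[..., 1] ** 2)
    wrinkled = mag >= thresh   # large motion -> wrinkled (informative) region
    flat = ~wrinkled           # small motion -> flat region
    return wrinkled, flat
```

Thresholding the motion magnitude is one simple way to realize the wrinkled/flat split; the paper may use a more elaborate rule.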
To achieve high-precision identification of fabric materials, reduce identification time, and improve production efficiency, it is of great significance to develop a system capable of accurately distinguishing between various fabric types. In this paper, we propose a fabric material recognition network that incorporates spatially partitioned attention. We used the pre-trained DenseNet121 network for experimental dataset selection and combined depthwise separable convolution (DSC) with the spatially partitioned attention module (SPAM) to create a network structure that meets the demands of fast recognition and high precision. To obtain the best performance from the network, the dataset was preprocessed by color weakening, data augmentation, and region division. We collected a series of fabric images with temporal sequence information from videos showing fabrics being blown by the wind. The RGB values of critical regions were weighted and recalibrated, and random perturbations, flips, and translations were applied to the images, enhancing clarity in critical regions while suppressing irrelevant ones. The Euclidean distance was used to calculate the displacement of each pixel of the fabric image between successive frames, and the image was divided into wrinkled and flat regions. We obtained 6,000 grayscale images of 224×224 pixels, with 1,000 fabric images per class across six categories.

We constructed the proposed mixed depthwise separable convolutional neural network (MDW-CNN) using Python. First, the fabric videos were segmented into frames to obtain fabric images for data preprocessing. Then, the improved convolutional neural network was used for feature extraction, with ordinary convolution replaced by DSC, which enhanced the network's feature-extraction ability while reducing its parameters and computation. Next, SPAM was introduced after each convolutional layer to enhance salient features, prevent the loss of too much feature-map information, and improve the accuracy of the network. Finally, fabric material recognition was achieved through the global average pooling layer and the softmax layer.

The experiments were completed on an Intel processor using 224 px × 224 px fabric images, and the proposed network was compared with CNN+LSTM, Timesformer, the two-stream network, ViViT, YOLOv5, and YOLOv8. The results show that the proposed MDW-CNN maintains good recognition accuracy while keeping the number of parameters low, achieving a recognition accuracy of 93.9%. Regarding network parameters, the proposed method reduced them by 3.3%, 48.5%, 56.7%, 29.3%, 26.1%, and 12.7% when compared with CNN+LSTM, Timesformer, the two-stream network, ViViT, YOLOv5, and YOLOv8, respectively. Experimental results indicate that the improved network offers faster detection, significantly reduces the number of network parameters, and requires only 83.14 ms to process a single image. Thus, it achieves real-time performance while maintaining high recognition accuracy.
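The parameter saving from replacing ordinary convolution with DSC follows from a simple count: a standard k×k convolution needs c_in·c_out·k² weights, whereas DSC factorizes it into a depthwise k×k convolution (one filter per input channel) plus a 1×1 pointwise convolution. A minimal sketch of the counting argument — the layer sizes are illustrative, not taken from MDW-CNN:

```python
def conv_params(c_in, c_out, k):
    # Weights in a standard k x k convolution (bias terms omitted).
    return c_in * c_out * k * k

def dsc_params(c_in, c_out, k):
    # Depthwise stage: one k x k filter per input channel.
    # Pointwise stage: a 1 x 1 convolution mixing c_in -> c_out channels.
    return c_in * k * k + c_in * c_out

# e.g. a 3 x 3 layer mapping 64 -> 128 channels:
# standard convolution: 73,728 weights; DSC: 8,768 weights
```

The ratio shrinks toward 1/c_out + 1/k², which is why swapping in DSC cuts both the parameter count and the computation, as reported in the comparison above.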
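The attention-then-classify stages can likewise be sketched. The abstract does not give SPAM's internal design, so the gating below is a generic spatial-attention placeholder (channel-wise average and max pooling followed by a sigmoid gate), and the classifier weights are hypothetical; only the global-average-pooling + softmax head is stated in the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def spatial_attention(feat):
    """Generic spatial-attention sketch (stand-in for SPAM).

    feat: (C, H, W) feature map. Each spatial position is re-weighted
    by a gate computed from channel-wise average and max pooling.
    """
    avg = feat.mean(axis=0)      # (H, W)
    mx = feat.max(axis=0)        # (H, W)
    gate = sigmoid(avg + mx)     # per-pixel weight in (0, 1)
    return feat * gate           # broadcast over channels

def classify(feat, weights):
    """Global average pooling + softmax head, as in the abstract.

    weights: (num_classes, C) hypothetical classifier matrix.
    """
    pooled = feat.mean(axis=(1, 2))      # (C,) global average pooling
    logits = weights @ pooled            # (num_classes,)
    e = np.exp(logits - logits.max())    # numerically stable softmax
    return e / e.sum()
```

Placing such a gate after each convolutional layer amplifies the wrinkled-region responses while leaving the feature-map shape unchanged, which is consistent with the abstract's claim that SPAM enhances salient features without discarding feature-map information.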

fabric material identification; spatially partitioned attention module (SPAM); regional division; convolutional neural network; depthwise separable convolution (DSC)

南科良、靳雁霞、王松松、王婷、张晓竺、张壮威


School of Computer Science and Technology, North University of China, Taiyuan 030051, China

Institute of Artificial Intelligence and Computer Vision, North University of China, Taiyuan 030051, China


2024

现代纺织技术 (Advanced Textile Technology)
Zhejiang Sci-Tech University; Zhejiang Textile Engineering Society


Indexed in: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 0.31
ISSN:1009-265X
Year, Volume (Issue): 2024, 32(12)