Partial Occlusion Facial Recognition Based on SIFT Algorithm in Complex Background
Weng Cunfu 1, Zhu Xishun 2
Author information
- 1. College of Science and Technology, Jiangxi Normal University, Jiujiang 332020, Jiangxi, China
- 2. School of Advanced Manufacturing, Nanchang University, Nanchang 330000, Jiangxi, China
Abstract
Due to the large numbers of noise points, smooth points, and distortion points in complex background environments, which degrade the effectiveness of facial recognition, a partial occlusion facial recognition method based on the SIFT algorithm is proposed. Considering the influence of noise, we design a scale-invariant Gaussian function to describe the noise interference in the original image. We then search for the pixel with the highest smoothness and take the local pixel mean as a reference to impose smoothing constraints on that pixel and its surrounding area. Next, we construct a subset of facial features including the eyes, nose, mouth, and ears, compare it with a standard image, and use the expectation-maximization algorithm to extract the facial features with the highest degree of overlap as the reference for face recognition. We measure the similarity between a face image and the reference value by the inner product of the two feature vectors, and compute the facial feature weights with and without partial occlusion. Finally, we construct a confidence interval and introduce the facial feature weights into it for confidence comparison, thus completing face recognition. Experimental results show that the proposed method achieves high recognition accuracy and maintains it under gray-level changes, noise, and partial occlusion.
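To make the pipeline described above concrete, the following is a minimal sketch (not the authors' implementation) of its matching core using OpenCV: Gaussian smoothing to suppress background noise, SIFT feature extraction, and an inner-product similarity score against a reference image. The function name, the smoothing scale `sigma`, and the 0.75 ratio-test threshold are illustrative assumptions; the paper's EM-based feature-subset selection and confidence-interval comparison are not reproduced here.

```python
import cv2
import numpy as np

def sift_similarity(probe_path: str, reference_path: str, sigma: float = 1.6) -> float:
    """Return an inner-product similarity score between two face images."""
    probe = cv2.imread(probe_path, cv2.IMREAD_GRAYSCALE)
    ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)

    # Smoothing constraint: a Gaussian blur stands in for the paper's
    # scale-invariant Gaussian noise-suppression step.
    probe = cv2.GaussianBlur(probe, (0, 0), sigma)
    ref = cv2.GaussianBlur(ref, (0, 0), sigma)

    # SIFT keypoints and 128-dimensional descriptors.
    sift = cv2.SIFT_create()
    _, desc_p = sift.detectAndCompute(probe, None)
    _, desc_r = sift.detectAndCompute(ref, None)
    if desc_p is None or desc_r is None:
        return 0.0

    # Match descriptors, keep pairs passing Lowe's ratio test, and score
    # each surviving pair by the normalized inner product of its descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    scores = []
    for pair in matcher.knnMatch(desc_p, desc_r, k=2):
        if len(pair) < 2:
            continue
        m, n = pair
        if m.distance < 0.75 * n.distance:
            a = desc_p[m.queryIdx]
            b = desc_r[m.trainIdx]
            denom = np.linalg.norm(a) * np.linalg.norm(b) + 1e-8
            scores.append(float(a @ b) / denom)
    return float(np.mean(scores)) if scores else 0.0
```

In such a sketch, a probe image would be accepted when `sift_similarity` exceeds a chosen threshold; the paper instead weights features by their presence under occlusion and compares those weights against a confidence interval, which makes the decision robust when parts of the face are covered.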
Key words
Complex background / Gaussian function / Smoothing constraint / Facial feature weight
Funding
Science and Technology Research Project of the Jiangxi Provincial Department of Education (2021) (GJJ217906)
Publication year
2024