A Lightweight Network Based on Inverted Residual Structure for Fingerprint Minutiae Extraction

To address erroneous and insufficient extraction of minutiae information in fingerprint matching, this paper proposes an improved end-to-end lightweight multiscale deep learning model, the inverted residual network for fingerprint minutiae extraction (IRFingerNet). The model uses an improved residual structure to build an easily optimized lightweight network that reduces information loss as network depth grows; it fuses multiple fingerprint features, such as ridges and minutiae, into joint features to enhance semantic information and improve the perception of minutiae; and it applies a channel attention mechanism to recalibrate the joint features, increasing the weights of effective features and decreasing those of invalid ones. Experimental results on the NIST 4, FVC 2002, and FVC 2004 databases show that IRFingerNet extracts fingerprint minutiae more effectively in practical applications, with higher precision and recall: the overall F1 score reaches 0.87, an 11% improvement over traditional extraction methods, at a detection speed of 0.23 s per fingerprint image.
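The abstract names three building blocks: an inverted residual structure, fusion of multiple fingerprint features into joint features, and a channel attention mechanism that recalibrates those features. The sketch below illustrates the first and last of these, assuming a MobileNetV2-style inverted residual block combined with an SE-style (squeeze-and-excitation) channel attention module in PyTorch; the class names, expansion factor, and reduction ratio are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """SE-style channel attention: reweights feature channels by global context."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # amplify effective channels, suppress weak ones


class InvertedResidual(nn.Module):
    """MobileNetV2-style inverted residual: 1x1 expand -> 3x3 depthwise -> 1x1 project."""

    def __init__(self, in_ch, out_ch, stride=1, expand=6):
        super().__init__()
        hidden = in_ch * expand
        self.use_skip = stride == 1 and in_ch == out_ch
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, hidden, 1, bias=False),  # 1x1 expansion
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, hidden, 3, stride, 1,
                      groups=hidden, bias=False),      # 3x3 depthwise convolution
            nn.BatchNorm2d(hidden),
            nn.ReLU6(inplace=True),
            nn.Conv2d(hidden, out_ch, 1, bias=False),  # 1x1 linear projection
            nn.BatchNorm2d(out_ch),
        )
        self.attn = ChannelAttention(out_ch)

    def forward(self, x):
        out = self.attn(self.block(x))
        return x + out if self.use_skip else out


if __name__ == "__main__":
    # One block applied to a hypothetical 32-channel fingerprint feature map.
    x = torch.randn(1, 32, 128, 128)
    y = InvertedResidual(32, 32)(x)
    print(y.shape)  # torch.Size([1, 32, 128, 128])
```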

fingerprint feature extraction; inverted residual fingerprint network model (IRFingerNet); deep learning; attention mechanism; feature fusion

侯雪峰、苏毅婧、李俊、徐敏


School of Electrical Engineering and Automation, Xiamen University of Technology, Xiamen 361024, Fujian, China

Quanzhou Institute of Equipment Manufacturing, Haixi Institutes, Chinese Academy of Sciences, Quanzhou 362000, Fujian, China


National Natural Science Foundation of China (61906178); Natural Science Foundation of Fujian Province (2020J05094)

2024

Journal of Xiamen University of Technology
Xiamen University of Technology

Impact factor: 0.196
ISSN: 1673-4432
Year, Volume (Issue): 2024, 32(1)