Fusion of acoustic and deep features for pig cough sound recognition

© 2022 Elsevier B.V.

The recognition of pig cough sounds is a prerequisite for early warning of respiratory diseases in pig houses, and is essential for monitoring animal welfare and predicting productivity. For pig cough recognition, constructing representative sound features is a crucial step. To this end, this paper proposed a feature fusion method that combines acoustic and deep features extracted from audio segments. First, a set of acoustic features from different domains was extracted from the sound signals, and recursive feature elimination based on random forest (RF-RFE) was adopted for feature selection. Second, time-frequency representations (TFRs), namely the constant-Q transform (CQT) and the short-time Fourier transform (STFT), were used to extract visual features from a fine-tuned convolutional neural network (CNN) model. Finally, the two kinds of features were combined by early fusion and fed into a support vector machine (SVM) to identify pig cough sounds. The proposed fusion of acoustic and deep features achieved 97.35% accuracy for pig cough recognition. The results provide further evidence that combining acoustic and deep spectrum features yields a robust feature representation for pig cough recognition.
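The fusion pipeline described in the abstract (RF-RFE selection on the acoustic stream, then early fusion with CNN-derived features, then an SVM) can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' implementation: the feature dimensions (40 acoustic, 128 deep), the number of selected features (20), and the random data are all assumptions standing in for real audio features and a fine-tuned CNN embedding of CQT/STFT spectrograms.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-ins for the two feature streams (hypothetical dimensions):
# real acoustic features would come from the audio segments, and the
# deep features from a fine-tuned CNN applied to CQT/STFT spectrograms.
n = 200
acoustic = rng.normal(size=(n, 40))    # hand-crafted acoustic features
deep = rng.normal(size=(n, 128))       # CNN embedding of the TFR
y = rng.integers(0, 2, size=n)         # 1 = cough, 0 = other sound

# Step 1: RF-RFE — recursive feature elimination driven by a
# random-forest estimator, applied to the acoustic stream.
selector = RFE(
    RandomForestClassifier(n_estimators=50, random_state=0),
    n_features_to_select=20,
)
acoustic_sel = selector.fit_transform(acoustic, y)

# Step 2: early fusion — concatenate the selected acoustic features
# with the deep features into a single representation.
fused = np.hstack([acoustic_sel, deep])  # shape: (200, 148)

# Step 3: train an SVM on the fused representation.
Xtr, Xte, ytr, yte = train_test_split(fused, y, random_state=0)
clf = SVC(kernel="rbf").fit(Xtr, ytr)
pred = clf.predict(Xte)
```

On real data, the reported 97.35% accuracy would be measured at the `clf.predict` step; with random features, as here, the score is of course meaningless.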

Keywords: Convolutional neural networks; Feature fusion; Pig cough; Time-frequency representations

Shen W., Ji N., Yin Y., Dai B., Sun B., Tu D., Hou H., Kou S., Zhao Y.


School of Electrical Engineering and Information, Northeast Agricultural University

Tus College of Digit, Guangxi University of Science and Technology

School of Computer Science, Harbin Finance University

Department of Computer Science, Donald Bren School of Information and Computer Sciences, University of California, Irvine


2022

Computers and Electronics in Agriculture


Indexed in: EI, SCI
ISSN:0168-1699
Year, volume: 2022, Vol. 197