Researchers at University of Aizu Zero in on Machine Learning (Human Activity Recognition via Wi-Fi and Inertial Sensors With Machine Learning)
Research findings on artificial intelligence are discussed in a new report. According to news reporting out of the University of Aizu by NewsRx editors, the researchers stated, "Human activity recognition (HAR) plays a crucial role in human-computer interaction, smart homes, health monitoring and elderly care. However, existing methods typically utilize cameras, radio frequency (RF) signals or wearable devices for activity recognition."

Funders for this research include the Japan Society for the Promotion of Science (JSPS) KAKENHI, the JKA Foundation, and the New Energy and Industrial Technology Development Organization (NEDO) Intensive Support for Young Promising Researchers.

The news correspondents obtained a quote from the research from the University of Aizu: "Each single-sensor modality has its inherent limitations, such as camera-based methods having blind spots, Wi-Fi-based methods depending on the environment, and the inconvenience of wearing Inertial Measurement Unit (IMU) devices. In this paper, we propose a HAR system that leverages three types of sensor combinations: Wi-Fi, IMU and a hybrid of Wi-Fi+IMU. We utilize the Channel State Information (CSI) provided by Wi-Fi and the accelerometer and gyroscope data from IMU devices to capture activity characteristics. Then, we employ six machine learning algorithms to recognize eight types of daily activities. These algorithms include Support Vector Machine (SVM), Multi-layer Perceptron (MLP), Decision Tree, Random Forest, Logistic Regression and k-Nearest Neighbors (kNN). Additionally, we investigate the accuracy of hand gesture recognition using different sensor combinations and analyze the calculation speed of each combination. We conduct a survey to collect user feedback on the performance of various sensor combinations in our HAR system. The results show that the combination of CSI+IMU yields the best HAR recognition accuracy, with an accuracy of 89.38%. The SVM algorithm consistently performs well across all systems, especially excelling in the CSI+IMU system supported by energy and average Fast Fourier Transform (FFT) features."
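To make the described pipeline more concrete, the sketch below shows one plausible way to fuse windowed Wi-Fi CSI and IMU data using per-channel energy and average FFT-magnitude features, then compare the six classifiers named in the quote. It uses synthetic stand-in data and scikit-learn defaults; the window length, feature definitions, and hyperparameters are assumptions for illustration, not the authors' actual implementation.

```python
"""Minimal, hypothetical sketch of a CSI+IMU HAR pipeline (not the authors' code)."""
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic stand-ins: 800 windows of 128 samples each.
# csi: CSI amplitude over 30 subcarriers; imu: 3-axis accelerometer + 3-axis gyroscope.
n_windows, win_len = 800, 128
csi = rng.normal(size=(n_windows, win_len, 30))
imu = rng.normal(size=(n_windows, win_len, 6))
labels = rng.integers(0, 8, size=n_windows)  # eight daily activities

def window_features(window):
    """Per-channel signal energy and average FFT magnitude for one (time, channels) window."""
    energy = np.sum(window ** 2, axis=0)
    avg_fft = np.mean(np.abs(np.fft.rfft(window, axis=0)), axis=0)
    return np.concatenate([energy, avg_fft])

def extract(csi_win, imu_win):
    """Concatenate CSI and IMU features to form the hybrid CSI+IMU representation."""
    return np.concatenate([window_features(csi_win), window_features(imu_win)])

X = np.array([extract(c, i) for c, i in zip(csi, imu)])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.2, random_state=0, stratify=labels)

# The six classifiers named in the article, with illustrative settings.
classifiers = {
    "SVM": SVC(kernel="rbf"),
    "MLP": MLPClassifier(hidden_layer_sizes=(64,), max_iter=500),
    "Decision Tree": DecisionTreeClassifier(),
    "Random Forest": RandomForestClassifier(n_estimators=100),
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "kNN": KNeighborsClassifier(n_neighbors=5),
}

for name, clf in classifiers.items():
    model = make_pipeline(StandardScaler(), clf)  # scale features before fitting
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {acc:.2%}")
```

On real recordings, the same loop would be run separately on the Wi-Fi-only, IMU-only, and CSI+IMU feature sets to reproduce the kind of per-combination accuracy comparison the article reports.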
University of Aizu | Algorithms | Cyborgs | Emerging Technologies | Machine Learning