Design of an Interactive Automation System for a Virtual Reality Translation Robot Based on Multi-Sensor Data
To further improve the translation accuracy of English translation robots, an interactive automation system for a virtual reality translation robot based on multi-sensor data is proposed. The system first uses a Kinect sensor, an IMU inertial sensor, and an EMG electromyography sensor to collect information on the user's translation interaction behavior. The KNN-EMAR algorithm then performs sentiment analysis on this data, and the resulting sentiment labels are fed, together with the source-language text to be translated, into the Universal MMT model to produce the translation. Finally, virtual reality technology is used to create segmented animations that preserve the user's interaction behavior and to output the translation results through animation and speech. Experiments show that with 700 training samples, KNN-EMAR achieves an accuracy of 99.23% in user sentiment analysis. After the sentiment analysis results output by KNN-EMAR are added to the input of the Universal MMT model, the BLEU and METEOR scores of the translations improve significantly, indicating good translation performance that basically meets the design requirements. The system runs normally on Oculus virtual reality devices, outputting interactive animations and translated speech. This system therefore has practical significance for research on English translation robots.
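The abstract does not specify the internals of the KNN-EMAR algorithm, but its core is k-nearest-neighbor classification of fused sensor features into sentiment labels. The sketch below is a minimal, hypothetical illustration of that idea: the feature vectors, labels, and `knn_predict` helper are invented for illustration and are not taken from the paper.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training samples under Euclidean distance (plain KNN; the
    paper's KNN-EMAR variant may add further steps)."""
    dists = sorted(
        (math.dist(features, query), label) for features, label in train
    )
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy feature vectors standing in for fused multi-sensor features
# (e.g., normalized EMG amplitude and IMU motion energy) -- purely
# illustrative values, not measurements from the paper.
train = [
    ((0.9, 0.8), "excited"),
    ((0.8, 0.9), "excited"),
    ((0.1, 0.2), "calm"),
    ((0.2, 0.1), "calm"),
]
print(knn_predict(train, (0.85, 0.75)))  # → excited
```

In the described pipeline, the predicted sentiment label would then be concatenated with the source-language text and passed to the translation model.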