Adaptive navigation assistance based on eye movement features in virtual reality
Background Navigation assistance is essential for users roaming virtual reality (VR) scenes; however, traditional navigation methods require users to manually request a map for viewing, which lowers immersion and degrades the user experience.

Methods To address this issue, we first collected data from users who required navigation assistance in a VR environment, covering various eye movement features such as gaze fixation, pupil size, and gaze angle. We then used the boosting-based XGBoost algorithm to train a prediction model, and finally applied it to predict whether a user requires navigation assistance during a roaming task.

Results On evaluation, the model reached approximately 95% in accuracy, precision, recall, and F1-score. In addition, by applying the model to a VR scene, we implemented an adaptive navigation assistance system driven by the user's real-time eye movement data.

Conclusions Compared with traditional navigation assistance methods, the proposed adaptive method allows users to roam a VR environment more immersively and effectively.
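The prediction step described in Methods can be sketched as a binary classifier trained on eye-movement features. The following is a minimal illustration, not the authors' implementation: the feature names, value ranges, and labeling rule are all synthetic assumptions, and scikit-learn's GradientBoostingClassifier is used as a stand-in for XGBoost (both are boosted-tree classifiers with a compatible fit/predict API).

```python
# Minimal sketch: train a boosted-tree classifier on synthetic
# eye-movement features and predict whether the user needs
# navigation assistance. All features, distributions, and the
# labeling rule below are illustrative assumptions, not the
# paper's actual data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier  # stand-in for XGBoost
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1000
# Synthetic features: fixation duration (s), pupil size (mm), gaze angle (deg)
X = np.column_stack([
    rng.normal(0.3, 0.1, n),    # gaze fixation duration
    rng.normal(3.5, 0.5, n),    # pupil diameter
    rng.normal(0.0, 15.0, n),   # horizontal gaze angle
])
# Hypothetical label rule: long fixations with dilated pupils -> needs help
y = ((X[:, 0] > 0.35) & (X[:, 1] > 3.5)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = GradientBoostingClassifier(n_estimators=100, max_depth=3)
model.fit(X_tr, y_tr)
acc = accuracy_score(y_te, model.predict(X_te))
print(f"held-out accuracy: {acc:.2f}")
```

In an adaptive system like the one described in Results, `model.predict` would be called on a sliding window of live eye-tracking features, and the navigation map shown only when the predicted class indicates the user needs assistance.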