Development of a SLAM Experimental Teaching Platform Based on Lidar/Depth Camera Fusion
This paper presents an experimental teaching platform for simultaneous localization and mapping (SLAM) based on the fusion of lidar and depth camera data. The platform consists of a Nano computer and an STM32 motion controller, and is equipped with an Orbbec Astra S depth camera, a lidar, a gyroscope, an odometer, and other sensors. The platform collects environmental information through these sensors, solves the robot kinematics, controls the robot's motion, and performs lidar/depth camera fusion, SLAM, and data visualization. The control system is developed on the Robot Operating System (ROS), which offers high portability and supports secondary development. The platform can effectively and quickly construct 2D/3D maps of a scene, which helps students strengthen their practical skills, deepens their understanding of the relevant algorithms, and achieves good teaching outcomes.
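As an illustration of the ROS-based fusion pipeline described above, the following is a minimal sketch of a node that time-synchronizes the lidar and depth camera streams before any fusion or SLAM processing. The topic names (/scan, /camera/depth/points), node name, and placeholder fusion step are assumptions for illustration only; the abstract does not specify the platform's actual topics or fusion method.

```python
#!/usr/bin/env python
# Minimal sketch of a lidar/depth-camera fusion front end under ROS 1.
# Topic names and the fusion step are illustrative assumptions.
import rospy
import message_filters
from sensor_msgs.msg import LaserScan, PointCloud2


def fused_callback(scan, cloud):
    # Placeholder fusion step: a real platform would transform the
    # lidar scan and depth-camera point cloud into a common frame,
    # merge them, and pass the result to the SLAM back end.
    rospy.loginfo("lidar ranges: %d, depth cloud: %d bytes",
                  len(scan.ranges), len(cloud.data))


if __name__ == "__main__":
    rospy.init_node("lidar_depth_fusion")
    scan_sub = message_filters.Subscriber("/scan", LaserScan)
    cloud_sub = message_filters.Subscriber("/camera/depth/points", PointCloud2)
    # Approximately synchronize the two sensor streams in time.
    sync = message_filters.ApproximateTimeSynchronizer(
        [scan_sub, cloud_sub], queue_size=10, slop=0.1)
    sync.registerCallback(fused_callback)
    rospy.spin()
```

In this sketch, ApproximateTimeSynchronizer pairs messages whose timestamps fall within the given slop, which is a common way to align sensors that publish at different rates before fusing them.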