Three-dimensional (3D) mapping of environments has attracted significant research interest over the last decades because of its importance to environmental modeling and monitoring. Many successful research efforts have been reported in this field, and some have even been turned into commercial products such as Velodyne lidar sensors. However, due to inherent localization challenges, little research on 3D mapping with wearable sensor devices has been successfully reported. In this paper, we are interested in building a smart wearable shoe that integrates multiple laser scanners and an inertial measurement unit (IMU) to build a 3D map of the environment. The proposed Smart Shoe can collect data and build a 3D map in real time during human walking. Such a smart shoe can help visually impaired people navigate and avoid obstacles in their environment. It can also help firefighters quickly model and recognize objects in burning, smoke-filled buildings where cameras may not be useful. The developed IMU-based localization algorithm outputs a smooth and accurate pose and trajectory of the human walking motion. This accurate shoe localization is the key that enables successful 3D mapping while minimizing the registration error of the laser point cloud.
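The pose-driven registration idea in the last sentences can be sketched as follows: each laser scan is transformed from the sensor frame into a common world frame using the IMU-estimated pose, so that successive scans accumulate into one map. This is a minimal illustrative example, not the authors' implementation; the roll/pitch/yaw convention, function names, and use of NumPy are assumptions.

```python
import numpy as np

def pose_to_matrix(position, rpy):
    """Build a 4x4 homogeneous transform from a position (meters) and
    roll/pitch/yaw angles (radians). Assumed rotation order: Z-Y-X."""
    roll, pitch, yaw = rpy
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = position
    return T

def register_scan(scan_points, pose_T):
    """Transform an N x 3 laser scan from the sensor frame into the
    world frame using the estimated shoe pose."""
    homog = np.hstack([scan_points, np.ones((scan_points.shape[0], 1))])
    return (pose_T @ homog.T).T[:, :3]

# Example: a scan taken after the shoe moved 0.5 m forward along x
# and yawed 90 degrees.
pose = pose_to_matrix([0.5, 0.0, 0.0], [0.0, 0.0, np.pi / 2])
scan = np.array([[1.0, 0.0, 0.0]])   # one point 1 m ahead of the sensor
world = register_scan(scan, pose)
print(world)  # the point lands near [0.5, 1.0, 0.0] in the world frame
```

Registering every scan through the same pose chain is why a smooth, accurate trajectory directly reduces the accumulated error in the assembled point cloud.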