Tightly-Coupled LiDAR-IMU-Leg Odometry with
Online Learned Leg Kinematics Incorporating Foot Tactile Information

Taku Okawara, Kenji Koide, Aoki Takanose, Shuji Oishi, Masashi Yokozuka, Kentaro Uno, Kazuya Yoshida

Tohoku University, AIST

IEEE Robotics and Automation Letters (RA-L), May 2025

Overview

Legged robots have significant potential for transportation and inspection tasks in challenging environments (e.g., rough terrain) thanks to their superior locomotion capabilities compared to wheeled robots. However, featureless environments (e.g., tunnels, long corridors, alleys, lunar surfaces) are challenging for exteroceptive sensor-based odometry (e.g., LiDAR, camera), while deformable terrains (e.g., sandy beaches and gravel) are challenging for kinematic model-based odometry.

To deal with these challenges, we present an odometry estimation algorithm that fuses LiDAR-IMU constraints and online-trainable leg kinematic constraints incorporating tactile information (the neural leg kinematics model) on a unified factor graph in a tightly coupled way. We propose the neural adaptive leg odometry factor to simultaneously solve odometry estimation and online training of the neural leg kinematics model. Online training enhances the model's adaptability to changes in the robot's weight load and terrain conditions; this enables effective use of foot tactile information (reaction forces) for motion prediction, because foot reaction forces vary with both factors. To balance accuracy and the computational cost of training the network, we divide the model into two parts: one trained online (online learning model) and one trained offline (offline learning model).

Overview

Overview of the proposed method, which simultaneously solves odometry estimation and online training of the neural leg kinematics model on a unified factor graph.

Why is foot tactile information (foot reaction force) incorporated into the neural leg kinematics model?

Foot tactile information (foot reaction force)-based motion prediction works as follows: 1) the acceleration induced by the reaction force is estimated by dividing the force by the robot's mass, and 2) the robot's velocity is estimated by integrating this acceleration [1][2]. Adaptation to the robot's weight load is therefore needed to use foot tactile information effectively for generic motion prediction with the neural leg kinematics model. This adaptation is important for tasks such as delivery and transportation, where the robot's weight load can change mid-application. We therefore train the neural leg kinematics model online to adapt to the robot's weight load.

Principle

Principle of foot tactile information (foot reaction force)-based motion prediction.
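This principle fits in a few lines. Below is a minimal sketch (not the authors' code) of reaction-force-based velocity prediction; the input layout of `reaction_forces` and the gravity handling are illustrative assumptions:

```python
import numpy as np

def predict_velocity(reaction_forces, robot_mass, v0, dt):
    """Reaction-force-based velocity prediction (illustrative sketch).

    reaction_forces: (N, 3) sum of foot reaction forces in the world frame [N]
                     (hypothetical input layout).
    robot_mass:      total mass including any payload [kg].
    v0:              (3,) initial velocity in the world frame [m/s].
    dt:              sampling period [s].
    """
    gravity = np.array([0.0, 0.0, -9.81])
    # 1) acceleration induced by the reaction forces: a = F / m (+ gravity)
    accels = reaction_forces / robot_mass + gravity
    # 2) velocity by integrating the acceleration (forward Euler)
    return v0 + np.cumsum(accels * dt, axis=0)
```

Because the division is by the total mass, a mid-run payload change invalidates the prediction unless the model adapts, which is exactly the motivation for online training.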

Offline training of the neural leg kinematics model

As shown in the overview image, the neural leg kinematics model is divided into the online learning model and the offline learning model. The offline learning model is trained on the 28 datasets shown in the following image. The reference twist for offline batch learning was obtained from LiDAR-IMU odometry using the omnidirectional-FOV LiDAR (Livox MID-360).

offline_learning_datasets

Datasets used in the offline training procedure of the neural leg kinematics model (i.e., to obtain the offline learning model).

In the offline training procedure only, LiDAR-IMU odometry with the omnidirectional-FOV LiDAR (Livox MID-360) is used to obtain the reference twist (the target output of the neural leg kinematics model).

Regarding label data for contact states, we manually thresholded the 1D foot force sensor (Unitree footpad) values to create labels indicating whether each foot is in contact. Note that the contact state labels are needed only for the offline batch learning phase (i.e., they are NOT needed in the online learning phase).
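As a rough illustration, such threshold-based labeling could look like the following; the threshold value is a placeholder, not the one used in the paper:

```python
import numpy as np

def contact_labels(foot_forces, threshold=20.0):
    """Binarize 1D footpad force values into contact labels.

    foot_forces: (N, 4) per-foot force readings (Unitree footpads).
    threshold:   force threshold [N]; placeholder value, not the paper's.
    Returns an (N, 4) boolean array, True where the foot is in contact.
    """
    return np.asarray(foot_forces) > threshold
```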

Online training of the neural leg kinematics model

As shown in the overview image, the online learning model of the neural leg kinematics model is trained online, jointly with odometry estimation, to keep the two consistent. To conduct odometry estimation and online learning jointly on a unified factor graph, we propose the neural adaptive leg odometry factor, a constraint relating the robot poses and the MLP parameters of the online learning model. Note that the offline learning model is kept fixed during odometry estimation (the online learning procedure).
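As a rough sketch of this split, the model could be organized as follows; the layer sizes and input/output dimensions are illustrative, not the paper's actual architecture:

```python
import torch.nn as nn

class NeuralLegKinematics(nn.Module):
    """Sketch of the two-part model: a frozen, offline-trained backbone and a
    small online-trained MLP head (illustrative dimensions)."""

    def __init__(self, in_dim=32, feat_dim=64, out_dim=6):
        super().__init__()
        # Offline learning model: batch-trained beforehand, then frozen.
        self.offline = nn.Sequential(
            nn.Linear(in_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, feat_dim), nn.ReLU(),
        )
        for p in self.offline.parameters():
            p.requires_grad = False  # fixed during odometry estimation
        # Online learning model: small MLP whose parameters are estimated
        # jointly with the robot poses on the factor graph.
        self.online = nn.Sequential(
            nn.Linear(feat_dim, 32), nn.ReLU(),
            nn.Linear(32, out_dim),  # predicted body twist
        )

    def forward(self, x):
        # x: joint states + foot tactile information (reaction forces)
        return self.online(self.offline(x))
```

In this sketch, only the online head's parameters would enter the factor graph as estimated variables, keeping the online problem small.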

The objective function e_ALL is defined as the sum of LiDAR-based (matching cost factor), IMU-based (IMU preintegration factor), and neural leg kinematics model-based motion constraints (neural adaptive leg odometry factor), plus additional constraints (e.g., prior factor), as follows. See the paper for the details of each error term. We minimize e_ALL with the iSAM2 optimizer to perform state estimation.
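In text form, the objective presumably has the following structure (the exact error terms, weights, and indexing are given in the paper):

```latex
e_{\mathrm{ALL}} = \sum e_{\mathrm{matching}} + \sum e_{\mathrm{IMU}} + \sum e_{\mathrm{leg}} + e_{\mathrm{prior}}
```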

objective_function
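For intuition, a heavily simplified factor-graph sketch using GTSAM's Python bindings is shown below. Here `predict_twist` is a stub standing in for the online learning model, and the placeholder Jacobians and parameter dimensions are assumptions; the actual neural adaptive leg odometry factor is defined in the paper:

```python
import numpy as np
import gtsam
from gtsam.symbol_shorthand import X, W  # X: robot poses, W: online MLP parameters

def predict_twist(theta, leg_data):
    # Stub standing in for the online learning model (hypothetical).
    return np.zeros(6)

leg_data = np.zeros(16)  # joint states + foot reaction forces (placeholder)

def leg_odometry_error(this, values, jacobians):
    # Residual between the relative pose implied by consecutive states and
    # the twist predicted by the neural leg kinematics model.
    pose_i = values.atPose3(this.keys()[0])
    pose_j = values.atPose3(this.keys()[1])
    theta = values.atVector(this.keys()[2])
    residual = gtsam.Pose3.Logmap(pose_i.between(pose_j)) - predict_twist(theta, leg_data)
    if jacobians is not None:
        # Placeholder Jacobians; a real implementation derives them analytically.
        jacobians[0] = np.zeros((6, 6))
        jacobians[1] = np.eye(6)
        jacobians[2] = np.zeros((6, theta.size))
    return residual

graph = gtsam.NonlinearFactorGraph()
prior_pose_noise = gtsam.noiseModel.Isotropic.Sigma(6, 1e-3)
leg_noise = gtsam.noiseModel.Isotropic.Sigma(6, 0.1)

# Prior factors anchor the first pose and the online MLP parameters.
graph.add(gtsam.PriorFactorPose3(X(0), gtsam.Pose3(), prior_pose_noise))
graph.add(gtsam.PriorFactorVector(W(0), np.zeros(8),
                                  gtsam.noiseModel.Isotropic.Sigma(8, 1.0)))
# Stand-in for the matching cost / IMU preintegration factors.
graph.add(gtsam.BetweenFactorPose3(X(0), X(1), gtsam.Pose3(), leg_noise))
# Neural adaptive leg odometry factor: couples poses and MLP parameters.
graph.add(gtsam.CustomFactor(leg_noise, gtsam.KeyVector([X(0), X(1), W(0)]),
                             leg_odometry_error))

values = gtsam.Values()
values.insert(X(0), gtsam.Pose3())
values.insert(X(1), gtsam.Pose3())
values.insert(W(0), np.zeros(8))  # illustrative online-parameter dimension

isam = gtsam.ISAM2()
isam.update(graph, values)  # jointly refines poses and online MLP parameters
estimate = isam.calculateEstimate()
```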

Experimental results of odometry estimation and online training of the neural leg kinematics model

The proposed odometry estimation algorithm was demonstrated with the narrow-FOV LiDAR (Livox AVIA) to imitate severely featureless environments, as shown in the following image.

odometry_exp_condition

Experimental conditions of the proposed odometry estimation. Note that the omnidirectional FOV LiDAR was NOT used during odometry estimation (online learning phase).

The proposed method was evaluated on two experimental sequences:

  • (1) Sandy beach sequence: Deformable terrains and extremely featureless environments.

The quadruped robot performed odometry estimation on deformable terrain, where the assumptions behind leg kinematics-based motion constraints are violated.

Sunahama degeneration visualization

The LiDAR point clouds degenerate severely due to the extremely featureless environment.

The external weight load (3 kg) was removed in the middle of the experiment to demonstrate the network's adaptability to changes in the robot's weight load. Tactile information (foot reaction force)-based motion prediction divides the reaction force by the robot's mass; thus, the neural leg kinematics model must adapt to the robot's mass.

Sunahama results

Odometry estimation results of the sandy beach sequence. The map was constructed by aligning the raw point clouds with the robot poses estimated by our odometry. We can see that half of the point clouds were detected as degenerate.

  • (2) Campus journey sequence: Terrain condition changes and featureless environments.

Campus degeneration visualization

In the campus sequence, the terrain condition changed sequentially: 1) asphalt, 2) gravel (deformable terrain), and 3) grass. Furthermore, these environments include areas where the point clouds degenerate.

The 3 kg external weight load was removed in the middle of this experiment. Adaptability to the robot's weight load is crucial for effectively incorporating tactile information (foot reaction force) into the neural leg kinematics model.

Campus results

Odometry estimation results of the campus journey sequence. The map was constructed by aligning the raw point clouds with the robot poses estimated by our odometry. We can see that many point clouds were detected as degenerate.

ATEs and RTEs of the Odometry Algorithms

| Method / Sequence | Campus ATE [m] | Campus RTE [m] | Sandy Beach ATE [m] | Sandy Beach RTE [m] |
|---|---|---|---|---|
| Ours | 0.29 ± 0.12 | 0.13 ± 0.04 | 0.08 ± 0.05 | 0.12 ± 0.04 |
| Ours w/o online learning | 0.36 ± 0.17 | 0.17 ± 0.07 | 0.90 ± 0.70 | 0.20 ± 0.10 |
| Ours w/o tactile info. | 0.63 ± 0.30 | 0.15 ± 0.06 | 0.12 ± 0.06 | 0.12 ± 0.04 |
| FAST-LIO2 | Corrupted | Corrupted | Corrupted | Corrupted |
| Unitree odometry w/ LIO | 0.57 ± 0.32 | 0.15 ± 0.06 | No record | No record |
| Unitree odometry | 0.80 ± 0.46 | 0.17 ± 0.06 | No record | No record |

References

  • [1] M. Fourmy, T. Flayols, P.-A. Léziart, N. Mansard, and J. Solà, "Contact forces preintegration for estimation in legged robotics using factor graphs," in 2021 IEEE International Conference on Robotics and Automation (ICRA), 2021, pp. 1372–1378.
  • [2] J. Kang, H. Kim, and K.-S. Kim, "VIEW: Visual-inertial external wrench estimator for legged robot," IEEE Robotics and Automation Letters, pp. 8366–8377, 2023.