Special Topic on Devices and Circuits for Wearable and IoT Systems

An error-based micro-sensor capture system for real-time motion estimation

Lin Yang1, Shiwei Ye1, Zhibo Wang1, Zhipei Huang1, Jiankang Wu1, Yongmei Kong2 and Li Zhang3


 Corresponding author: Zhipei Huang, Email: zhphuang@ucas.ac.cn


Abstract: A wearable micro-sensor motion capture system with 16 IMUs and an error-compensatory complementary filter algorithm has been developed for real-time motion estimation, acquiring accurate 3D orientation and displacement during real-life activities. In the proposed filter, the gyroscope bias error, orientation error and magnetic disturbance error are estimated and compensated, significantly reducing the orientation estimation error caused by sensor noise and drift. Displacement estimation, especially for activities such as jumping, has been a challenge in micro-sensor motion capture. An adaptive gait phase detection algorithm has been developed to provide accurate displacement estimation across different types of activities. The performance of the system is benchmarked against a VICON optical motion capture system. The experimental results demonstrate the effectiveness of the system in tracking daily activities, with a displacement estimation error of 0.16 ± 0.06 m for normal walking and 0.13 ± 0.11 m for jumping motions.

Key words: motion capture system, IMU, complementary filter, motion estimation
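As a rough illustration of the complementary-filter idea underlying the proposed algorithm, the sketch below fuses a gyroscope-integrated angle with an accelerometer tilt estimate for a single pitch axis. The function name, the gain `alpha`, and the 1D simplification are illustrative assumptions; the paper's actual filter additionally estimates and compensates gyroscope bias and magnetic disturbance errors, which are omitted here.

```python
import math

def complementary_filter(pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """One update of a basic complementary filter for a single pitch angle (rad).

    The gyro integral is trusted over short time scales (high-pass path), while
    the accelerometer's gravity-based tilt corrects long-term drift (low-pass path).
    """
    gyro_pitch = pitch + gyro_rate * dt          # propagate with angular rate
    accel_pitch = math.atan2(accel_x, accel_z)   # tilt from the gravity vector
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch
```

With `alpha` close to 1 the filter follows the gyroscope during fast motion but is slowly pulled toward the accelerometer's drift-free tilt reference when the sensor is quasi-static.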



[1]
Fernández-Baena A, Susín A, Lligadas X, et al. Biomechanical validation of upper-body and lower-body joint movements of kinect motion capture data for rehabilitation treatments. Proceedings of the 4th IEEE International Conference on Intelligent Networking and Collaborative Systems (INCoS), 2012: 656
[2]
Kong W, Sessa S, Cosentino S, et al. Development of a real-time IMU-based motion capture system for gait rehabilitation. IEEE Robotics and Biomimetics (ROBIO), 2013: 2100
[3]
Schmitz A, Ye M, Shapiro R, et al. Accuracy and repeatability of joint angles measured using a single camera markerless motion capture system. J Biomechan, 2014, 47(2): 587 doi: 10.1016/j.jbiomech.2013.11.031
[4]
Park S, Park H, Kim J, et al. 3D displacement measurement model for health monitoring of structures using a motion capture system. Measurement, 2015: 352
[5]
Moeslund T B, Hilton A, Krüger V. A survey of advances in vision-based human motion capture and analysis. Computer Vision and Image Understanding, 2006, 104(2): 90
[6]
Vlasic D, Adelsberger R, Vannucci G, et al. Practical motion capture in everyday surroundings. ACM Transactions on Graphics (TOG), 2007, 26(3): 35 doi: 10.1145/1276377
[7]
Lambrecht J, Kirsch R. Miniature low-power inertial sensors: promising technology for implantable motion capture systems. IEEE Trans Neural Syst Rehabil Eng, 2014, 22(6): 1138 doi: 10.1109/TNSRE.2014.2324825
[8]
Aminian K, Najafi B. Capturing human motion using body-fixed sensors: outdoor measurement and clinical applications. Comput Animat Virtual Worlds, 2004, 15(2): 79 doi: 10.1002/(ISSN)1546-427X
[9]
Brigante C, Abbate N, Basile A, et al. Towards miniaturization of a MEMS-based wearable motion capture system. IEEE Trans Ind Electron, 2011, 58(8): 3234 doi: 10.1109/TIE.2011.2148671
[10]
Antonello R, Ito K, Oboe R. Acceleration measurement drift rejection in motion control systems by augmented-state kinematic Kalman filter. IEEE Trans Ind Electron, 2016, 63(3): 1953 doi: 10.1109/TIE.2015.2512224
[11]
Godha S, Lachapelle G. Foot mounted inertial system for pedestrian navigation. Meas Sci Technol, 2008, 19(7): 075202 doi: 10.1088/0957-0233/19/7/075202
[12]
Yun X, Calusdian J, Bachmann E, et al. Estimation of human foot motion during normal walking using inertial and magnetic sensor measurements. IEEE Trans Instrum Meas, 2012, 61(7): 2059 doi: 10.1109/TIM.2011.2179830
[13]
Wang Z, Zhao H, Qiu S, et al. Stance-phase detection for ZUPT-aided foot-mounted pedestrian navigation system. IEEE/ASME Trans Mechatron, 2015, 20(6): 3170 doi: 10.1109/TMECH.2015.2430357
[14]
Hol J, Dijkstra F, Luinge H, et al. Tightly coupled UWB / IMU pose estimation. Proc IEEE Ultra-Wideband Conference, 2009: 688
[15]
Corrales J, Candelas F, Torres F. Hybrid tracking of human operators using IMU/UWB data fusion by a Kalman filter. Proceedings of the 3rd ACM/IEEE Human-Robot Interaction (HRI) International Conference, 2008: 193
[16]
Zihajehzadeh S, Yoon P, Kang B, et al. UWB-aided inertial motion capture for lower body 3-D dynamic activity and trajectory tracking. IEEE Trans Instrum Meas, 2015, 64(12): 3577 doi: 10.1109/TIM.2015.2459532
[17]
Qi Y, Soh C, Gunawan E, et al. Assessment of foot trajectory for human gait phase detection using wireless ultrasonic sensor network. IEEE Trans Neural Syst Rehabil Eng, 2016, 24(1): 88 doi: 10.1109/TNSRE.2015.2409123
[18]
Fischer C, Muthukrishnan K, Hazas M, et al. Ultrasound-aided pedestrian dead reckoning for indoor navigation. Proceedings of the first ACM international workshop on Mobile entity localization and tracking in GPS-less environments, 2008: 31
[19]
Kuipers J. Quaternions and Rotation Sequences: A Primer with Applications to Orbits, Aerospace, and Virtual Reality. Princeton: Princeton University Press, 2002
[20]
Choukroun D, Bar-Itzhack I, Oshman Y. Novel quaternion Kalman filter. IEEE Trans Aerospace Electron Syst, 2006, 42(1): 174 doi: 10.1109/TAES.2006.1603413
Fig. 1.  Description of the proposed system and algorithms.

Fig. 2.  (Color online) Typical stance phase and swing phase in daily activities.

Fig. 3.  (Color online) Experimental setup (right) and the IMU unit (left).

Fig. 4.  (Color online) (a) Walking back and forth. (b) Jumping in place. (c) Standing jump. (d) Compound motion.

Fig. 5.  (Color online) Velocity estimation (x and y) result for indoor walking back and forth.

Fig. 6.  (Color online) Trajectory in walking back and forth experiment. (a) Trajectory by our algorithm. (b) Trajectory compared with integration.

Fig. 7.  (Color online) Vertical displacement estimation during jumping in place.

Fig. 8.  (Color online) Trajectory of walking around rectangle area.

Fig. 9.  (Color online) Trajectory estimation during jumping in place.

Table 1.   Phase-detection accuracy for different gait types.

Gait type                      Detection accuracy (%)
Walking along a straight line  100
Walking back and forth         95
Jumping in place               100
Standing jump                  100
Compound motion                94
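The gait-phase accuracies in Table 1 come from classifying each sample as stance or swing. A minimal zero-velocity-style stance detector is sketched below; the threshold and minimum run length are hypothetical fixed values, not the paper's adaptive phase detection algorithm.

```python
def stance_phases(gyro_norms, threshold=0.6, min_len=3):
    """Flag samples as stance when the angular-rate magnitude (rad/s) stays
    below `threshold` for at least `min_len` consecutive samples."""
    flags = [g < threshold for g in gyro_norms]
    out = [False] * len(flags)
    i = 0
    while i < len(flags):
        if not flags[i]:
            i += 1
            continue
        j = i
        while j < len(flags) and flags[j]:
            j += 1
        if j - i >= min_len:          # ignore short dips: likely noise, not true stance
            out[i:j] = [True] * (j - i)
        i = j
    return out
```

Requiring a minimum run length suppresses momentary dips in angular rate during the swing phase, which would otherwise trigger false zero-velocity updates.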

Table 2.   Comparison of position estimation error in different experiments.

Experiment                                    CKF (m)      Integration (m)  Godha[11] (m)  Yun[12] (m)
Short-distance walking (5 m)                  0.16 ± 0.06  1.45 ± 0.26      0.82 ± 0.24    0.52 ± 0.16
Short-distance walking back and forth (10 m)  0.21 ± 0.09  1.01 ± 0.35      0.86 ± 0.28    0.74 ± 0.21
Jumping in place (0.5 m)                      0.09 ± 0.09  0.18 ± 0.16      0.16 ± 0.14    0.12 ± 0.12
Standing jump (1.0 m)                         0.13 ± 0.11  0.22 ± 0.23      0.21 ± 0.20    0.21 ± 0.19
Outdoor long-distance walking (100 m)         3.6 ± 0.4    9.2 ± 5.4        8.6 ± 4.8      7.8 ± 3.7
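The "Integration" column of Table 2 shows how unconstrained integration of acceleration drifts; zero-velocity updates (ZUPT), as used in foot-mounted systems such as Godha[11], bound this drift by resetting velocity during detected stance. A minimal one-axis sketch (function and variable names are illustrative assumptions) is:

```python
def zupt_velocity(accels, dt, stance_flags):
    """Integrate acceleration (m/s^2) to velocity, applying a zero-velocity
    update (ZUPT): velocity is reset to zero whenever stance is detected,
    so integration drift cannot accumulate across steps."""
    v, vels = 0.0, []
    for a, stance in zip(accels, stance_flags):
        v = 0.0 if stance else v + a * dt
        vels.append(v)
    return vels
```

Displacement then follows by integrating the corrected velocity, which is why accurate stance detection (Table 1) directly determines displacement accuracy.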

History: Received 30 March 2017; Revised 6 June 2017; Accepted manuscript online 13 November 2017; Published 1 October 2017.

      Citation:
      Lin Yang, Shiwei Ye, Zhibo Wang, Zhipei Huang, Jiankang Wu, Yongmei Kong, Li Zhang. An error-based micro-sensor capture system for real-time motion estimation[J]. Journal of Semiconductors, 2017, 38(10): 105004. doi: 10.1088/1674-4926/38/10/105004



      doi: 10.1088/1674-4926/38/10/105004
Funds: Research supported by the National Natural Science Foundation of China (Nos. 61431017, 81272166).

