Self-reproduction for articulated behaviors with dual humanoid robots using on-line decision tree classification

  • Jane Brooks Zurn (a1), Yuichi Motai (a1) (a2) and Scott Vento (a3)
Abstract

We propose a new repetition framework for vision-based behavior imitation by a sequence of multiple humanoid robots, introducing an on-line method for delimiting a time-varying context. This novel approach investigates the ability of a robot “student” to observe and imitate a behavior from a “teacher” robot; the student later changes roles to become the “teacher” for a naïve robot. For the many robots that already use video acquisition systems for their real-world tasks, this method eliminates the need for additional communication capabilities and complicated interfaces, reducing the need for human intervention and thus enhancing the robots' practical usefulness outside the laboratory. Articulated motions are modeled in a three-layer method and registered as learned behaviors using color-based landmarks. Behaviors were identified on-line after each iteration by inducing a decision tree from the visually acquired data. Error accumulated over time, creating a context drift for behavior identification. In addition, identification and transmission of behaviors can occur between robots with differing, dynamically changing configurations. ITI, an on-line decision tree inducer in the C4.5 family, performed well for data that were similar in time and configuration to the training data, but its greedily chosen attributes were not optimized for resistance to accumulating error or configuration changes. Our novel algorithm, OLDEX, identified context changes on-line, as well as the amount of drift that could be tolerated before compensation was required; OLDEX can thus identify both time and configuration contexts for the behavior data. This improved on previous methods, which either separated contexts off-line or could not separate the slowly time-varying context into distinct regions at all. The results demonstrate the feasibility, usefulness, and potential of our unique idea for behavioral repetition and a propagating learning scheme.
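The abstract does not give OLDEX's internals, but the general idea it describes — watching an on-line classifier's performance and flagging when accumulated drift exceeds a tolerance — can be illustrated with a minimal sketch. The sketch below is an assumption-laden stand-in, not the paper's algorithm: it uses a simple sliding-window accuracy comparison (class name `DriftDetector`, parameters `window` and `tolerance` are invented for illustration) to signal that the current context has drifted far enough from the long-run behavior that compensation, or a new context region, would be needed.

```python
from collections import deque

class DriftDetector:
    """Illustrative sliding-window drift detector (not the paper's OLDEX).

    Signals a context change when classification accuracy over the most
    recent window drops more than `tolerance` below long-run accuracy.
    """

    def __init__(self, window=20, tolerance=0.2):
        self.window = deque(maxlen=window)  # recent correctness flags
        self.hits = 0                       # long-run correct count
        self.total = 0                      # long-run sample count
        self.tolerance = tolerance

    def update(self, predicted, actual):
        """Record one prediction; return True if drift is detected."""
        correct = predicted == actual
        self.window.append(correct)
        self.hits += correct
        self.total += 1
        # Withhold judgment until the recent window is full.
        if len(self.window) < self.window.maxlen:
            return False
        recent = sum(self.window) / len(self.window)
        overall = self.hits / self.total
        return (overall - recent) > self.tolerance
```

With `window=20, tolerance=0.2`, a stream of correct classifications never trips the detector; once the behavior context shifts and recent predictions start failing, the widening gap between recent and long-run accuracy flags the drift within a handful of samples.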

Copyright
The online version of this article is published within an Open Access environment subject to the conditions of the Creative Commons Attribution-NonCommercial-ShareAlike licence <http://creativecommons.org/licenses/by-nc-sa/2.5/>. The written permission of Cambridge University Press must be obtained for commercial re-use.
Corresponding author
*Corresponding author. E-mail: ymotai@vcu.edu


Robotica
  • ISSN: 0263-5747
  • EISSN: 1469-8668
  • URL: /core/journals/robotica