Keynote Speakers

Kazuhiro Kosuge

Dr. Kazuhiro Kosuge is a Professor in the Department of Bioengineering and Robotics at Tohoku University, Japan. He received the B.S., M.S., and Ph.D. degrees in Control Engineering from the Tokyo Institute of Technology in 1978, 1980, and 1988, respectively. From 1980 to 1982, he was a research staff member in the Production Engineering Department of Nippon Denso Co., Ltd. (now DENSO Co., Ltd.). From 1982 to 1990, he was a Research Associate in the Department of Control Engineering at the Tokyo Institute of Technology, and from 1990 to 1995 an Associate Professor at Nagoya University. Since 1995, he has been at Tohoku University. He received the JSME Award for the best paper from the Japan Society of Mechanical Engineers in 2002 and 2005, and the RSJ Award for the best paper from the Robotics Society of Japan in 2005. He is an IEEE Fellow, a JSME Fellow, a SICE Fellow, and an RSJ Fellow. He served as President of the IEEE Robotics and Automation Society for 2010-2011 and currently serves as its Junior Past-President.

Title: “PaDY,” a Robot Co-worker Concept (slides)

Abstract: In 2005, a dance partner robot, PBDR (Partner Ballroom Dance Robot), which dances a waltz as a female partner together with a male dancer, was developed in our laboratory. One of the key issues for the dance partner robot was how to read the male dancer’s lead. PaDY (in-time Parts/tools Delivery to You robot) has been developed as a co-worker robot for an automobile factory based on the technology developed for the dance partner robot. Just as a good human co-worker helps by reading a partner’s lead, the co-worker robot needs to read its partner’s lead: without knowing the task being carried out, what its partner is doing, what its partner will do next in the task process, and how its partner wants to be assisted, the robot cannot help. PaDY delivers the necessary parts and tools to a worker when he/she needs them for vehicle assembly tasks, by estimating what its partner is doing and what its partner is going to do next. Results of several experiments carried out on an automobile assembly line show how effective the robot co-worker can be.

Tony Belpaeme

Tony Belpaeme is Professor of Cognitive Systems and Robotics at the University of Plymouth. He is associated with the Cognition Institute and the Centre for Robotics and Neural Systems, and is a member of the University of Plymouth Marine Institute and of the College of the EPSRC. His research interests include social systems, cognitive robotics, and artificial intelligence in general. At Plymouth he works alongside Angelo Cangelosi, Davide Marocco, Phil Culverhouse and Guido Bugmann on building robots that take inspiration from humans, both in their appearance and in their intelligence.
Until April 2005 he was a postdoctoral fellow of the Flemish Fund for Scientific Research (FWO Vlaanderen), affiliated with the Artificial Intelligence Laboratory directed by Luc Steels at the Vrije Universiteit Brussel. He held a guest professorship at the same university, where he taught introductory artificial intelligence and autonomous systems.

Title: Harnessing the power of social engagement for human-robot interaction (slides)

Abstract: People are an inherently social species. We are instinctively and unconsciously sensitive to a wide range of social cues, and we use a wide range of channels to communicate, from primitive channels such as eye gaze to uniquely human ones such as language. Social interaction is so deep-seated that it persists when we interact with machines and robots. This talk explores how we can use this trait to design better social human-robot interaction. A number of examples from child-robot and human-robot interaction will be given, showing how a social robot encourages people not only to treat the robot as a social agent, but even to form mental models of the robot. We show how this can be used to good effect in machine learning on robots and in robot-assisted therapy.

Nicolas Mansard


Since October 2008, Nicolas Mansard has been a permanent researcher at LAAS-CNRS, Toulouse. He is a member of the Gepetto research group, together with Philippe Souères, Florent Lamiraux, Olivier Stasse and Jean-Paul Laumond.

His research activities concern sensor-based control, and more specifically the integration of sensor-based schemes into humanoid robot applications. It is an exciting research topic at the intersection of robotics, automatic control, signal processing and numerical mathematics. His main application field is currently humanoid robotics, as it poses serious challenges that are representative of many other robotic domains.

Before joining the Gepetto group, he was a post-doc in Tsukuba and a post-doc at the AI Lab of Stanford University, in the group led by Prof. O. Khatib. He did his Ph.D. with François Chaumette in the Lagadic group at INRIA Rennes. He was a student of the École Normale Supérieure de Bretagne and did both his M.Sc. (Robotics) and B.Eng. (CS) at INP Grenoble (ENSIMAG).

Title: Semiotics of Motion: Toward a Robotics Programming Language (slides)

Abstract: My work aims at establishing the bases of a semiotics of motion, in order to facilitate the programming of complex robotic systems. The objective is to build a symbolic model of the action, based on the analysis of the numerical functions that drive the motion. The methodology builds on well-known robotics concepts: motion-planning algorithms, control of redundant systems and the task-function approach. The originality of the work is to consider the “task” as the unifying concept both to describe the motion and to control its execution.

In the first part, the task-function control framework is extended to cover all the possible modalities of the robot. The objective is to absorb, at the lowest possible functional level, as many uncertainty factors as possible, so that they need not be modeled at higher levels. I will focus on the hierarchical resolution of inverse dynamics in the operational space, which yields a broad expressiveness of the generated motion, from efficient high-dynamics movements to safe, human-friendly ones.
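
As a rough illustration of the hierarchical resolution mentioned above, the classical null-space projection scheme solves a secondary task only in the directions left free by the primary one. This is a minimal kinematic sketch of the idea, not the operational-space dynamic formulation discussed in the talk; all function and variable names here are illustrative:

```python
import numpy as np

def prioritized_velocities(J1, e1, J2, e2):
    """Two-level task hierarchy via null-space projection.

    The primary task (J1, e1) is solved exactly whenever feasible;
    the secondary task (J2, e2) is solved as well as possible in the
    null space of the primary one, so it can never disturb it.
    """
    J1_pinv = np.linalg.pinv(J1)
    q1 = J1_pinv @ e1                         # exact solution of task 1
    N1 = np.eye(J1.shape[1]) - J1_pinv @ J1   # projector onto ker(J1)
    # Secondary task, restricted to motions invisible to task 1
    q2 = np.linalg.pinv(J2 @ N1) @ (e2 - J2 @ q1)
    return q1 + N1 @ q2
```

When the two tasks conflict (e.g. J2 spans the same directions as J1), the secondary correction vanishes and the primary task alone determines the motion, which is the strict-priority behavior the hierarchical solvers generalize to the dynamic case.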

This sensorimotor layer is then used as a basic action vocabulary that enables the system to be controlled through a higher-level interface. In the second part, this action vocabulary is used to provide a dedicated robotics programming language, to build motion-planning methods and to describe an observed movement.

The proposed methods are generic and can be applied to various systems, from robotics (redundant robots) to computer animation (virtual avatars). Nonetheless, the work is more specifically dedicated to humanoid robotics. Without excluding other possible outlets, humanoid robotics provides a tangible application and experimental framework and opens several paths toward natural human motion.

Alessandro De Luca

Alessandro De Luca is Professor of Robotics in the Department of Computer, Control, and Management Engineering at the University of Roma “La Sapienza”. His research interests include control of human-robot interaction, dynamic modeling and control of robots with flexible components, visual servoing, nonlinear control of robot manipulators and wheeled mobile robots, and fault detection in robotics. He has published over 170 journal and conference papers and book chapters, receiving best paper awards at the ICRA 1998, IROS 2006, and BioRob 2012 conferences. He received the Helmholtz-Humboldt Research Award in 2005 and is a Fellow of the IEEE (class of 2007). He was the first Editor-in-Chief of the IEEE Transactions on Robotics (2004-08) and the General Chair of IEEE ICRA 2007. Currently, he is the Vice-President for Publications of the IEEE Robotics and Automation Society, a Panel Chair of the European Research Council for the Advanced Grants, and the coordinator of the FP7 European project SAPHARI.

Title: Progress on Human-Robot Coexistence and Collaboration in SAPHARI (slides)

Abstract: Robotics research looks forward to bringing humans and robots closer together, when desired. Robots that can physically collaborate with humans will combine and enhance the skills of both the robot and the human. This has multiple potential applications in industrial robotics (robot co-workers) and service robotics (personal assistants). To this end, robots have to be designed and controlled following new guidelines. We present a control framework for safe physical Human-Robot Collaboration (pHRC), based on a stack of nested, consistent behaviors of the robot:
i) Safety is the most important low-level feature of a robot that needs to work with humans, and includes fast detection of and reaction to collisions;
ii) Coexistence is the robot's capability of sharing the workspace without interfering with humans;
iii) Collaboration occurs when the robot performs a complex task with direct human interaction and coordination.

These concepts will be exemplified in human-robot interaction experiments performed using a KUKA Lightweight IV robot and a Microsoft Kinect sensor, as well as with a conventional small-size industrial robot. A number of basic robot capabilities will be illustrated, including whole-body collision avoidance, model- or signal-based collision detection, and distinguishing intentional contacts from undesired collisions. Combining the geometric information on the contact area, as localized online by a depth sensor, with the joint torques due to contact, as obtained by our standard residual-based method, allows estimating the exchanged Cartesian contact forces at any point along the robot structure, without the need for force or torque sensing.
The estimate of the contact force can be used for the design of admittance, impedance, or force control laws that focus on the behavior at the contact level, rather than at the joint or Cartesian level only. Preliminary experimental results will be presented.
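
The residual-based method referred to above is, in essence, a momentum observer: when no external torque acts, the residual stays near zero, and after a contact it tracks the external joint torque without requiring force/torque sensors. The following is only a simplified discrete-time sketch of that idea; the gain, the time step, and all names are illustrative assumptions, not the implementation used in the experiments:

```python
import numpy as np

class MomentumResidual:
    """Sketch of a momentum-based residual observer.

    Works on the generalized momentum p = M(q) q_dot. With no
    external torque the residual r stays near zero; after a
    collision, r converges to the external joint torque with a
    rate set by the (illustrative) gain K.
    """
    def __init__(self, n_joints, K=20.0, dt=0.001):
        self.K = K                # observer gain (assumed value)
        self.dt = dt              # sampling time (assumed value)
        self.integral = np.zeros(n_joints)
        self.r = np.zeros(n_joints)
        self.p0 = None            # initial momentum, stored at first call

    def update(self, p, tau, coriolis_term, gravity):
        # p: generalized momentum M(q) q_dot
        # tau: commanded joint torques
        # coriolis_term: C(q, q_dot)^T q_dot ; gravity: g(q)
        if self.p0 is None:
            self.p0 = p.copy()
        # Integrate the modeled momentum flow, fed back with r itself
        self.integral += (tau + coriolis_term - gravity + self.r) * self.dt
        self.r = self.K * (p - self.p0 - self.integral)
        return self.r
```

On a simulated 1-DOF unit-mass joint with zero commanded torque, applying a constant external torque makes the residual converge to that torque value, which is what allows thresholding it for collision detection and, combined with the contact location, estimating the contact force.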
