The CogLaboration Project focused on designing techniques for the fluid transfer of objects between a robot's gripper and a human hand, a capability considered key to the development of successful and efficient robotic assistants.
RUR provided the skills needed to develop the high-level software architectural design and the oversight necessary for the successful integration of the many parts of the CogLaboration system.
Evaluation Methods and Procedures
Additionally, RUR played a key role in the evaluation task.
To test whether the approach to human-robot and robot-human object transfer was successful, RUR was instrumental in defining a number of scenarios representative of human-robot collaborative tasks. The scenarios included such tasks as:
Car mechanic's assistant: handing tools as required to a mechanic who is under a car or leaning under the bonnet.
Assistant for the elderly: providing assistance to an elderly or disabled person by handing them everyday objects such as drinks, spectacles and TV remote controls.
RUR Object Motion Sensor
To carry out the trials, sensors were used to gather data on the motion of the object as it was transferred from human to robot or vice versa.
A motion capture system was used to determine the position-versus-time trajectory of the test objects as they were transferred. However, it was realised that this alone was not sufficient: although acceleration and jerk data are helpful in assessing the smoothness of a transfer, the data are too noisy if they are calculated by repeatedly differentiating the position data from the motion capture system. Data on the exact instant at which the human and robot grasp and release the object are also beneficial. Although it would be possible for the robot's control software to indicate when an object is grasped or released, it is more difficult to determine this for the human participant.
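Why repeated differentiation is problematic can be sketched numerically: differentiating a sampled position signal twice greatly amplifies even millimetre-level measurement noise. The sample rate, noise level and trajectory below are illustrative assumptions, not project data.

```python
import numpy as np

# Illustrative sketch only: a smooth 1 Hz sinusoidal motion with 1 mm
# measurement noise, sampled at an assumed 100 Hz. These values are
# hypothetical, not CogLaboration measurements.
fs = 100.0                                      # assumed sample rate, Hz
dt = 1.0 / fs
t = np.arange(0.0, 2.0, dt)
true_pos = 0.5 * np.sin(2 * np.pi * 1.0 * t)    # metres

rng = np.random.default_rng(0)
noisy_pos = true_pos + rng.normal(0.0, 0.001, t.size)  # 1 mm position noise

def differentiate(x, dt):
    """Central-difference derivative of a uniformly sampled signal."""
    return np.gradient(x, dt)

# Acceleration obtained by differentiating position twice
true_acc = differentiate(differentiate(true_pos, dt), dt)
noisy_acc = differentiate(differentiate(noisy_pos, dt), dt)

# The acceleration error is several m/s^2, orders of magnitude larger
# than the 1 mm position noise would suggest.
acc_error = np.std(noisy_acc - true_acc)
print(f"std of acceleration error: {acc_error:.2f} m/s^2")
```

Differentiating a third time to obtain jerk amplifies the noise even further, which is why directly measured acceleration was preferred.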
To overcome both of these issues, RUR developed the RUR Object Motion Sensor. The sensor was small and light and could be embedded within the test objects to measure and record the required information. Acceleration and jerk were measured using electronic accelerometers and rate gyroscopes, and touch was measured using capacitive touch switches that could be customised to suit each of the different test objects.
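As a rough sketch of how such touch switches could yield grasp and release timestamps, the function below detects edges in a sampled 0/1 touch signal. The function name, signature and data are hypothetical illustrations, not the actual sensor firmware.

```python
from typing import List, Tuple

def touch_events(samples: List[int], fs: float) -> List[Tuple[str, float]]:
    """Turn a sampled capacitive touch signal into timestamped events.

    samples: touch-switch readings (1 = touched, 0 = not touched).
    fs: sample rate in Hz (assumed constant).
    Returns a list of ("grasp" | "release", time_in_seconds) pairs,
    one per rising or falling edge in the signal.
    """
    events = []
    for i in range(1, len(samples)):
        if samples[i] == 1 and samples[i - 1] == 0:
            events.append(("grasp", i / fs))       # rising edge
        elif samples[i] == 0 and samples[i - 1] == 1:
            events.append(("release", i / fs))     # falling edge
    return events

# Example: touch begins at sample 3 and ends at sample 7, at 100 Hz
events = touch_events([0, 0, 0, 1, 1, 1, 1, 0, 0], 100.0)
print(events)  # [('grasp', 0.03), ('release', 0.07)]
```

Comparing such event times from the object's switches against the robot's own grasp/release signals gives the exact handover instants that motion capture alone cannot provide.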
The CogLaboration Project received financial support from the European Union Seventh Framework Programme FP7-ICT-7-2.1 (No. 287888).