COgnitive RObotics LAB (COROLAB)
School of Computer Science, University of Manchester


About the Research Group

The Cognitive Robotics Lab at the School of Computer Science, University of Manchester, led by Professor Angelo Cangelosi, hosts a "Robot Home": a living lab for human-robot interaction experiments. The lab has access to various humanoid robot platforms (iCub, Pepper, Sawyer and NAO) and other interactive robots (e.g. Giraff). It is involved in numerous projects funded by the EU H2020 programme, the US Air Force Lab, and companies such as Honda. Current projects include MoveCare on robot companions for the elderly, THRIVE++ on trust and theory of mind, DeCIFER on robot understanding of human actions and goals, and the three Marie Skłodowska-Curie grants DCOMM, SECURE and STRoNA on machine learning for language and action.


Group Leader

Angelo Cangelosi

Professor of Machine Learning & Robotics

Postdoctoral Researchers

Wenxuan Mou

US THRIVE++ project

Gi Hyun Lim

H2020 STRoNA Fellowship

Daniel Hernández García

H2020 MoveCare Project

PhD Students

Baris Serhan


Gabriella Pizzuto


Jacopo de Berardinis

Marta Romeo

H2020 MoveCare Project

Mohammad Thabet


Samuele Vinanzi


Ioanna Giorgi

Martina Ruocco

US THRIVE++ project

Federico Tavella

Visiting Scholars

Debora Zanatto

Plymouth University

Juan G Victores

Universidad Carlos III de Madrid


Robots in our lab

  • iCub

    iCub Description

    • The iCub is a humanoid robot with roughly the dimensions of a three-and-a-half-year-old child. The prototype, 100 cm tall and weighing 23 kg, valued at around €200K, incorporates 53 degrees of freedom (DoFs) for moving the head, arms and hands, waist and legs, with compliant actuators and sensors that communicate with an on-board PC104 controller over CAN bus. Skin on the forearms and palms allows for tactile HRI and fine object manipulation. The head has stereo cameras in a swivel mounting where the eyes would sit on a human head, and microphones on the sides in place of ears. Lines of red LEDs representing the mouth and eyebrows, mounted behind the face panel, are used for making facial expressions and expressing emotions.

    • The robot is already capable of eye and head motion; leg and body movement using accelerometers and gyroscopes; object recognition and grasping of small objects; crawling; solving complex 3D mazes; and collision avoidance, both with non-static environments and with itself.

    • iCub is an open-systems platform for research in robotics, AI and cognitive science. Its software is built on YARP (Yet Another Robot Platform), the middleware that acts as the robot's nervous system.

  • Sawyer

    Sawyer Description

    • Sawyer is a collaborative robot designed to execute tasks that have been impractical to automate with traditional industrial robots.

    • Sawyer features a 7-degree-of-freedom arm with a 1260 mm reach that maneuvers into tight spaces and operates in work cells designed for humans. Built-in force sensing allows it to make adaptive decisions as tasks run, enabling Sawyer to work precisely (±0.1 mm) while operating safely next to people.


  • Pepper

    Pepper Robot Description

    • Pepper is the world's first social humanoid robot able to recognize faces and basic human emotions. Pepper was optimized for human interaction and can engage with people through conversation and its touch screen.

    • Designed to interact with humans and standing 120 cm tall, Pepper has no trouble perceiving its environment and entering into a conversation when it sees a person. The touch screen on its chest displays content to highlight messages and support speech. Its curvy design ensures danger-free use and a high level of acceptance by users.


  • NAO

    NAO Robot Description

    • NAO is the first robot created by SoftBank Robotics. Famous around the world, NAO is a tremendous programming tool and has become a standard in education and research. NAO is also used as an assistant by companies and healthcare centers to welcome, inform and entertain visitors.

    • Standing 58 cm tall, NAO is a bipedal robot with pleasantly rounded features. Its 25 degrees of freedom enable it to move and adapt to its environment, while 7 touch sensors on the head, hands and feet, sonars and an inertial unit let it perceive its surroundings and locate itself in space. It interacts with humans through 4 directional microphones and speakers, and recognizes shapes, objects and even people with its two 2D cameras. NAO is an open, fully programmable platform.


  • Giraff

    Giraff Description

    • Giraff is a telepresence robot that relatives, home-care and healthcare staff can use to virtually visit individuals who still live at home despite their regular care needs.

    • The Giraff consists of a height-adjustable video screen on a long, narrow neck, with a small computer in the base. The robot moves on resilient wheels, can cross rugs and thresholds, and is easy to maneuver from any computer with a decent internet connection.
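The YARP middleware mentioned in the iCub description above connects robot modules through named ports: an output port streams data (camera frames, motor commands) to any input ports wired to it. As a rough illustration of that connect-and-stream model, here is a minimal pure-Python sketch; the class, method and port names are invented for illustration and are not the real YARP API.

```python
# Illustrative sketch of YARP-style named-port messaging (not the real
# YARP API): modules expose named ports, and connections route data
# from an output port to one or more input ports.

class Port:
    """A named endpoint that can send to, and receive from, other ports."""

    def __init__(self, name):
        self.name = name
        self.connections = []   # downstream (input) ports
        self.inbox = []         # messages received, oldest first

    def connect_to(self, other):
        # Analogous to wiring two ports together, e.g. a camera output
        # to a tracker input.
        self.connections.append(other)

    def write(self, message):
        # Deliver the message to every connected input port,
        # tagged with the sender's port name.
        for dst in self.connections:
            dst.inbox.append((self.name, message))

    def read(self):
        # Return the oldest pending message, or None if the inbox is empty.
        return self.inbox.pop(0) if self.inbox else None


# A camera module publishes frames; a tracker module consumes them.
camera_out = Port("/icub/cam/left")
tracker_in = Port("/tracker/img:i")
camera_out.connect_to(tracker_in)

camera_out.write("frame-0001")
print(tracker_in.read())   # → ('/icub/cam/left', 'frame-0001')
```

Because modules only agree on port names and message formats, a perception module can be swapped out or run on another machine without changing its consumers, which is the design choice that makes this middleware pattern attractive for robots like the iCub.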



Research Projects


  • Safety Enables Cooperation in...

  • Deictic Communication

  • Multiple-Actors Virtual Empathic Caregiver for the Elder

  • Developmental Collaborative Intelligence For Embodied Robotic agents, in collaboration with Honda Research Institute

  • Spatio-Temporal Representation on Neuromorphic Architecture

Contact Us
