Joost de Winter appointed Professor of Cognitive Human-Robot Interaction
Joost de Winter has been appointed Professor of Cognitive Human-Robot Interaction at the Department of Cognitive Robotics as of 6 December 2022. De Winter’s research focuses on touchless interaction between robots and humans: in the future, robots should be able to read and understand the intentions and instructions of humans in their vicinity by means of eye-tracking and other sensors. Robots, in turn, will be able to adapt their functioning to communicate their own intentions both to nearby humans and to other robots. The vision is to contribute to a future in which humans and robots share cognitive processes and adapt to each other.
Cars that can determine the state and skill of the driver
Joost de Winter: “Machines will be interacting with humans more and more - in traffic situations, for example, or in domestic and work environments, so they’re increasingly being equipped with sensors. Take driving, for example: automated cars are still imperfect, and human drivers are also imperfect - many traffic accidents are caused by driver distraction. If, however, the car can determine whether the driver is distracted, and provide automated support at those moments, then we can achieve a synthesis of the information-processing of human and machine.”
Joost de Winter obtained his MSc in Aerospace Engineering at TUD in October 2004. He then completed his PhD, on the topic of improving automated driver training in driving simulators, at the Faculty of Mechanical, Maritime and Materials Engineering in January 2009. Since then, his career has largely been devoted to the statistical processing of human behavioral signals in order to identify the strengths and weaknesses of human operators, and to developing visual, tactile, and auditory interfaces, as well as support systems, to improve safety and performance.
In the future, robots will understand and interact with humans
“The field of cognitive human-robot interaction,” says De Winter, “involves facets such as the automated inference of the human state and the development of robot motion and expressive interfaces that allow the human to understand the robot better. The scientific challenge is to develop models and algorithms that facilitate mutual human-robot understanding, and to experimentally evaluate the design concepts. I envision that, in a future household, humanoid robots will understand and interact with humans, just as humans currently understand and interact with each other.”