Dr Allan Ponniah is a Consultant Plastic Surgeon at the Royal Free Hospital, London. He has been involved in the latest Live Science experiment at the Science Museum – ‘Are your facial expressions unique?’ – run in collaboration with researchers from Imperial College London. Below he explores how the data gathered at Live Science could improve the lives of autistic children, people undergoing reconstructive surgery and many others.
The human face is an incredible part of our bodies. It plays heavily into how we view ourselves and how the world sees us, especially in terms of our identity, perceived beauty, ability to communicate and expression of emotions.
Surprisingly, up until now analysis of the face has been very limited. I first encountered this during my plastic and reconstructive surgery training, as many of the ideals of facial proportions were based on North American Caucasian females. This was because much of the previous work had been driven by the cosmetic surgery market; as a result, planning surgery for facial disfigurement could be difficult if the patient fell within a demographic that had not previously been studied.
It was for this reason that, whilst conducting research at Great Ormond Street Hospital, I set up the largest 3D facial data collection at the Science Museum in 2012. As visitors to the museum come from diverse backgrounds, we were able to collect data for a wide range of age, ethnic and gender groups. To analyse such a dataset I set up a collaboration with Stefanos Zafeiriou from Imperial College London (ICL), and the data gathered then led to the creation of the most accurate digital model of the human face in existence.
Stefanos and I now have a new project at the Science Museum, where we are developing our original research further by collecting 3D data of facial expressions. Once visitors have filled out information covering their ethnic background, they are invited to make facial expressions based on emotions such as fear, anger and happiness. While they do this, our 3dMD cameras capture their expressions, allowing us to build up a full digital map of how their faces move and express emotion.
The result will be a range of 3D renderings representing about 30 “typical” people of different genders, ethnicities and age groups to help with planning facial reconstruction surgery. The project has excited the public and press, with The Times noting that ‘moving footage of volunteers pulling a wide range of expressions, including happy, sad, surprised and disgusted, will enable more realistic remodelling…and help people who need reconstructive surgery.’ This 3D technology can also be applied in non-medical ways, such as in games, where the desire for accurate character portrayal is ever more important, as well as in virtual reality and other entertainment industries.
One exciting application of this technology is as a tool to help children with autism. Stefanos has previously worked with Maja Pantic, a Professor at ICL who developed the technology behind the Kaspar robot, which is used to help children with autism. We are also consulting various experts and charities, including the team at Great Ormond Street Hospital, the National Autistic Society and the Makaton charity, in the development of an app that could help children across the autistic spectrum.
A common characteristic of autism is that it can make it more difficult to judge the meaning behind other people’s facial expressions. As a significant amount of human interaction is non-verbal, being able to understand common facial expressions can open up a world of new ways to communicate. Hopefully this could help autistic children to feel more confident in their interactions with others, and could inspire the discovery of many more ways of helping people using the research conducted at the Science Museum.
Live Science ‘Are your facial expressions unique?’ is running in the ‘Who Am I’ gallery until 3 July 2017.