It’s 2049 and you’re feeling sick. Instead of going to the doctor (a profession that has since become obsolete), you take out your phone and snap a diagnostic “selfie.” Fiction? More like a soon-to-be fact: scientists have designed a computer model that accurately predicts aspects of your health from the shape of your face.
Here’s a futuristic little nugget that the latest Blade Runner appears to have missed: in the year 2049, replicants would have also come in a “doctor model,” because androids with superior diagnosing abilities are becoming easier and easier to fathom.
If that sounds too far-fetched, just consider this: a computer model has not only managed to accurately “guess” aspects of health just by looking at a face, but the human brain was also recently found to work in much the same way.
Dr. Ian Stephen, of Macquarie University in Sydney, Australia, and his colleagues used facial shape analysis to correctly detect markers of physiological health in more than 270 individuals of different ethnicities.
“We have developed a computer model,” explains Dr. Stephen, “that can determine information about a person’s health simply by analyzing their face, supporting the idea that the face contains valid, perceptible cues to physiological health.”
The findings have now been published in the journal Frontiers in Psychology, and they make the idea of a computer-enhanced super-doctor whose brain has been optimized for flawless diagnosing appear more scientific than fictional.
Or, in the meantime, maybe just a very cool app will do.
Dr. Stephen explains how the study was carried out: “First, we used photos of 272 Asian, African, and Caucasian faces to train the computer to recognize people’s body fat, BMI, […] and blood pressure from the shape of their faces.”
“We then asked the computer to predict these three health variables in other faces, and found that it could do so,” says Dr. Stephen.
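The two steps Dr. Stephen describes, training on faces with known health measurements and then predicting those measurements for unseen faces, amount to fitting a multi-output regression. The sketch below uses ordinary least squares on synthetic stand-in data; the feature set, variable values, and model form are illustrative assumptions, not the study's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for facial-shape features (e.g., landmark
# coordinates): 272 faces, 20 shape features each. Illustrative only.
n_faces, n_features = 272, 20
X = rng.normal(size=(n_faces, n_features))

# A hidden "true" linear relation to three health variables
# (body fat, BMI, blood pressure), plus measurement noise.
true_w = rng.normal(size=(n_features, 3))
Y = X @ true_w + 0.1 * rng.normal(size=(n_faces, 3))

# "Train the computer": fit weights by least squares,
# with an intercept column appended to the features.
X1 = np.hstack([X, np.ones((n_faces, 1))])
w, *_ = np.linalg.lstsq(X1, Y, rcond=None)

# "Predict these health variables in other faces": held-out faces.
X_new = rng.normal(size=(10, n_features))
preds = np.hstack([X_new, np.ones((10, 1))]) @ w  # shape (10, 3)
```

On data like this, with far more training faces than features, the fitted predictions track the hidden relation closely, which is the sense in which the model can "guess" the health variables of faces it has never seen.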
Next, the researchers wanted to see whether humans detect health cues in the same way. So, Dr. Stephen and his colleagues designed an app that let human participants alter the appearance of the faces to make them look as healthy as possible; the app's adjustable parameters were based on the computer model.
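One way an app's slider could be tied to the model's parameters (a hedged sketch under assumed names, not the study's implementation) is to move a face's shape vector along the direction the fitted model associates with a given health variable, here a hypothetical BMI weight vector:

```python
import numpy as np

rng = np.random.default_rng(1)
n_features = 20

# Hypothetical fitted weight vector mapping shape features to
# predicted BMI (illustrative values, not the study's model).
w_bmi = rng.normal(size=n_features)
direction = w_bmi / np.linalg.norm(w_bmi)  # unit "BMI axis" in shape space

face = rng.normal(size=n_features)  # one face's shape features


def adjust(face, slider):
    """Move the face along the model's BMI axis.

    A negative slider value shifts the face toward lower predicted BMI.
    """
    return face + slider * direction


def predicted_bmi(f):
    return f @ w_bmi


healthier = adjust(face, -1.0)  # predicted BMI drops by ||w_bmi||
```

Under this setup, dragging the slider negative always lowers the model's predicted BMI for the face, so participants chasing a "healthiest-looking" face are, in effect, exploring the same shape dimensions the model uses.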
“We found that the participants altered the faces to look lower in fat, have a lower BMI and, to a lesser extent, a lower blood pressure, in order to make them look healthier,” says Dr. Stephen.
“This suggests that some of the features that determine how healthy a face looks to humans are the same features that the computer model was using to predict body fat, BMI, and blood pressure.”
In other words, our brains work in much the same way as the computer model, and they can predict health from facial shape with surprising accuracy.
Dr. Stephen goes on to speculate about the evolutionary significance of the findings. He says, “The results suggest that our brains have evolved mechanisms for extracting health information from people’s faces, allowing us to identify healthy people to mate with or to form cooperative relationships with.”
“This fills an important missing link in current evolutionary theories of attractiveness,” he adds.
“The findings,” Dr. Stephen concludes, “provide strong support for the hypothesis that the face contains valid, perceptible cues to physiological health, and while the models are at an early stage, we hope that they could be used to help diagnose health problems in the future.”
But will the future still have doctors in it? Or simply health-diagnosing apps? Or even super-capable healthcare replicants performing the jobs that humans no longer wish to do? It remains to be seen…