Have you ever wondered what you or your children will look like at age 60, 70 or 80? For those who are curious, researchers have developed a software program that automatically ages a young child’s face through a lifetime, and the system could be helpful in cases of missing children, the team says.

The researchers, from the University of Washington, have described their findings in a paper posted online, and they plan to present their findings at the IEEE Computer Vision and Pattern Recognition conference in Ohio in June.

They explain that modeling and predicting the change in a person’s face is quite tricky because the aging process depends on environmental and genetic factors that may not be apparent in a single photo captured during youth.

Additionally, there is not much available data to build models on, given that age analysis databases are small, have low resolution or are limited in age range.

“Aging photos of very young children from a single photo is considered the most difficult of all scenarios,” explains Ira Kemelmacher-Shlizerman, assistant professor of computer science and engineering, “so we wanted to focus specifically on this very challenging case.”

She explains that, for this work, they used photos of children taken in uncontrolled conditions and found that their method “works remarkably well.”

Currently, age-progressed renderings of missing children are typically created by artists, who use editing software, along with photos of the child and family members, to predict how the face will change with age.

However, the researchers say this is a very time-consuming technique, and one that is especially difficult to apply to children younger than 5 years old, whose faces still look quite baby-like.

But the team’s new technique takes into account thousands of faces of the same gender and age, taken from Google image searches. Other images containing age information – such as photos of science contests, beauty contests and soccer teams – are also pulled into the database. The software then calculates the average physical changes in the group, applying those changes to the subject’s face.
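The averaging step described above can be loosely sketched in Python. Everything here – the function name, the `faces_by_age` structure and the simple pixel-difference transfer – is an illustrative simplification, not the researchers’ actual method, which operates on aligned faces and models shape and appearance changes separately:

```python
import numpy as np

def age_progress(subject, faces_by_age, src_age, dst_age):
    """Simplified sketch: shift a subject's face by the average
    difference between two age-group averages.

    `faces_by_age` maps an age (in years) to an array of aligned
    face images with shape (n, h, w); pixel values are in [0, 1].
    All names and the pixel-space transfer are illustrative only.
    """
    avg_src = np.mean(faces_by_age[src_age], axis=0)  # average face at the subject's age
    avg_dst = np.mean(faces_by_age[dst_age], axis=0)  # average face at the target age
    # Apply the group-level change to the individual face.
    return np.clip(subject + (avg_dst - avg_src), 0.0, 1.0)
```

The key idea the article describes survives even in this toy form: the change applied to the subject comes from the population, so no longitudinal photos of that individual are needed.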

Age progression
The researchers used their software on a single photo of a child (far left). They then age-progressed the photo (left in each pair) and compared it with actual photos of the same person at the corresponding age (right in each pair).
Image credit: University of Washington

To evaluate the approach, the team tested their algorithm-created images against photos of 82 real people who were photographed over many years.

They then conducted an experiment in which participants were asked to spot the real-life photos. Results revealed that 44% of people chose the real-life photos, while 37% chose the rendered images. Of the remainder, 15% said both were equally likely to be the real photo, while 5% said neither was likely.

“Our extensive user studies demonstrated age progression results that are so convincing that people can’t distinguish them from reality,” says co-author Steven Seitz, professor of computer science and engineering.

“When shown images of an age-progressed child photo and a photo of the same person as an adult, people are unable to reliably identify which one is the real photo,” he adds.

The researchers explain that their software runs on a standard computer and takes only around 30 seconds to generate results for one face.

Additionally, to compensate for factors in the original photo – such as lighting, shadows, facial expressions or, yes, even milk moustaches – the algorithm first corrects for tilted faces, turned heads and variable lighting, and only then applies the shape and appearance changes to the face.
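The ordering described above – normalize the photo first, apply the aging change last – can be sketched as a small pipeline. The individual steps here are hypothetical stand-ins (the real pose correction and relighting are far more involved), so treat this only as an outline of the structure:

```python
import numpy as np

# Hypothetical stand-ins for the normalization steps described above.
def correct_pose(img):
    return img  # placeholder: warp tilted/turned faces to a frontal, aligned pose

def normalize_lighting(img):
    # Crude global illumination normalization; real relighting is much richer.
    return img / max(float(img.mean()), 1e-8)

def apply_age_changes(img, delta):
    # Apply the learned shape/appearance change, keeping pixels in [0, 1].
    return np.clip(img + delta, 0.0, 1.0)

def progress_photo(photo, delta):
    # Order matters: pose and lighting are factored out first, so the
    # aging change is applied to a normalized face rather than raw pixels.
    return apply_age_changes(normalize_lighting(correct_pose(photo)), delta)
```

The point of this structure is that the aging model never has to explain away head pose or shadows; it only has to model the change due to age itself.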

The team says that, for future improvements, they plan to incorporate other attributes, such as wrinkles, hair whitening and ethnicity. “I’m really interested in trying to find some representation of everyone in the world by leveraging the massive amounts of captured face photos,” says Kemelmacher-Shlizerman. “The aging process is one of many dimensions to consider.”

Medical News Today recently reported that researchers from Ohio State University taught a computer to recognize 21 different human emotions.