What we taste is intertwined not just with what we smell, but also with other sensory inputs. What we can see, in particular, may alter how we perceive the flavor of food — at least this is what a range of experiments using virtual reality settings have shown.

New research suggests that where we are influences what we taste.

Our sense of taste rules many of our dietary choices, as we tend to pick foods that we enjoy over less appealing ones.

More importantly, what we taste sends essential signals to our brains, alerting us immediately if food has gone off.

Taste is crucial to how we make our way through life, and throughout history, it has helped keep the human race alive. However, many different factors can influence our perception of the same culinary flavors.

According to a new study conducted by researchers from Cornell University in Ithaca, NY, what we taste when we eat is significantly influenced by where we are when we have our meals.

“When we eat, we perceive not just the taste and aroma of foods; we get sensory input from our surroundings — our eyes, ears, even our memories about surroundings.”

Robin Dando, senior author

The team’s findings have recently appeared in the Journal of Food Science.

To test how an individual’s surroundings can alter their perception of taste, the researchers asked about 50 people to participate in a virtual reality experiment.

Through virtual reality headsets, each participant experienced, by turns, three different environments: a sensory booth, a park bench, and a cow barn.

In each of these visual contexts, the participants ate an identical sample of blue cheese. The researchers then asked them to rate how much they enjoyed the cheese in each context, and to assess its saltiness and pungency in each case.

Sure enough, the virtual reality setting influenced the participants’ taste perceptions. When they ate the cheese sample while they were “in the cow barn,” they rated the food as more pungent than in the other settings.

“We consume foods in surroundings that can spill over into our perceptions of the food,” Dando says.

The study also yields a practical finding: scientists can easily and inexpensively adapt virtual reality technology for food sensory evaluation, one of the main methods of analysis in sensory science.

Food sensory science focuses on how individuals perceive, and respond to, different types of foods and drinks, and this type of research has various applications.

One important application, for instance, is to improve the experience of eating for older people. As some people age, they may lose part of their sense of taste and thus find food less appealing, which may lead them to eat less, or less healthfully.

Food sensory evaluation that calls for exposure to different environments would normally require researchers to physically recreate those surroundings in a laboratory setting.

However, virtual reality allows researchers to recreate a varied set of conditions with ease, and without having to invest in a wide range of materials and other resources.

“This research validates that virtual reality can be used, as it provides an immersive environment for testing,” Dando explains.

“Visually, virtual reality imparts qualities of the environment itself to the food being consumed — making this kind of testing cost-efficient,” he adds.