A new algorithm can measure heart rates of people shown in regular digital video by examining imperceptibly small head movements that come with a rush of blood caused by the heart’s contractions.

Researchers from MIT’s Computer Science and Artificial Intelligence Laboratory developed the algorithm, which produced pulse estimates consistently within a few beats per minute of those made by electrocardiograms (EKGs).

Additionally, it was able to estimate the time intervals between beats – a measurement used to identify people who are at risk for cardiac events.

The researchers will present their findings this summer at the Institute of Electrical and Electronics Engineers’ Computer Vision and Pattern Recognition conference.

They believe the video-based pulse-measurement system could be used for checking the heartbeats of seniors or newborns, whose skin may be harmed by repeated attachment and removal of EKG leads.

John Guttag, the Dugald C. Jackson Professor of Electrical Engineering and Computer Science and director of MIT’s Data-Driven Medicine Group, said:

“From a medical perspective, I think that the long-term utility is going to be in applications beyond just pulse measurement. Can you use the same type of techniques to look for bilateral asymmetries? What would it mean if you had more motion on one side than the other?”


Theoretically, the method could calculate cardiac output, or the volume of blood pumped by the heart, which is often used in diagnosing many types of heart disease. Before the advent of the echocardiogram, cardiac output was estimated by measuring exactly the types of mechanical forces that the new algorithm registers, says Guttag.

He explains, “I think this should be viewed as proof of concept. It opens up a lot of potential flexibility.”

The algorithm combines several methods that are well known in the field of computer vision. It starts by using standard face recognition to distinguish the person’s head from the rest of the image. Next, it randomly selects 500 to 1,000 distinct points, clustered around the person’s mouth and nose, whose movements it follows from frame to frame.

Then, it filters out any frame-to-frame movements whose temporal frequency falls outside the range of a regular heartbeat – about 0.5 to 5 hertz, or 30 to 300 cycles per minute. That removes movements that recur at a lower frequency, such as those caused by regular breathing and slow changes in posture.
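The filtering step can be sketched with a simple FFT-based band-pass in NumPy (an illustrative choice – the article does not specify the filter design), assuming a per-point motion trace sampled at 30 frames per second:

```python
import numpy as np

fps = 30.0                     # assumed camera frame rate
t = np.arange(0, 10, 1 / fps)  # 10 s of one point's vertical position

# Toy trace: small 1.2 Hz "pulse" motion buried under a much larger
# 0.2 Hz breathing sway plus a slow postural drift.
trace = (0.1 * np.sin(2 * np.pi * 1.2 * t)
         + 1.0 * np.sin(2 * np.pi * 0.2 * t)
         + 0.02 * t)

# Zero out Fourier components outside 0.5-5 Hz (30-300 beats/min).
spectrum = np.fft.rfft(trace)
freqs = np.fft.rfftfreq(trace.size, d=1 / fps)
spectrum[(freqs < 0.5) | (freqs > 5.0)] = 0.0
filtered = np.fft.irfft(spectrum, n=trace.size)

# The dominant surviving frequency is the pulse-band motion.
peak_hz = freqs[np.argmax(np.abs(np.fft.rfft(filtered)))]
print(round(peak_hz, 1))  # 1.2
```

Breathing and drift are rejected outright because they live entirely below the 0.5 Hz cutoff, while the pulse-band component passes through untouched.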

Lastly, using a method known as principal component analysis, the algorithm decomposes the resulting signal into several constituent signals, each representing a component of the remaining movement that is not correlated with the others. Of those signals, it selects the one that appears most regular and falls within the typical frequency band of the human pulse.
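The decomposition and selection steps can be sketched in NumPy, computing the principal components via an SVD of the mean-centred traces and scoring each component by how sharply its spectral energy concentrates at one frequency (the regularity criterion here is an illustrative stand-in for whatever measure the researchers used):

```python
import numpy as np

fps = 30.0
t = np.arange(0, 10, 1 / fps)
rng = np.random.default_rng(1)

# Toy stand-in for the filtered trajectories: each of 50 tracked
# points mixes a shared 1.1 Hz pulse motion with independent noise.
pulse = np.sin(2 * np.pi * 1.1 * t)
traces = (np.outer(rng.uniform(0.5, 1.5, 50), pulse)
          + 0.8 * rng.standard_normal((50, t.size)))

# PCA via SVD of the mean-centred traces; each row of vt is one
# candidate source signal in the time domain.
centred = traces - traces.mean(axis=1, keepdims=True)
_, _, vt = np.linalg.svd(centred, full_matrices=False)

# Score each leading component by how much of its spectral energy
# sits in a single frequency bin, and keep the most periodic one.
freqs = np.fft.rfftfreq(t.size, d=1 / fps)

def periodicity(sig):
    power = np.abs(np.fft.rfft(sig)) ** 2
    return power.max() / power.sum()

best = max(vt[:5], key=periodicity)
bpm = 60 * freqs[np.argmax(np.abs(np.fft.rfft(best)))]
print(round(bpm))  # 66 beats per minute from the 1.1 Hz component
```

Because the pulse motion is the one signal shared across all tracked points, it dominates a leading principal component even though each individual trace is mostly noise.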

Guha Balakrishnan, a graduate student in MIT’s Department of Electrical Engineering and Computer Science and a researcher on the study, also developed a variation of the algorithm that does not use face recognition. It is less accurate, but it can still produce a reasonable estimate of pulse rate from video of the back of a person’s head.

Written by Kelly Fitzgerald