Video software to "see" someone's pulse

[Video stills from the MIT News article]

MIT researchers developed software that highlights differences between successive frames of video that are usually too subtle or quick to catch: "So, for instance, the software makes it possible to actually 'see' someone's pulse, as the skin reddens and pales with the flow of blood (video stills above), and it can exaggerate tiny motions, making visible the vibrations of individual guitar strings or the breathing of a swaddled infant in a neonatal intensive care unit." From "Researchers amplify variations in video, making the invisible visible".
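The core trick the article describes can be sketched in a few lines. This is a hedged simplification, not the researchers' actual pipeline: the real method also decomposes each frame into a spatial pyramid, which this sketch skips. The idea is to treat each pixel's value over time as a signal, band-pass filter it around the frequency of interest (around 1 Hz for a resting pulse), amplify the filtered component, and add it back:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def amplify_color_variation(frames, fps, lo=0.8, hi=1.5, gain=50.0):
    """Amplify temporal variation in a band of frequencies.

    frames: (T, H, W) float array of grayscale frames.
    lo, hi: passband in Hz (defaults roughly cover a resting pulse).
    """
    # Design a Butterworth band-pass filter (cutoffs normalized to Nyquist).
    b, a = butter(2, [lo / (fps / 2), hi / (fps / 2)], btype="band")
    # Filter along the time axis for every pixel independently.
    filtered = filtfilt(b, a, frames, axis=0)
    # Add the amplified band-limited variation back onto the original.
    return frames + gain * filtered
```

With real video you would run this on each color channel; the reddening and paling of skin shows up as a tiny oscillation that the gain makes visible.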



  1. Hey!  That video of the pulse on the wrist shows the pulse in the wrong spot!   Don’t you usually feel for arterial pulse on the same side of the wrist as the thumb?  That’s what I learned in my first aid classes at least…

    1. Well, it may be different for feeling and seeing.

      Actually, looking at my wrists right now, I can see my pulse without any camera assistance. Is that a bad sign? Should I go see a doctor?

    2.  There are a couple of arteries where you can palpate the wrist pulse: the radial and the ulnar.  The radial is more prominent on palpation.  Some folks also have one in the middle (the median).

      1. The mnemonic for remembering that the thumb side is the Radial side: twiddle your thumb and remember that it goes Round and Round.

    1. You could just mount a low-powered laser to the top of the building/middle of the bridge and have it point toward a target.  The movement should be visible as the laser dot moves around the target.

      I mean it’s not as cool as just recording a video of the building and then running it through some computations.  But something large scale like that would certainly get kids involved/interested in things like this.

    2. As long as there are features that provide contrast you can do digital image correlation (DIC) if you have images from various angles. To look at strain fields in pressure vessels we used a sponge and tempera paint to cover the thing with speckles to provide the contrast. The rest is voodoo to me – I’m not the DIC guy at work.

  2. I don’t really get the vibration thing.  If you already have a high-speed input, then if you slow that down and apply motion vectors to it, wouldn’t that show you where the motion is?  Like the last comparison with the DSLR.  (I assume he means the DSLR pictured is snapping pictures while they film it in high speed.)  In the unfiltered video nothing appears to be happening, but if they are being played back at the same speed I should be able to see that level of motion in the video.  It seems like their computation is exaggerating the movements to make them more visible, not just identifying them.

    1. That’s exactly what it’s doing. Read the notes under the vids, the whole point is the software intelligently amplifies motion- so makes it more obvious. Otherwise it would just be a high speed camera :)

      1. But in terms of being beneficial, like seeing a pulse, I think they are presenting it very badly.

        Exaggerating the movement is something they have done after/during the detection, not an application with the knowledge of the movement…er. let me explain that.

        There are a lot of video filters for avisynth that do similar things.  Take a frame, break it down, look for movement, look at the previous and future frames, look for movement, analyze all those movement vectors to see if what you really have is accurate movement or something like dust or grain.  In this instance apply a smoothing filter to only those areas with non uniform movement…and you have a temporal degrain filter.

        So in that example there is the intermediate step of comparing all the motion vectors (which seems to be what these comparisons are showing), but that’s not the actual output – in my example the output is the degrained frame from the original video.  The example with the baby/infant was good: show the heartbeat (output) with the color overlay/change of the infant on the original video (intermediate step).

        Just presenting the intermediate step without usable context makes it confusing as to what information the video is trying to present.
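The temporal-degrain idea described above can be sketched roughly like this. This is a hedged stand-in, not the actual AviSynth filter: instead of full motion-vector analysis it uses a cheap 3-frame temporal median as the "motion" reference, and treats small deviations from it as grain while leaving large (presumably real) motion alone:

```python
import numpy as np

def temporal_degrain(frames, noise_thresh=5.0):
    """Very rough temporal degrain sketch.

    frames: (T, H, W) array; returns a denoised copy (interior frames only).
    """
    out = frames.astype(float).copy()
    prev, cur, nxt = frames[:-2], frames[1:-1], frames[2:]
    # Median over the 3-frame window stands in for real motion analysis.
    med = np.median(np.stack([prev, cur, nxt]), axis=0)
    # Small deviations from the temporal median are treated as grain...
    grain = np.abs(cur - med) < noise_thresh
    # ...and smoothed away; large deviations (real motion) are kept as-is.
    out[1:-1][grain] = med[grain]
    return out
```

A real filter would compare forward and backward motion vectors before deciding what is noise, which is exactly the intermediate step the parent comment is talking about.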

  3. Err.. perhaps I’m missing something, but Philips’ Vital Signs Camera app has been available since November 2011 and produces reasonable results.

    1. Yes. You are missing something. Remember those shaking blurry spots? They were all slightly shaking at different rates and they were able in post to select an arbitrary frequency and anything slightly shaking at that frequency had its swing greatly amplified while nothing shaking at other rates was affected. And they could select and change the target frequency at will. That’s pretty slick.
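      The frequency selection described above can be illustrated with a simple sketch. This assumes an FFT-based approach for clarity; the actual paper's filters may differ. Take one pixel's value over time, amplify only the spectral bins near a chosen frequency, and anything oscillating at other rates is left untouched:

      ```python
      import numpy as np

      def amplify_frequency(signal, fps, target_hz, bandwidth=0.2, gain=20.0):
          """Amplify only the components of `signal` near `target_hz`."""
          spec = np.fft.rfft(signal)
          freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
          # Select the bins within `bandwidth` Hz of the target frequency...
          band = np.abs(freqs - target_hz) < bandwidth
          # ...and boost only those; everything else passes through unchanged.
          spec[band] *= gain
          return np.fft.irfft(spec, n=len(signal))
      ```

      Change `target_hz` and a different set of "shaking blurry spots" lights up, which is the arbitrary-frequency selection the parent comment found slick.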

  4. The implications of something like this built into Google’s Glass project, or another AR system are staggering. It can pretty much turn people into walking lie-detector tests.
