Video software to "see" someone's pulse

27 Responses to “Video software to "see" someone's pulse”

  1. Brad H. says:

    We are now getting closer to a real Voight-Kampff machine.

  2. Juby Monkey says:

    Hey!  That video of the pulse on the wrist shows the pulse in the wrong spot!   Don’t you usually feel for arterial pulse on the same side of the wrist as the thumb?  That’s what I learned in my first aid classes at least…

    • rrh says:

      Well, it may be different for feeling and seeing.

      Actually, looking at my wrists right now, I can see my pulse without any camera assistance. Is that a bad sign? Should I go see a doctor?

    • Snig says:

      There are a couple of arteries where you can palpate the wrist pulse: the radial and the ulnar. The radial is more prominent on palpation. Some folks also have one in the middle (the median).

      • Snig says:

        The mnemonic for remembering that the thumb side is the Radial side: twiddle your thumb and remember that your thumb goes Round and Round.

    • You can feel a pulse anywhere you can compress an artery against bone.  Try it, you’ll find one on that side too.

  3. dross1260 says:

    Thanks for enhancing my diastolic pressure.

  4. awgrbr says:

    As an architect, I would be very interested in seeing deflections in bridges and buildings based on live loads and wind loads. Just for coolness. Like this, but in real time:
    http://gothamist.com/2009/07/24/video_manhattan_bridge_sways.php

    • bcsizemo says:

      You could just mount a low-powered laser to the top of the building/middle of the bridge and have it point toward a target.  The movement should be visible as the laser dot moves around the target.

      I mean, it’s not as cool as just recording a video of the building and then running it through some computations.  But something large-scale like that would certainly get kids involved/interested in things like this.

    • As long as there are features that provide contrast, you can do digital image correlation (DIC) if you have images from various angles. To look at strain fields in pressure vessels, we used a sponge and tempera paint to cover the thing with speckles to provide the contrast. The rest is voodoo to me – I’m not the DIC guy at work.

  5. CTG says:

    You sure those MIT potheads didn’t just download the Philips Vital Signs app that was available in 2011 and rebrand it as their own? http://itunes.apple.com/us/app/vital-signs-camera-philips/id474433446?mt=8

  6. pjk says:

    I’m sure this will be used for good.

  7. bcsizemo says:

    I don’t really get the vibration thing.  If you already have a high-speed input, then if you slow that down and apply motion vectors to it, wouldn’t that show you where the motion is?  Like the last comparison with the DSLR.  (I assume he means the DSLR pictured is snapping pictures while they film it in high speed.)  In the unfiltered video nothing appears to be happening, but if they are being played back at the same speed, I should be able to see that level of motion in the video.  It seems like their computation is exaggerating the movements to make them more visible, not just identifying them.

    • That’s exactly what it’s doing. Read the notes under the vids; the whole point is that the software intelligently amplifies motion, making it more obvious. Otherwise it would just be a high-speed camera :)

      • bcsizemo says:

        But in terms of being beneficial, like seeing a pulse, I think they are presenting it very badly.

        Exaggerating the movement is something they have done after/during the detection, not an application with knowledge of the movement… er, let me explain that.

        There are a lot of video filters for AviSynth that do similar things.  Take a frame, break it down, and look for movement; look at the previous and future frames and look for movement there; then analyze all those motion vectors to see whether what you really have is genuine movement or something like dust or grain.  In this instance, apply a smoothing filter to only those areas with non-uniform movement… and you have a temporal degrain filter.

        So in that example there is the intermediate step of comparing all the motion vectors (which seems to be what these comparisons are showing), but that’s not the actual output – in my example the output is the degrained frame from the original video.  The example with the infant was good: show the heartbeat (the output) with the color overlay/change on the infant in the original video (the intermediate step).

        Just presenting the intermediate step without usable context makes it confusing as to what information the video is trying to present.
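The degrain pipeline described in that comment can be sketched roughly like this. This is a toy version that substitutes plain frame-to-frame differencing for a real motion-vector search; the function name and threshold are illustrative, not from any actual AviSynth filter:

```python
import numpy as np

def temporal_degrain(frames, motion_thresh=10.0):
    """Toy temporal degrain: average each pixel with its temporal
    neighbors only where frame-to-frame change is small (likely grain),
    leaving pixels with real motion untouched."""
    frames = np.asarray(frames, dtype=float)
    out = frames.copy()
    for t in range(1, len(frames) - 1):
        # Small differences against both neighbors suggest grain, not motion.
        prev_diff = np.abs(frames[t] - frames[t - 1])
        next_diff = np.abs(frames[t] - frames[t + 1])
        static = (prev_diff < motion_thresh) & (next_diff < motion_thresh)
        # Smooth only the "static" pixels; keep moving pixels as-is.
        smoothed = (frames[t - 1] + frames[t] + frames[t + 1]) / 3.0
        out[t] = np.where(static, smoothed, frames[t])
    return out
```

The point of the sketch is the split the comment describes: motion analysis is an intermediate step, and the visible output (here, the degrained frame) is something built on top of it.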

  8. A lot of this ‘frequency separation’ work even on still images is filtering down to more ordinary Photoshop retouchers in some unlikely places:
    http://www.modelmayhem.com/po.php?thread_id=439098

    In this animated gif example an artist removes the large scale wrinkles in a suit without disturbing the high frequency fabric pattern:
    http://2.bp.blogspot.com/_h5_nuNOj9N4/TM0COjcb-hI/AAAAAAAAACc/WwMP1HPNxLE/s1600/digital-steam.gif

  9. robcat2075 says:

    This is the beginning of that pepper shaker Dr. McCoy would wave around in front of people to deduce their vital signs.

  10. Jack Kieffer says:

    This is mind blowing, and it almost makes me wonder: are we human, or are we dancer?  Not really because I hate that song.

  11. Awesome technology. It will rock the world. Now technology combined with medical science will help mankind.

  12. retepslluerb says:

    Err… perhaps I’m missing something, but Philips’ Vital Signs Camera app has been available since November 2011 and produces reasonable results.

    • Yes. You are missing something. Remember those shaking blurry spots? They were all slightly shaking at different rates and they were able in post to select an arbitrary frequency and anything slightly shaking at that frequency had its swing greatly amplified while nothing shaking at other rates was affected. And they could select and change the target frequency at will. That’s pretty slick.
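The frequency-selective amplification described in that reply can be sketched in a toy one-dimensional form: take a single pixel's brightness over time, boost only the components near a chosen frequency, and leave everything else alone. The function name and parameters below are illustrative, not the researchers' actual code:

```python
import numpy as np

def amplify_frequency(signal, fps, target_hz, bandwidth, gain):
    """Toy frequency-selective amplification for one pixel's time series:
    multiply only the FFT bins near target_hz by (1 + gain)."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Select bins within bandwidth/2 of the chosen frequency.
    band = np.abs(freqs - target_hz) <= bandwidth / 2.0
    spectrum[band] *= (1.0 + gain)
    return np.fft.irfft(spectrum, n=len(signal))
```

Run per pixel over a video, this is the effect being described: anything oscillating near the selected frequency has its swing greatly amplified, while motion at other rates is untouched, and the target frequency can be changed at will in post.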

  13. Ray Perkins says:

    Isn’t “highlighting differences between successive frames” exactly what all video compression is doing in the first place?

  14. AwesomeRobot says:

    The implications of something like this built into Google’s Glass project, or another AR system are staggering. It can pretty much turn people into walking lie-detector tests.

  15. Tomer Apel says:

    Hello,
    As can be seen in the links below, my partner and I implemented such a system about a year ago:

    http://in.bgu.ac.il/en/Pages/news/SIDS_bgu.aspx
    http://israel21c.org/health/the-software-that-could-prevent-crib-death/
