HD TV and the placebo effect

36 Responses to “HD TV and the placebo effect”

  1. KevinMarks says:

    This depends a lot on the quality of the upscaler in the TV – when we finally bought an HDTV, we returned the Samsung we bought first in favour of a Sony, because the Samsung revealed exactly how awful MPEG-compressed NTSC looks, whereas the Sony puts some serious post-processing work into making the standard-def signal look better.

  2. Brainspore says:

    I work with people who specialize in video and don’t seem to be bothered when the aspect ratio is set wrong. It boggles the mind…

  3. Anonymous says:

    In the eighties and nineties there existed a species known as the “film snob” — a person who insisted that scratches, expense, noise, fragility and inconvenience were a small price to pay for the benefits of a “warmer, fuller” picture. It is no accident these are the same terms used by audiophiles. Granted, people had no conception of how to get good results out of the alternative: videotape. They used it only because it was cheap.

    Then digital came along, and slowly but surely people began to appreciate its strengths. The problem is, one of them is as much a weakness as a strength.

    Faceshots in 1080p at 40+ inches are bigger and more detailed than my bathroom mirror. Every pore, makeup smudge, tiny hair, or stray eyelash is there at 10x lifesize.

    I like clarity, and high detail is nice for busy images. Sports programming clearly benefits, and maybe action movies, but personal interactions will be shot from further out in future, mark my words.

    Also, lower-res online-delivered movies benefit from a little analog smudging of their blocky artifacts. Without sharp corners, one is less distracted by the compression.

  4. Takashi Omoto says:

    Many non-US users (Japan and most European countries) have also been used to RGB component output (via their SCART cables) since the mid-’90s. This gave an improved image on cable/satellite tuners and even the lowest-end DVD players, compared with the composite output that is common in the US. This has nothing to do with the PAL color format, however, other than using the higher “625i” PAL resolution.

    HDTV might also be compared to PALplus, which was an analog 576i anamorphic 16:9 format that European users had (though it’s mostly dead now, and I doubt most people actually remember it).

  5. Church says:

    @Brainspore I know people who do, e.g., work for Fox et al. They tend to shoot in SD, and it gets uprezzed to HD, so a fair portion of the time your “High Def” picture is just uprezzed SD. Eerily similar to this study, actually.

  6. robulus says:

    “Participants were unable to discriminate properly between digital and high-definition signals,”

    No, that is not what the test, as stated, shows. If you showed them footage in SD and HD, asked them to pick which was which, and they couldn’t, then you could make that claim.

    Showing someone footage you claim is HD and recording their failure to correct you as an inability to discriminate is ridiculous.

    There is so much variation in source material, and, as others have said, in the upscaling hardware in modern components, that you can get some really great images on SD. On the other hand, if you played me the opening credits of True Blood, I’d be able to tell you in three seconds whether it was HD or standard, because I know the source material so well.

  7. Daemon says:

    Sounds more like the Asch conformity experiment to me.

    http://en.wikipedia.org/wiki/Asch_conformity_experiments

  8. anonymous says:

    Like others have mentioned, there’s not a great deal of methodological detail here.

    Did they simply rate the two? Or was there a forced-choice decision of which it was (HD or non)?

    The two will typically yield different results; the second method has greater power.

    Likewise, it sounds like (due to the cable and environment changes) this was a between-subjects design. Within-subjects would be preferable.
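
    To make that concrete, here is a minimal Monte Carlo sketch (all effect sizes are my own assumptions, not numbers from the study) of why a within-subjects forced choice detects a small quality difference more often than between-subjects ratings:

    ```python
    # Minimal power sketch: between-subjects rating vs. within-subjects
    # two-alternative forced choice (2AFC). All parameters are assumptions.
    import random
    import statistics

    def rating_detects(n=30, effect=0.3, sd=1.5):
        """One group rates the 'SD' clip, another the 'HD' clip (1-10 scale).
        Detect if the group means differ by more than ~2 standard errors."""
        a = [random.gauss(5.0, sd) for _ in range(n)]
        b = [random.gauss(5.0 + effect, sd) for _ in range(n)]
        se = (statistics.variance(a) / n + statistics.variance(b) / n) ** 0.5
        return abs(statistics.mean(b) - statistics.mean(a)) > 2 * se

    def forced_choice_detects(n=30, p_correct=0.65):
        """Each viewer sees both clips and must pick the HD one.
        Detect if accuracy beats 50% guessing by more than ~2 standard errors."""
        hits = sum(random.random() < p_correct for _ in range(n))
        return hits / n - 0.5 > 2 * (0.25 / n) ** 0.5

    trials = 5000
    print("rating design:       ", sum(rating_detects() for _ in range(trials)) / trials)
    print("forced-choice design:", sum(forced_choice_detects() for _ in range(trials)) / trials)
    ```

    Under these made-up parameters the forced-choice design flags the difference noticeably more often, which is why the distinction matters when reading the study.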

  9. EggyToast says:

    The big problem with tests like this, for me, is similar to any quality test — you can’t judge quality if you’re not sure what quality is. I have some very nice headphones, but I appreciate them because I know what I’m listening for — good bass, clear middles, sharp highs, and a good balance. For plenty of people, they just want headphones to play music, and they’re not listening to the quality of the audio.

    It’s no different from televisions, or even cars. I drive a Corolla, and I discovered that while the engine is sturdy, it’s all the little parts inside the car that fall apart, because the car is cheap. With television, if you’ve never seen a Blu-ray movie and compared it to the normal DVD, you won’t even know what the difference is. You’ll get a general sense of “this looks better…” but you won’t be able to say why.

    I’ve done comparisons at home but rarely is it that noticeable on the first watch — because you’re not judging the picture quality, but processing the content. Some would argue that people who appreciate high quality content can’t see the forest for the trees, but usually it’s the opposite — they already get the bigger picture, and now they have the time to look for the details. And when they see the finer details — hair, textures, lights — they can then appreciate them more.

  10. airshowfan says:

    I’ll agree with Ulotrichous and Daemon. This experiment is more about conformity than anything else.

    But that doesn’t change the fact that most people are pretty bad at distinguishing the quality of audio and images.

    Then again, when it comes to distinguishing quality, I wonder to what extent it is something we can turn off. As someone who does a lot of photography, I have a pretty good eye for sharpness (I can see it decay as you move away from the center of a photograph, and I can see that some cameras can focus the very same lenses more sharply than other cameras can) and for noise (going from a fixed-lens camera with a small sensor (Lumix, PowerShot, Coolpix) to a full-frame SLR is like night and day). As someone who has done some web-design work, I have a pretty good eye for compression artifacts (after figuring out, for every image, how much I could compress it before the artifacts got too noticeable). And when I first saw a Blu-ray being shown (properly) on an HDTV at a store, I was very impressed.

    BUT, in my living room I have a standard-res CRT TV. It’s huge, was very expensive several years ago, and is about as good a CRT TV as you can get… but it’s still lower-res than the newer stuff. And guess what, I don’t care. I enjoy my shows and movies a whole lot. On my big CRT TV, I can become immersed in a movie almost as thoroughly as at the movie theater. The lower resolution is not distracting. Heck, compression artifacts on digital-broadcast TV, and even some color-banding and other artifacts on single-layer DVDs, are more distracting. I think that being able to see sharp lines and tiny details and sharp textures on an HDTV might itself distract me from the content. However, if I do get an HDTV and become used to seeing textures and sharp lines and tiny details, maybe I will later be distracted by blurry standard-def images, just as I now look at my early photography work and am appalled that I ever thought it was of acceptable quality.

    (On the other hand, I expect the video in movies to have certain deficiencies, and it looks “wrong” when they are removed. My parents have a huge HDTV, wired up properly; HD video looks super sharp, etc. But when it is given a non-HD signal, by default it does noise reduction, sharpening, and some kind of morphing interpolation between frames to bring the framerate up. This makes the video look better and smoother and even gets rid of most compression artifacts… but while watching a movie, it really bugged me. I was expecting to see grain, and I was expecting to see that choppiness you get in low-framerate, fast-shutter-speed filmed video… So I turned off the noise reduction and between-frame morphing, and sighed with relief when the result looked the way I wanted it to look.)

    I like how Technogeek says we see what should be there rather than what’s actually there. I definitely do this for audio; the recording is almost a reminder of the song, rather than the song itself. Unless it’s of really terrible quality (compressed to less than half the bitrate you get on iTunes, or broadcast over radio that’s not being received well), I enjoy it and can be immersed in it.

    There is a limit, however. I cannot understand how so many people can be so blind to wildly distorted aspect ratios. (And it’s weird how “digital” this is; either these distortions bug you a lot, or not at all. It’s kinda like how some large fraction of the population has no problem with using quotes for emphasis, much to the amusement of the rest of us who think those quotes make words sound sarcastic/facetious.)

  11. cognitive dissonance says:

    it’s like telling people that more expensive vodka tastes better, and making people fiercely brand-loyal, until Dateline/48 Hours or whatever runs a double-blind study and finds everyone likes Smirnoff more than anything else.

    i believe there’s a similar cost-placebo effect with telling people the price of wine and then asking them to evaluate their satisfaction.

    and leave it to me to have not one, but two alcohol-related analogies.

  12. ulotrichous says:

    This isn’t the placebo effect. This is the “how many people are willing to look like an uncultured idiot in front of an AV enthusiast” effect. Nobody wants to say that they don’t see the difference; it implies a lower overall capacity for perception and processing, and there is essentially no social cost to pay for lying and saying that you see a difference when you don’t think you do. It makes the researcher happy and makes you feel like you’re not going to be one of the poor schmucks in the survey who couldn’t tell the difference between SD and HD.

    “Hey, look at this awesome new thing! Isn’t it great and really amazing?”

    “Uh, yes! Yes it is!”

    I think this is a very interesting study in peer pressure and survey behavior. But perceptions of video quality? Not so much!

  13. Alan Braggins says:

    A while ago we had an American visitor look at our TV (showing a standard resolution Freeview picture over SCART) and say “oh, I see you have High-Def TV”.

    • arkizzle / Moderator says:

      Alan, in fairness the PAL signal is of a higher resolution than NTSC. S/he probably just noticed the sharper image and presumed HD.

      • mn_camera says:

        PAL is of a higher spatial resolution than NTSC, while being of a lower temporal resolution.

        (625 scan lines vs. 525, 25 fps vs. 30 – really 29.97 in order to get the color subcarrier in.)
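
        For reference, that odd figure is the 30 fps monochrome rate slowed by a factor of 1000/1001, done so the color subcarrier would not interfere with the audio carrier:

        \[ 30 \times \tfrac{1000}{1001} \approx 29.97\ \text{fps} \]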

  14. masamunecyrus says:

    If they used the same quality of “HD” that Best Buy and other stores do, no wonder people don’t notice the difference. I’ve never been to a TV store here in America that didn’t look like it was upscaling VHS tapes, re-encoding them at low quality, and then transmitting them to the TVs over an antenna.

    I mean, c’mon. That’s like saying you can’t tell the difference between 640×480 and 1920×1080 on a computer monitor. That’s an unmistakable, gigantic difference, no matter who you are.
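
    For scale, the raw pixel counts:

    \[ \frac{1920 \times 1080}{640 \times 480} = \frac{2{,}073{,}600}{307{,}200} = 6.75 \]

    so 1080p carries 6.75 times the pixels. Whether a store’s signal chain preserves any of that is another matter.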

  15. Elvis Gump says:

    I do phone tech support for DirecTV, and I can tell you the biggest problem is that most people don’t even take the time to read the instructions or educate themselves even a little bit about the equipment they have. Customers will have a standard-def receiver and think buying an HDTV magically gets them an HD picture. Or they will hook up an HD receiver to their new HDTV with standard-def RCA cabling and wonder why the picture looks crappy.

    When it comes to TV, the vast majority of people barely know how to work the remote control. I spend a good part of my tech-support day educating customers on this. When I talk them through the correct cabling and settings, I often get yelps of surprise and joy when they finally get an HD picture on their sets.

    It tends to offset the depression I have because people in general hate their satellite and cable TV companies and scream at me until I can gently lead them, over the phone, to correctly set up their equipment. A lot of techs in the field tend to know only how to use drills and tools to run cabling, but not always how to wire receivers and TVs properly. That seems endemic among both the satellite and cable installer techs I’ve dealt with.

    If you have a crappy picture, check the instructions and read the manual. You are your own best defense when it comes to getting the picture quality you paid for.

  16. tim says:

    As explained to me many years ago by a BBC video engineer -
    NTSC -> Never Twice the Same Colour
    SECAM -> Surely Europeans Can Achieve More
    PAL -> Pictures At Last.

  17. Ian70 says:

    I’ve seen -plenty- of upscaled TV on HDTV channels, and even plenty of STRETCH-O-VISION warping of 4:3 TV onto 16:9 on the HD A&E network, though thankfully that is less common. (Honestly, WTF is wrong with those idiots?)
    There are so many programs that are shot on film before being transferred to TV and/or HDTV, and so many different levels of quality of various productions, that it’s no wonder that people aren’t always able to distinguish just what a real proper HDTV signal is -supposed- to look like.

    If you really want to experience true HDTV, either watch a recent documentary on DiscoveryHD that involves mountain climbing and/or volcanoes, or rent a Blu-ray version of The Dark Knight and pay special attention to the night-time helicopter shots where the black bars at the top and bottom of the movie disappear.. -that- was shot on IMAX and DOWNsampled to 1080p.

  18. jjfg45 says:

    The difference between 720p and 1080i has placebo written all over it.

  19. bcsizemo says:

    This really doesn’t surprise me at all. And I agree that most consumer TVs come from the factory with some jacked-up settings, like way overblown contrast/brightness, way too much sharpening, etc.

    If I had a choice I’d take a good old 480i analog signal for everyday TV watching. Why? Ever see a sports game in SD digital? Looks like ass. It has more artifacts than the Smithsonian. Now step up to a good HD signal and things change dramatically.

    As a consumer, I feel shafted when I’m forced into a new option that gives me less and looks worse… Sure, my DVDs look better, but 60% or more of my viewing time is broadcast TV?!

    *sigh* I never understood why the VCR is dead.

  20. Michael of TV DVD Combi says:

    This phenomenon is very common and has appeared in all kinds of situations. I for one would speak up for sure if I could not see any difference, but knowing my luck I would probably forget my specs and really not be able to tell some old scratchy analogue from the latest HD.

    I love the analogy with the price of wine – that is SO true where I live. I drink £3.99 wine and it’s probably as good as what you used to pay the equivalent of £20 a bottle for in the 1960s.

  21. mn_camera says:

    Show most people a component NTSC or PAL (R-Y, Y, B-Y – not RGB, which is another signal altogether) picture on a top-quality studio monitor, and they will think they are seeing high def.

    Encoding to composite (for transmission – all broadcast signals are composite) removes that advantage. Decoding at the receiver will do little to re-create the original signal.

    And televisions and monitors – especially engineering monitors – are very different devices. One is meant to conceal imperfections, the other to reveal them.

  22. Steve says:

    A technically superior image is not necessarily perceived as better. When creating Photoshop images, adding a fractional degree of blur is often the magic touch that makes a still picture come to life.

    I suppose I was a ‘film snob’ – I preferred the contrast-smoothed look of film over the crude look of tape, right up to the point where hi-def became capable of an image that was both pleasing and clean. But put me in a test like this one and I’d give you no more conclusive a result than these subjects. I skipped Blu-Ray because I find the picture quality jarring.

  23. scifijazznik says:

    And “gullible” is not in the dictionary.

  24. mn_camera says:

    @ Tim – I always heard SECAM as System Essentially Contrary to American Method.

  25. technogeek says:

    Really doesn’t surprise me; it’s the visual equivalent of what folks have been experiencing with MP3 audio. Higher-quality reproduction is great, but in most situations what really matters is the material being reproduced. A scratchy old 78RPM recording of a jazz great is still listenable; we quickly tune out the distracting surface noise and limited frequency response and hear what should be there rather than what’s actually there, possibly via the same sort of mechanisms that let us automatically compensate for changes in the color temperature of light sources.

    There are certainly cases where higher accuracy makes a huge difference. But … well, I saw HDTV for the first time in 1992, in the EU pavilion at the World’s Fair, and my reaction was “Yeah, it’s better. It’ll certainly help when I want to put a computer image on my TV, with tiny little details like small lettering. For most purposes, though, most folks will never notice.”

    As to whether Never Twice the Same Colors is really sufficiently worse than PAL to make HDTV more noticeable… I’m not convinced.

    Since we were going to go to digital TV anyway to save bandwidth, switching to HDTV at the same time made sense. And I do like the fact that a single screen is now good for my full media system, including the computer. But… Well, let’s put it this way: I haven’t upgraded yet.

    • BikerRay says:

      Don’t assume “digital TV” is all hi-def. Here in Canada, at least, TV stations can use their allotted digital bandwidth to provide either one hi-def channel, or (up to) six standard-def ones (typically the slot is shared amongst several stations, due to equipment costs). How often this is the case, I don’t know.
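
      As a rough back-of-the-envelope (assuming the ~19.39 Mbit/s payload of an ATSC broadcast channel, the standard in Canada and the US):

      \[ \frac{19.39\ \text{Mbit/s}}{6} \approx 3.2\ \text{Mbit/s per SD subchannel} \]

      which is around, or below, a typical DVD bitrate.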

  26. tboy says:

    One wonders what Dan Ariely would say.

  27. Bryan says:

    There’s not enough info here to know if this was a properly structured test, and I’m going to bet the answer is that it wasn’t.

    First, was the TV a model that performs upsampling/filtering on standard-def input by default? Did they turn off this feature?

    Second, what size of TV do the viewers normally watch? Most SD TV owners are used to watching 20″-30″ sets in general. Just the act of switching to a 40″-50″ TV is going to skew the viewers’ perception.

    Third, … you get the idea.

  28. lyd says:

    “Picture quality is lower with NTSC, “so the difference compared with HD is much larger than for Europeans”, says van de Wijngaert.”

    That strikes me as a dubious claim. PAL has another 100 lines of resolution and a lower frame rate. Not exactly a huge improvement over NTSC.

  29. Jon H says:

    Another issue with the Europeans is that they have had access to widescreen, non-HD TV for years, so aspect ratio isn’t necessarily a clue whether they’re watching HD or SD.

  30. winkybb says:

    I think it really matters what screen size was used and how far away people sat. If the image was small and/or people were far away, then the placebo effect would be relatively strong. Get a bigger screen or sit closer and I doubt the conclusion would have been as strong.
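
    As a rough sketch of why distance matters (my own numbers, assuming 20/20 vision resolves about one arcminute):

    ```python
    # Back-of-the-envelope: the distance beyond which a 20/20 viewer
    # (~1 arcminute of resolving power) can no longer pick out
    # individual pixels on a 1080p screen of a given size.
    import math

    def max_useful_distance(diagonal_in, rows=1080, aspect=(16, 9)):
        """Distance in inches at which one pixel subtends one arcminute."""
        w, h = aspect
        height_in = diagonal_in * h / math.hypot(w, h)    # physical screen height
        pixel_in = height_in / rows                       # height of one pixel
        return pixel_in / math.tan(math.radians(1 / 60))  # one arcminute

    print(max_useful_distance(40) / 12)  # ~5.2 feet for a 40-inch 1080p set
    ```

    Beyond roughly that distance the eye can no longer separate individual 1080p pixels, so the further back people sat, the smaller the visible HD advantage would have been.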

    • lyd says:

      I suspect it still would have been quite remarkable. Most people seem to have an extremely high tolerance for completely jacked-up video. Heck, very many people have a *preference* for completely jacked-up video! Just look at the wide popularity of skewed aspect ratios, or the default white point of any consumer TV set.

  31. Chrs says:

    When I was watching football (a replay of a tackle) on my parents’ new television the other day while I was home, I realized that hey, I wouldn’t have been able to see that before.

    Convincing people that two very similar things are different is, as far as I can tell, much easier than convincing them that two noticeably different things are the same.

  32. Absent says:

    “That strikes me as a dubious claim. PAL has another 100 lines of resolution and a lower frame rate. Not exactly a huge improvement over NTSC.”

    Whilst PAL is 25 fps and NTSC is ~29.97 fps, most discs these days play at ~23.976 fps, both DVD and Blu-ray.
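
    That last figure is film’s 24 fps slowed by the same 1000/1001 factor NTSC uses:

    \[ 24 \times \tfrac{1000}{1001} \approx 23.976\ \text{fps} \]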
