Birbla

    Debunking HDR [video] (yedlin.net)
    116 points by plastic3169 - 4 days ago

  • Solid overview of applied color theory for video, so worth watching.

    As for what is being debunked: the presentation not only fails to set out a thesis in the introduction, it doesn't even pose a question, so you have to watch for hours to get to the point: SDR and HDR are two measurement systems which, when correctly used for most cases (legacy and conventional content), must produce the same visual result. The increased fidelity of HDR makes it possible to expand the sensory response and achieve some very realistic new looks that were impossible with SDR, but the significance and value of any look is still up to the creativity of the photographer.

    This point could be conveyed more easily if the author explained that, across the history of reproduction technology, human visual adaptation exposes a moment-by-moment contrast window of about 100:1, which constantly adjusts over time based on average luminance to create a much larger window of perception of billions:1(+) that allows us to operate under the luminance conditions on earth. But until recently, we haven't expected electronic display media to be used in every condition on earth, and even if that can work, you don't pick everywhere as your reference environment for system alignment.

    (+) Regarding the difference between numbers such as 100 and billions, don't let your common sense about big and small values faze your thinking about differences: perception is logarithmic; it's the degree of the ratios that matters more than the absolute magnitude of the numbers. As a famous acoustics engineer (Paul Klipsch) said about where to focus design optimization of the response traits of reproduction systems: "If you can't double it or halve it, don't worry about it."
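    A quick back-of-the-envelope sketch of that ratio-centric view (my illustration, not the commenter's): expressing contrast windows in photographic stops (doublings) makes the two very different-looking numbers directly comparable.

```python
import math

def stops(contrast_ratio: float) -> float:
    """Express a contrast ratio as photographic stops (number of doublings)."""
    return math.log2(contrast_ratio)

# The ~100:1 instantaneous window is under 7 doublings; the fully adapted
# billions:1 window is only ~30 — not "millions of times more" perceptually.
print(f"100:1 window:  {stops(100):.1f} stops")
print(f"1e9:1 window:  {stops(1e9):.1f} stops")
```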

    by _wire_ - 4 days ago
    At a minimum we should start from something captured close to reality, and then get creative from that point. SDR is like black-and-white movies (not quite, but close). We can get creative with it, but can we just see the original natural look? HDR (and the wider color space associated with it) has a fighting chance of looking real, but looking real seems far from what movie makers are doing.
    by empiricus - 22 hours ago
  • The switch from 24-bit color to 30-bit color is very similar to the move from 15-bit color on old computers to 16-bit color.

    You didn’t need new displays to make use of it. It wasn’t suddenly brighter or darker.

    The change from 15- to 16-bit color was at least visible because the dynamic range of 16-bit color is much lower than that of 30-bit color, so you could see color banding improve, but it wasn't some new world of color, like how HDR is sold.
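    For concreteness, here's a sketch (mine, not the commenter's) of the per-channel levels behind those totals, assuming the usual packings: 15-bit as 5-5-5, 16-bit as 5-6-5, 24-bit as 8-8-8, 30-bit as 10-10-10.

```python
def levels_per_channel(bits: int) -> int:
    """Distinct intensity steps a channel with `bits` bits can encode."""
    return 2 ** bits

# Common packings: total bits -> bits per (R, G, B) channel
packings = {15: (5, 5, 5), 16: (5, 6, 5), 24: (8, 8, 8), 30: (10, 10, 10)}
for total, (r, g, b) in packings.items():
    print(f"{total}-bit: R={levels_per_channel(r)}, "
          f"G={levels_per_channel(g)}, B={levels_per_channel(b)} levels")
```

    Going from 32 to 64 green levels halves an already-coarse banding step, which is easy to see; going from 256 to 1024 levels refines a step that was fairly fine to begin with, which is why the improvement is subtler.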

    Manufacturers want to keep the sales boom that large cheap TVs brought when we moved away from CRTs. That was probably a “golden age” for screen makers.

    So they went from failing to sell 3D screens to semi-successfully getting everyone to replace their SDR screens with HDR screens, even though almost no one can see the difference in those color depths when everything else is equal.

    What really cheeses me on things like this is that TV and monitor manufacturers seem to gate the “blacker blacks” and “whiter whites” behind HDR modes and disable those features for SDR content. That is indefensible.

    by naikrovek - 22 hours ago
  • Unrelated to the video content, the technical delivery of the video is stunningly good. There is no buffering time, and clicking at random points in time on the seek bar gives me a result in about 100 ms. The minimal UI is extremely fast - and because seek happens onmousedown, oftentimes the video is already ready by the time I do onmouseup on the physical button. This is important to me because I like to skip around videos to skim the content to look for anything interesting.

    Meanwhile, YouTube is incredibly sluggish on my computer, with visible incremental rendering of the page UI, and seeking in a video easily takes 500~1000 ms. It's an embarrassment that the leading video platform, belonging to a multi-billion-dollar company, has a worse user experience than a simple video file with only the web browser's built-in UI controls.

    by nayuki - 22 hours ago
    His previous stuff is so interesting, and it's very refreshing to see a Hollywood professional able to dig so deep into these topics and teach us about them: https://yedlin.net/NerdyFilmTechStuff/index.html

    I think the point that SDR input (to a monitor) can be _similar_ to HDR input to monitors that have high dynamic range is obvious if you look at the math involved. Higher dynamic range gives you more precision in the information, and you can choose what to do with it: higher maximum luminosity, better blacks with less noise, more detail in the midtones, etc.
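    The "two measurement systems" idea can be sketched numerically (my example, using the standard SMPTE ST 2084 constants; the 100-nit SDR peak and gamma 2.2 are simplifying assumptions): the same light on screen can be reached from an SDR code value or a PQ code value.

```python
import math

# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code: float) -> float:
    """Nonlinear PQ code value in [0,1] -> absolute luminance in nits."""
    p = code ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def sdr_eotf(code: float, peak_nits: float = 100.0) -> float:
    """Simplified SDR: gamma-2.2 code value in [0,1] -> nits at a 100-nit peak."""
    return peak_nits * code ** 2.2

# Two different encodings, roughly the same light on screen:
print(sdr_eotf(1.0))    # 100-nit SDR white
print(pq_eotf(0.508))   # PQ code ~0.508 also lands near 100 nits
```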

    Of course, we should also see "HDR" as a social movement, a new way to communicate between engineers, manufacturers, and consumers; it's not "only" a math conversion formula.

    I believe we could focus first on comparing SDR and HDR black-and-white images, to see how higher dynamic range in luminosity alone is in itself very interesting to experience.

    But at the beginning he says the images look similar on both monitors. Surely we could find counterexamples, and that only applies to his cinema stills? If he can show this is true for all images, then indeed he can show that "SDR input to an HDR monitor" is good enough for all human vision. I'm not sure this is true: I make psychedelic animation, so I like to use the whole gamut of colors at my disposal, and I don't care about representing scenes from the real world. I just want maximum color p0rn to feed my acid brain: 30 bits per pixel surely improves that, as do wider color gamuts / new LED wavelengths not used before.

    by ttoinou - 22 hours ago
    An excellent video. I've admired Yedlin's past work debunking the need for film cameras over digital when you're going after a 'film look'.

    I wish he shared his code though. Part of the problem is he can't operate like a normal scientist when all the best color grading tools are proprietary.

    I think it would be really cool to make an open source color grading software that simulates the best film looks. But there isn't enough information on Yedlin's website to exactly reproduce all the research he's done with open source tools.
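    As a toy illustration of what the core of such a tool might look like (my sketch, not Yedlin's actual method): film-emulation pipelines typically boil down to applying measured tone curves and LUTs, and the 1D part is just piecewise-linear interpolation between sample points. The S-curve values below are made up.

```python
from bisect import bisect_right

def apply_curve(x: float, xs: list[float], ys: list[float]) -> float:
    """Piecewise-linear 1D LUT lookup: map input x through sample points (xs, ys)."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect_right(xs, x) - 1
    t = (x - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + t * (ys[i + 1] - ys[i])

# Made-up S-shaped "film print" curve: lifted toe, rolled-off shoulder
xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [0.02, 0.18, 0.50, 0.82, 0.95]
print(apply_curve(0.125, xs, ys))  # shadow value pushed up the toe
```

    A real tool would sample such curves from scans of actual film stocks and add a 3D LUT for the cross-channel color response.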

    by sansseriff - 21 hours ago
  • I’m 14 minutes into this 2 hour 15 minute presentation that hinges on precision in terminology, and Yedlin is already making oversimplifications that hamper delivery of his point. First of all, he conflates the actual RGB triplets with the colorspace coordinates they represent. He chooses a floating point representation where each value of the triplet corresponds to a coordinate on the normalized axes of the colorspace, but there are other equally valid encodings of the same coordinates. Integers are very common.
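    The float-vs-integer point can be made concrete (my sketch): the same normalized coordinate can be carried as a float or as a full-range 10-bit integer code value, and converting between them changes only the encoding, not the color it identifies.

```python
def float_to_code(v: float, bits: int = 10) -> int:
    """Normalized [0,1] coordinate -> full-range integer code value."""
    return round(v * (2 ** bits - 1))

def code_to_float(c: int, bits: int = 10) -> float:
    """Full-range integer code value -> normalized [0,1] coordinate."""
    return c / (2 ** bits - 1)

# Same colorspace coordinate, two equally valid encodings:
v = 0.5
print(float_to_code(v))    # 10-bit integer code for 0.5
print(code_to_float(512))  # back to (approximately) the same normalized value
```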

    Secondly, Rec. 2100 defines more than just a colorspace. A coordinate triple in the Rec. 2100 colorspace does not by itself dictate both luminance and chromaticity: you also need to specify a _transfer function_, of which Rec. 2100 defines two, PQ and HLG. They have different nominal maximum luminances: 10,000 nits for PQ and 1,000 nits for HLG. Without a specified transfer function, a coordinate triple merely identifies chromaticity. This is true of _all_ color spaces.

    On the other hand his feet/meters analogy is excellent and I’m going to steal it next time I need to explain colorspace conversion to someone.

    by dcrazy - 20 hours ago
  • I read a whole book about this last year and it made me furious. Well, technically the book was about ACES but it was also about HDR and the problems overlap tremendously. I emphatically agree with this entire video and I will be sharing it widely.
    by autobodie - 18 hours ago
    I just skimmed through parts of the video as I'm about to head to bed, but at least the bits I listened to sounded more like an argument for why 24-bit audio isn't necessary for playback, and 16-bit will do just fine.
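    For what it's worth, that audio analogy can be quantified (my addition, not the commenter's): each bit of linear PCM buys roughly 6.02 dB of dynamic range, so the 16-vs-24-bit gap is a headroom argument, much like SDR vs HDR bit depth.

```python
import math

def pcm_dynamic_range_db(bits: int) -> float:
    """Approximate dynamic range of linear PCM at the given bit depth, in dB."""
    return 20 * math.log10(2 ** bits)

print(f"16-bit: {pcm_dynamic_range_db(16):.1f} dB")  # ~96.3 dB
print(f"24-bit: {pcm_dynamic_range_db(24):.1f} dB")  # ~144.5 dB
```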

    Back in the day I made ray tracers and such, and going from an internal SDR representation to an internal HDR representation was a complete game changer, especially for multiple reflections. That was a decade or more before any consumer HDR monitors were released, so it was all tonemapped to SDR before display.
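    The internal-HDR workflow described above can be sketched with the classic Reinhard operator (my example, not the commenter's code): scene-referred values above 1.0 are compressed into [0, 1) at display time instead of being clipped away.

```python
def reinhard(x: float) -> float:
    """Classic Reinhard tonemap: compress unbounded scene luminance into [0, 1)."""
    return x / (1.0 + x)

def clip(x: float) -> float:
    """Naive SDR-internal approach: anything above 1.0 is simply lost."""
    return min(x, 1.0)

# A bright reflection at 8x diffuse white: clipping flattens it to white,
# while tonemapping preserves the relative brightness between highlights.
for scene in [0.25, 1.0, 8.0]:
    print(f"scene={scene}: clipped={clip(scene)}, tonemapped={reinhard(scene):.3f}")
```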

    That said, I would really like to see his two monitors display something with really high dynamic range. From the stills I saw in the video, they all seemed quite limited.

    Anyway, something to watch fully tomorrow, perhaps he addresses this.

    by magicalhippo - 17 hours ago
    The presentation was done fully in Nuke. That's so funny to me. Technical and artistic excellence.
    by FrostKiwi - 14 hours ago
    Curiously, when I play this back on my Android phone, after entering full screen I cannot exit, close the tab, or go to another tab. I also can no longer reach the system launcher.

    I had to kill Chrome via system prefs to make it stop.

    Seems like this video not only exposes the issues with HDR but also a rather weird bug in Chrome on Android.

    by virtualritz - 8 hours ago
    For those of us who don't have time to randomly watch a 2-hour video, what exactly is being debunked here?
    by IshKebab - 7 hours ago

© 2025 Birbla.com, a Hacker News reader