• finitebanjo@lemmy.world · 7 days ago

That’s exactly the sort of thing his work improved. He figured out that graphics hardware assumed stored RGB values mapped linearly to light intensity, when in fact the displayed brightness scales non-linearly (gamma) as the RGB value increases.

Example: a red value of 128 out of 255 should be 50% of the maximum brightness; that’s what the graphics cards (and likely the programmers) assumed, but the actual output was about 22% brightness.
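A rough sketch of where that 22% comes from, assuming a simple power-law gamma of 2.2 (the real sRGB curve is piecewise, but the result is nearly identical):

```python
def stored_to_display_brightness(value, gamma=2.2):
    """Convert an 8-bit stored channel value to the fraction of max brightness actually displayed."""
    return (value / 255) ** gamma

# 128/255 is ~50% of the stored range, but the display outputs only ~22% of max brightness.
print(stored_to_display_brightness(128))  # ~0.218
```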

So you would get areas that were extremely bright cutting off abruptly into areas that were extremely dark.