I use a 1080p monitor, and I’ve noticed that once creators start uploading 4K content, the 1080p version I watch in fullscreen has more artifacting than when they only uploaded in 1080p.

Did you notice that as well?

Watching in 1440p on a 1080p monitor results in a much better image, at the cost of a theoretically less sharp picture and a lot higher CPU usage.

  • MrSoup@lemmy.zip
    link
    fedilink
    arrow-up
    14
    ·
    20 days ago

    YouTube automatically generates versions of each video at resolutions lower than the one uploaded.
    So when you watch a 4K video and switch to 1080p, you are no longer watching the original video but one re-encoded by YouTube itself, which can have more artifacts since it’s resized and compressed.

    I dunno the exact specs (like bit rate, etc.), someone will probably add them in another reply.

    • Maxy@lemmy.blahaj.zone
      link
      fedilink
      arrow-up
      14
      ·
      20 days ago

      I believe YouTube always re-encodes the video, so the video will contain (extra) compression artefacts even if you’re watching at the original resolution. However, I also believe YouTube’s exact compression parameters aren’t public, so I don’t believe anyone outside of YouTube itself knows for sure which videos are compressed in which ways.

      What I do know is that different content also compresses in different ways, simply because some video is easier or harder to compress. IIRC, shows like Last Week Tonight (mostly a static camera looking at a host) are way easier to compress than higher-paced content, which (depending on the previously mentioned unknown parameters) could have a large impact on the amount of artefacts. This makes it more difficult to compare different videos uploaded at different resolutions.
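      The “easier/harder to compress” point comes down to inter-frame prediction: codecs mostly store differences between frames, so a static shot produces nearly empty deltas while fast motion changes everything. A toy sketch of the idea (a cartoon of inter-frame prediction, not YouTube’s actual encoder):

```python
def frame_delta(prev, cur):
    """Store only the pixels that changed since the previous frame."""
    return {i: v for i, (p, v) in enumerate(zip(prev, cur)) if p != v}

# Talking-head shot: consecutive frames are identical, the delta is empty.
static_a = [10, 10, 10, 10]
static_b = [10, 10, 10, 10]

# Fast pan: every pixel changes, the delta is as big as the frame.
action_a = [10, 20, 30, 40]
action_b = [40, 30, 20, 10]

print(len(frame_delta(static_a, static_b)))  # 0 pixels to encode
print(len(frame_delta(action_a, action_b)))  # 4 pixels to encode
```

      At a fixed bitrate, the frames with big deltas are the ones that come out blocky.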

      • MrSoup@lemmy.zip
        link
        fedilink
        arrow-up
        5
        ·
        20 days ago

        YouTube always re-encodes the video

        You are right. For example you can upload an avi to YouTube, but they will never host and stream an avi.

        • DdCno1@beehaw.org
          link
          fedilink
          arrow-up
          4
          ·
          20 days ago

          AVI is a container, not a codec. An AVI container can hold video encoded with almost any codec (unlike some other container formats, which are more restrictive). If you wanted to, you could put e.g. a VP9 or AV1 video stream (the ones YouTube is using) into an AVI container. In theory at least, if you uploaded an AVI file containing VP9 video, YouTube could just extract the stream from the container and serve it as is, but they’ll still re-encode it. And before you dismiss all this talk of modern codecs in AVI containers as theoretical: AVI is used as a standard for archiving by some institutions, so it’s more relevant than you might think.

          However, you are partially right in that AVI cannot be used for streaming, not just by YouTube but in general, since this requirement obviously wasn’t taken into account when the format was introduced in 1992 and thus was never incorporated into the standard.

  • chunkystyles@sopuli.xyz
    link
    fedilink
    English
    arrow-up
    11
    ·
    20 days ago

    YouTube compresses the shit out of 1080p content. Any video that has a lot of movement will look like trash at 1080p. Even if you’re on a lower resolution monitor, the higher bit rate of higher resolution videos will look better. It’s all very stupid on our end, but I assume it saves them a ton on bandwidth.

  • Ace! _SL/S@ani.social
    link
    fedilink
    arrow-up
    8
    ·
    edit-2
    20 days ago

    That’s because, for example, YouTube uses a bitrate of 4-7 Mbps for 1080p. 1440p gets around 13 Mbps and 4K something like 46 Mbps, IIRC.

    Other media providers are similarly bad with their bitrates.
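    Those figures translate into very different bits-per-pixel budgets, which is why a downscaled higher-resolution stream can look better even on a 1080p screen. A rough back-of-the-envelope comparison (bitrates taken from memory as quoted above, 30 fps assumed):

```python
# Rough bits-per-pixel comparison using the approximate bitrates
# quoted above (figures are from memory, not official numbers).
FPS = 30  # assumed frame rate for the comparison

streams = {
    "1080p": (1920, 1080, 5_000_000),   # roughly the middle of 4-7 Mbps
    "1440p": (2560, 1440, 13_000_000),
    "4K":    (3840, 2160, 46_000_000),
}

for name, (w, h, bitrate) in streams.items():
    bpp = bitrate / (w * h * FPS)  # bits available per pixel per frame
    print(f"{name}: {bpp:.3f} bits per pixel")
```

    By this estimate 4K gets more than twice the bits per pixel of 1080p, so even after downscaling it has more information to spend on each pixel your monitor actually shows.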

    • Peter1986C@lemmings.world
      link
      fedilink
      arrow-up
      2
      ·
      20 days ago

      For AV1 that could still be okay, lol. It would be kind of meh for e.g. H264 but YT does not even use that anymore AFAIK.

      • Ace! _SL/S@ani.social
        link
        fedilink
        arrow-up
        3
        ·
        20 days ago

        YouTube uses VP9 for all resolutions most of the time. 1080p and below offer AVC as a fallback encoding.

  • DdCno1@beehaw.org
    link
    fedilink
    arrow-up
    6
    ·
    20 days ago

    There’s something else that hasn’t been mentioned yet: video games in particular have been so detailed since the eighth generation (XB1/PS4) that YouTube’s 1080p, with its significant compression artifacts, swallows too many fine moving details: foliage, sharp textures, lots of moving elements (like particles) and full-screen effects that modify nearly every pixel of every frame.

    And no, you will not get a less sharp image by downsampling 1440p or even 4K to 1080p - on the contrary. I would recommend you take a few comparison screenshots and see for yourself. I have a 1440p monitor and prefer 4K content - it definitely looks sharper, down to fine-grained detail - and I did the same when I had a 1200p screen, preferring 1440p content then (at least as soon as it was available - the early years were rough).

    If you are noticing high CPU usage at higher video resolutions, it’s possible that your GPU is outdated and can’t handle the latest codecs anymore - or that your operating system (since you’re on Linux based on your comment history) doesn’t have the right drivers to take advantage of the GPU’s decoding ability and/or is struggling with certain codecs. Under normal circumstances, there should be absolutely no increased CPU usage at higher video resolutions.

  • Maxy@lemmy.blahaj.zone
    link
    fedilink
    arrow-up
    4
    ·
    20 days ago

    About the “much higher CPU usage”: I’d recommend checking that hardware decoding is working correctly on your device, as that should ensure that even 4K content barely hits your CPU.

    About the “less sharper image”: this depends on your downscaler, but a proper downscaler shouldn’t make higher-resolution content any blurrier than the lower-resolution version. I do believe integer scaling (e.g. 4K -> 1080p) is a lot less dependent on having a proper downscaler, so consider bumping the resolution up even further if the video, your internet, and your client allow it.
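    For the integer-scaling case the downscaler really is trivial: at an exact 2:1 ratio (4K -> 1080p), every output pixel corresponds to a clean 2×2 block of source pixels, so a simple box filter just averages them. A minimal sketch on made-up grayscale values:

```python
def downscale_2x(pixels):
    """Downscale a grayscale image by 2x with a 2x2 box filter.

    `pixels` is a list of rows; width and height must be even.
    Each output pixel is the average of the matching 2x2 block.
    """
    out = []
    for y in range(0, len(pixels), 2):
        row = []
        for x in range(0, len(pixels[y]), 2):
            block = (pixels[y][x] + pixels[y][x + 1]
                     + pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(block / 4)
        out.append(row)
    return out

# A 4x4 image shrinks to 2x2; every 2x2 block averages cleanly.
img = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [10, 20, 30, 40],
    [50, 60, 70, 80],
]
print(downscale_2x(img))  # [[0.0, 255.0], [35.0, 55.0]]
```

    At non-integer ratios (1440p -> 1080p) the blocks don’t line up, so the scaler has to interpolate, and its quality matters much more.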

    • Peter1986C@lemmings.world
      link
      fedilink
      arrow-up
      2
      ·
      20 days ago

      YouTube pushes the AV1 “format” heavily these days, which is hard to decode with hardware acceleration, given that a lot of devices out there still do not support it.

      • Maxy@lemmy.blahaj.zone
        link
        fedilink
        arrow-up
        1
        ·
        19 days ago

        Good point, though I believe you have to explicitly enable AV1 in Firefox for it to advertise AV1 support. YouTube on Firefox should fall back to VP9 by default (which is supported by a lot more accelerators), so not being able to decode AV1 shouldn’t be a problem for most Firefox-users (and by extension most lemmy users, I assume).

        • Peter1986C@lemmings.world
          link
          fedilink
          arrow-up
          1
          ·
          18 days ago

          I am running mostly Firefox or LibreWolf on Linux these days, but I do not remember having to enable it. Not all of my systems support accelerating AV1 in hardware, but they do play 1080p (with frame drops above 30 fps on the unaccelerated computer). But yeah, I do hope YT keeps VP9 around because of the acceleration.

    • sexy_peach@feddit.orgOP
      link
      fedilink
      English
      arrow-up
      1
      ·
      18 days ago

      I was just guessing about the higher CPU usage. You’re probably right that it doesn’t matter.

  • stealth_cookies@lemmy.ca
    link
    fedilink
    arrow-up
    4
    ·
    20 days ago

    The one I’ve noticed is that for videos with the 1080p “Enhanced Bitrate” option, the free 1080p video looks like a blurry mess compared to normal 1080p content.

  • TranquilTurbulence@lemmy.zip
    link
    fedilink
    arrow-up
    3
    ·
    20 days ago

    I haven’t noticed anything. Would you do me a disservice and explain what I’m missing in my blissful ignorance? Make me see something that can never be unseen.

    • Peter1986C@lemmings.world
      link
      fedilink
      arrow-up
      3
      ·
      20 days ago

      I can only imagine that they (OP) have the quality setting on [auto]. That way YT might be constantly lowering the bitrate/resolution. I do not have any issues either, but I use fixed quality settings.

      • DdCno1@beehaw.org
        link
        fedilink
        arrow-up
        3
        ·
        20 days ago

        No, that’s not what they are talking about. Even if you set the video to 1080p and make sure that YouTube isn’t lowering it to a lower resolution, it still won’t look very good.

        Whether you notice or not depends on how perceptive you are, the quality of your eyesight and also the size and quality of your display. It’s hard to notice on a low-grade laptop screen (or smaller), as well as a cheap TN panel monitor, but go beyond around 20" and use a decent enough IPS panel and those blocky compression artifacts are hard to miss.

    • sexy_peach@feddit.orgOP
      link
      fedilink
      English
      arrow-up
      2
      ·
      18 days ago

      I sit quite close to a large 1080p monitor. That’s why I notice when the bitrate is low and the video I am seeing lacks true 1920×1080 detail. Basically it’s compressed so much that the image is noticeably worse than what my monitor could display. That’s why when I pick a higher-resolution stream, like 1440p, the compression problems don’t show as badly on a screen that will only display 1080p anyway. That’s what I am talking about. On a phone or a laptop screen it will probably be less noticeable. I guess that’s why YouTube does it: it probably saves them a huge amount of bandwidth, and people who want really good quality might already have 4K displays, which then get a much higher bitrate feed anyway.

      I guess 1080p monitors are starting to become a niche. More and more viewers are on smartphones, so it really makes sense to serve a very low bitrate.

      • TranquilTurbulence@lemmy.zip
        link
        fedilink
        arrow-up
        2
        ·
        18 days ago

        Turns out, I have an old dumb FullHD TV that should be ideal for this experiment. So, if I watch a YT video on 1080p, I should be able to see compression artefacts that are invisible when using a higher resolution. How is that supposed to work anyway, given that the browser knows the output resolution? Will it just download a higher resolution video, drop every other pixel, and display the rest?

        • sexy_peach@feddit.orgOP
          link
          fedilink
          English
          arrow-up
          2
          ·
          18 days ago

          Will it just download a higher resolution video, drop every other pixel, and display the rest?

          Yes, just like it can show a 1080p video not in fullscreen :)
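          Strictly speaking, “drop every other pixel” is nearest-neighbour decimation; real browser scalers typically interpolate (average neighbouring pixels) instead, which is part of why downscaled high-resolution video hides artifacts rather than merely discarding them. The literal pixel-dropping version is easy to sketch on made-up values:

```python
def drop_every_other(pixels):
    """Nearest-neighbour 2x decimation: keep one pixel per 2x2 block."""
    return [row[::2] for row in pixels[::2]]

img = [
    [0, 1, 2, 3],
    [4, 5, 6, 7],
    [8, 9, 10, 11],
    [12, 13, 14, 15],
]
print(drop_every_other(img))  # [[0, 2], [8, 10]]
```

          An averaging filter would instead blend each 2×2 block into one value, smoothing over blocky compression noise in the process.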