• this@sh.itjust.works · 3 days ago

    Career audio engineer chiming in here. I would say it’s a combination of the lack of LUFS regulation and improper mixes. IMHO dialogue clarity should be prioritized more than it currently is, or at the very least the center channel in surround formats should be used exclusively for dialogue, so it’s easier to adjust (via software and/or AV receivers). Also, modern TV shows are often mixed with a high dynamic range, like a movie would be (which relates to LUFS levels). That’s kind of silly and, IMHO, bad mixing practice, since most of your viewers don’t have high-end home theatres.
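
    To illustrate the center-channel point: if dialogue lives on its own channel, its level becomes a single knob in the downmix. Here’s a minimal Python sketch of a 5.1-to-stereo downmix with an adjustable dialogue gain; it assumes the numpy and soundfile packages, the common FL/FR/FC/LFE/BL/BR channel order, and placeholder filenames.

    ```python
    import numpy as np
    import soundfile as sf

    def downmix_51_to_stereo(path_in, path_out, center_gain_db=0.0):
        """Fold a 5.1 file down to stereo, with dialogue level as one knob."""
        audio, rate = sf.read(path_in)              # shape: (frames, 6)
        fl, fr, fc, lfe, bl, br = audio.T           # assumed channel order; LFE is dropped
        minus3db = 10 ** (-3 / 20)                  # ~0.707, ITU-style downmix coefficient
        center = fc * 10 ** (center_gain_db / 20)   # boost/cut dialogue only
        left = fl + minus3db * center + minus3db * bl
        right = fr + minus3db * center + minus3db * br
        stereo = np.clip(np.stack([left, right], axis=1), -1.0, 1.0)
        sf.write(path_out, stereo, rate)

    # e.g. lift dialogue 6 dB relative to everything else
    downmix_51_to_stereo("show_5.1.wav", "show_stereo.wav", center_gain_db=6.0)
    ```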

    • kieron115@startrek.website · 3 days ago

      What would be the alternative? You don’t really expect the streaming companies to pay for TWO masters, do you!? (/s if it wasn’t obvious)

        • kieron115@startrek.website · 3 days ago

          Sorry, /s means sarcastic. If anything, I would absolutely expect them to pay for multiple mixes/masters, given what’s been said about how people consume it.

          • Cort@lemmy.world · 2 days ago

            Semi-sarcastic response to your sarcastic question:

            Force TV manufacturers to add a center-channel speaker to TVs. There’s more room in a 16:9 screen for a center channel than in a 4:3 one.

            • kieron115@startrek.website · 2 days ago

              Or even if they would just let us bond the TV speakers together to use as a center channel and augment that with a cheap 2.0/2.1 soundbar, I bet it would be an improvement in dialogue clarity, even if the imaging would be a bit of a disaster.

  • MetaStatistical@lemmy.zip · 3 days ago

    Even on platforms like YouTube, where the LUFS level is strictly defined, there are sooo many creators who have no earthly idea what LUFS is, which level YouTube enforces, or how it corrects for it. They post their videos with quiet narration and wonder why viewers get annoyed at having to turn the volume up and down on every video.

    See, YouTube enforces LUFS by turning loud videos down to -14 LUFS, but it doesn’t do anything to quiet videos. If you ever bring up “Stats for Nerds” and look at the “Volume / Normalized” value, you might see something like “content loudness -5.9dB”. That means the video is 5.9 dB quieter than the reference, and the creator should have normalized the loudness before uploading it to YouTube.

    So you end up with a video that’s about 6 dB too quiet, and you have to turn up the volume to actually hear the narration. Then your TV (or whatever device you’re watching on) gets blasted by the next video, which is properly normalized at around 0 dB content loudness, and you’re forced to turn the damn volume back down.
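
    For a concrete sense of the math: a “content loudness -5.9dB” reading means the upload measures roughly -19.9 LUFS against YouTube’s -14 LUFS reference, and since YouTube only turns loud videos down, that ~6 dB gap stays. Here’s a small Python sketch of the pre-upload check a creator could run, assuming the soundfile and pyloudnorm packages; the filenames are placeholders.

    ```python
    import soundfile as sf
    import pyloudnorm as pyln

    TARGET_LUFS = -14.0  # YouTube's normalization reference

    data, rate = sf.read("my_video_audio.wav")   # float samples, (frames, channels)
    meter = pyln.Meter(rate)                     # ITU-R BS.1770 loudness meter
    loudness = meter.integrated_loudness(data)   # e.g. -19.9 LUFS

    gain_db = TARGET_LUFS - loudness
    print(f"Integrated loudness: {loudness:.1f} LUFS")
    print(f"Gain needed to reach -14 LUFS: {gain_db:+.1f} dB")

    # Optionally apply the gain (simple linear scaling, no limiting):
    normalized = pyln.normalize.loudness(data, loudness, TARGET_LUFS)
    sf.write("my_video_audio_normalized.wav", normalized, rate)
    ```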

    YouTube has finally started to acknowledge the problem by introducing the Stable Volume feature. But really, creators should educate themselves on how to properly mix their audio. I know editing is hard and there are so many moving parts to a YouTube upload, but audio quality is everything in a YouTube video. Nobody cares what you’re using on the video side, whether it’s random B-roll video game footage, a PowerPoint slide presentation, watermarked stock images, or a video of you delivering the narration with a lapel mic tied to a tree branch. It’s all about the narration and audio quality.

    • acockworkorange@mander.xyz · 3 days ago

      Try some old-time TV shows to see if that’s really the case. I don’t have issues watching old Star Trek, but current shows are a shot in the dark.

  • tiramichu@sh.itjust.works · 4 days ago

    Dynamic range and loudness normalisation are surely the main reason people are using subtitles, but habits are undeniably changing too, as is the way people consume media in general.

    People don’t just look at the TV for an hour straight: they’re doing other things, second-screening, or having conversations, and having multiple ways to pick up on the show’s dialogue is helpful.

    Let’s not forget simple reasons like accessibility, either. My friend here in the UK is Hungarian, and despite being completely fluent in English he always likes to watch shows with subtitles as it helps with understanding some British accents which can be tricky for non-natives.

    And people just process information in different ways. We’ve all heard by now that some individuals are visually oriented while others are more auditory. If you get a choice, why not take it?

    Not to mention that subs on streaming services have much better visual quality and timing than subs on broadcast TV used to, which felt nasty, mis-timed, and very second-class. Clearly ‘good enough’ for hard-of-hearing viewers, but not very pleasant.

    I don’t think it’s a hot take to say that as accessibility features get better and more available, more people will use them. And accessibility is for everyone.

    • boatswain@infosec.pub · 3 days ago

      People don’t just look at the TV for an hour straight: they’re doing other things, second-screening, or having conversations, and having multiple ways to pick up on the show’s dialogue is helpful.

      Wouldn’t this make subtitles less useful rather than more? You can’t see the subtitles if you’re not looking at the TV. For second-screening, it would be more helpful to listen to the audio while you’re scrolling Lemmy or whatever.

      • tiramichu@sh.itjust.works · 3 days ago

        Audio and subtitles are complementary.

        If you’re having a conversation, doing some other task that makes sound, or scrolling social media when a video starts playing, a noise could momentarily cover up the audio and you miss something. If there are subs, you can quickly glance at them to see what was going on.

        Listening to spoken dialog allows you to look away, but subs let you catch back up if you miss something. They cover for each other.

    • corsicanguppy@lemmy.ca · 4 days ago

      watch shows with subtitles as it helps with understanding some British accents

      If you’ve seen subtitles lately: they used to be pretty bad, but now they’re horrible. They mess up what’s being said a LOT.

      Also, they spell like a primary-school dropout: till, your/you’re, etc.

      • I_Has_A_Hat@lemmy.world · 4 days ago

        You clearly don’t remember the days of live closed captions. Hoo-boy, it’s like you could pinpoint the exact moment the transcriber lost their focus.

    • gonzo-rand19@moist.catsweat.com · 4 days ago

      Not to mention that subs on streaming services have much better visual quality and timing than subs on broadcast TV used to

      That’s because they aren’t subtitles; they’re closed captions. They’re transmitted differently (check out Technology Connections’ recent video for more info) and serve different purposes.

      Also, for live broadcasts, they’re actually being typed by a stenographer in real time, which is why they sometimes have mistakes.

    • acockworkorange@mander.xyz · 4 days ago

      I think you’re missing the point. The lack of LUFS standards is what forces people who normally wouldn’t use subtitles, or don’t like to, to use them, because they can’t understand the dialogue otherwise.

      • tiramichu@sh.itjust.works · 4 days ago

        I don’t disagree with that. All I’m saying is that there are additional factors in play which also account for at least some of the rise in subtitle usage. It’s not all down to a single cause.

        Volume normalisation is a problem, but it’s also true that people aren’t the same as they were 20 years ago and don’t behave the same way they did then.

        • MetaStatistical@lemmy.zip · 3 days ago

          It’s kind of a generational issue, though, because people are born into this new world with new habits. It’s no longer about paying attention to a single piece of media on a TV, but about turning something on in the background while watching or reading something else on a phone.

          I don’t really understand it, even as somebody with ADD. If you don’t like what’s on TV, change it or move to a different room while you read on your phone.

    • Tar_Alcaran@sh.itjust.works · 4 days ago

      it helps with understanding some British accents which can be tricky for non-natives.

      There are native Londoners from the west of London who have trouble understanding native Londoners from the east of London, and vice versa.