Title almost says it all. OLED monitors are getting more and more affordable, but they’re almost out of the picture when buying a monitor because of toolbars and HUD elements. I don’t understand why monitors “burn in”: when I shine my LED flashlight or some LED xmas lights, they don’t just keep emitting the same light after I turn them off. I know it’s a dumb comparison, but still, what happens?

The other thing I don’t understand is that I’ve never seen any signs of burn-in on anyone’s phone. Alright, technically that’s a lie: I did see some on a work phone (or two) that only had some chat app open, seemingly for ages, and the name bar was a bit burned in, or something like that; as you’d guess, I also didn’t interact with that phone a lot. But as I said above, “but still”: I’ve had my phone for a while now, so have my family and friends, some of us even doomscroll, and I’ve never seen any signs of burn-in on any (actually used) phone.

So, I can watch my background all day, but if I open my browser every three hours or so and press F11 twice, I’m safe? If I’m away, just let the screensaver save my screen? In that case, why would anyone ever worry about burn-in? You’d almost have to do it intentionally. But if it’s really that dangerous: say I immerse myself in a YouTube video, but it has the YouTuber’s pfp in the bottom right (does YouTube still do that?), and it was hbomberguy’s, am I just done, toasted, burnt in?

  • Plopp@lemmy.world · 2 months ago

    First of all, LED is not the same as OLED. The O stands for organic. The organic material is more sensitive and breaks down over time (maybe a really crappy explanation, someone with more knowledge please help), especially the blue color.

    • Bronzie@sh.itjust.works · 2 months ago

      Just to tack on and expand on your first point: LED monitors are normally LCD displays with LED backlighting, which allows more zone control and is more efficient in both space and energy use.

      For TVs, burn-in is becoming less of an issue thanks to software in newer models and improvements in the tech. The same goes for phones. Older OLED phones, like the Pixel 2 I think, had issues with burn-in.

      Rtings is actually running a long-term torture test as we speak. They have also included some PC monitors for good measure.

      In general, the reason it’s still not perfect for PC use is that all office/daily use keeps a static image on a large portion of your screen. Think of a browser, Excel, or any program with a big static toolbar. This will cause issues even with pixel shift and refresh cycles; you can only move pixels so much without it affecting your experience.

      If you were to only game or watch movies on it, it would likely never show signs of burn-in.

      Hope this made sense

  • Blue_Morpho@lemmy.world · 2 months ago

    My Pixel 3a had burn-in after 3 years. It’s the screen GUI elements that rarely change that show the burn-in. On Android that’s the status bar at the top and the white task-switcher line at the bottom.

    I’ve been hearing “OLED burn-in is better now” ever since I got my Galaxy Nexus 12 years ago. But the screens still seem to last only about 3 years.

  • ilinamorato@lemmy.world · 2 months ago

    Burn-in isn’t light being emitted when the screen is off; it’s light being dimmer when it’s on.

    An LED works by passing current between two different semiconductors. When an electron jumps over the “gap” between those two semiconductors, it releases a photon of a particular color (determined by the size of the gap). But over time, as an LED is used, the gap can be damaged (by heat, by vibration, etc.); when this happens, fewer electrons can jump the gap, so fewer photons are produced, or the properties of the gap change so that it emits photons of a slightly different wavelength. So if you leave a particular set of pixels lit for an extended time, those LEDs will degrade more than the rest of the screen, leaving that area discolored or dimmer. This is burn-in.
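
    For a rough sense of the numbers, here is a back-of-the-envelope sketch of that gap-energy-to-wavelength relationship. This is my own illustration; the bandgap energies are ballpark figures, not measured OLED values.

    ```python
    # Photon wavelength from the energy released across the "gap" (lambda = h*c / E).
    # The example energies below are illustrative, not measured OLED properties.
    PLANCK = 6.626e-34      # Planck constant, J*s
    LIGHT_SPEED = 2.998e8   # speed of light, m/s
    EV_TO_JOULES = 1.602e-19

    def wavelength_nm(gap_ev: float) -> float:
        """Wavelength (nm) of the photon emitted when an electron crosses a gap of this energy."""
        return PLANCK * LIGHT_SPEED / (gap_ev * EV_TO_JOULES) * 1e9

    for label, gap_ev in [("red-ish", 1.9), ("green-ish", 2.3), ("blue-ish", 2.7)]:
        print(f"{label:9s} {gap_ev} eV -> ~{wavelength_nm(gap_ev):.0f} nm")

    # A slightly altered gap shifts the emitted wavelength, which is the
    # discoloration effect described above.
    print(f"degraded blue-ish 2.6 eV -> ~{wavelength_nm(2.6):.0f} nm")
    ```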

    Most of the time, that’s fine, because the LEDs on your screen experience wear in a more-or-less uniform pattern. Your phone is somewhat less susceptible to this, since (1) you tend to have your phone screen off most of the time, (2) there aren’t as many persistent HUD elements even when it’s on, as every app has its own configuration of controls and UI elements, and (3) you tend to replace a phone more often than a monitor. When you replace your phone, it’s probably more-or-less evenly dimmer overall than it was when you bought it, but since you don’t have anything to compare it to, you won’t know; with burn-in, though, that comparison is right next to the burned-in pixels.

    By contrast, a computer monitor will typically be on for 8+ hours at a time, and persistent display elements are a part of every major operating system. If you’re not using the LEDs in a panel more-or-less evenly, you’ll end up with a persistent image.
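
    To see why those static elements matter so much, here is a toy tally of lit-hours (my own illustration with made-up usage numbers, not a real panel model), assuming wear scales roughly with how long a pixel stays lit:

    ```python
    # Toy wear model: compare a region that is lit whenever the monitor is on
    # (e.g. a taskbar) with an "average" region where content keeps changing.
    # All numbers are illustrative assumptions, not measurements.
    hours_per_day = 8            # monitor powered on during a workday
    days = 365 * 3               # three years of office use

    taskbar_lit_fraction = 1.0   # taskbar pixels lit the whole time the screen is on
    content_lit_fraction = 0.6   # elsewhere, windows move and content varies

    taskbar_hours = hours_per_day * days * taskbar_lit_fraction
    content_hours = hours_per_day * days * content_lit_fraction

    print(f"taskbar region: {taskbar_hours:.0f} lit-hours")
    print(f"rest of screen: {content_hours:.0f} lit-hours")
    print(f"wear ratio:     {taskbar_hours / content_hours:.2f}x")
    # It's this ratio, not the absolute dimming, that the eye picks up:
    # the more-worn region sits right next to less-worn pixels.
    ```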

      • ilinamorato@lemmy.world · 2 months ago

        Fair point, I don’t know you. The average phone user, then. Most people use their phone about 4½ hours a day.

        • Wugmeister@lemmy.dbzer0.com · 2 months ago

          My retort was in fact a joke. I am terminally online and even I only average about 6 hours. But thank you

          Edit: Just checked, my average screen time this week was about 7 hours per day.

    • UnRelatedBurner@sh.itjust.works (OP) · 2 months ago

      Thanks, that makes sense. But why don’t monitors have an “emergency” protocol to let the LEDs rest for a while, if we know the maximum stress they can handle?

      So instead of burning out, I’d get a pop-up saying that I should do something, or it would lower the brightness in that area or something.

      • ilinamorato@lemmy.world · 2 months ago

        I have five guesses:

        (1) That would require more diagnostics than an LED in a monitor can provide at a reasonable cost.
        (2) If you’re leaving the monitor on in a situation where burn-in is likely, you’re probably not at the monitor when it matters.
        (3) Monitors are a mission-critical piece of hardware, so having them turn themselves off (or even just turn off certain pixels) at random is not a great idea.
        (4) It’s probably the OS’s job to decide when to turn off the monitor, since the OS has the context to know what’s important and what isn’t, and how long it’s been since you interacted with the device.
        (5) It’s in the monitor manufacturer’s best interest for your monitor to get burn-in so that you have to replace it more often.

        The actual answer is probably a combination of multiple things, but that’s my guess.

        Honestly, setting a screen timeout (or even a screen saver!) is the solution to this problem. So the problem was more or less solved in the early 80s.
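
        If you wanted to roll your own, a minimal sketch of the idea looks something like the following. The two helper functions are hypothetical stand-ins, since the real “time since last input” and “blank the display” calls are OS-specific (X11/Wayland, Windows, and macOS all differ).

        ```python
        import time

        IDLE_LIMIT_S = 5 * 60  # blank the screen after five minutes without input

        def seconds_since_last_input() -> float:
            """Hypothetical: stands in for the OS-specific idle-time query."""
            raise NotImplementedError

        def blank_display() -> None:
            """Hypothetical: stands in for whatever DPMS/power-off call the OS exposes."""
            raise NotImplementedError

        def timeout_loop() -> None:
            # Poll occasionally; blank the panel once the user has been idle long enough.
            while True:
                if seconds_since_last_input() >= IDLE_LIMIT_S:
                    blank_display()
                time.sleep(30)
        ```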

  • ShortFuse@lemmy.world · 2 months ago

    Burn-in is a misnomer.

    OLEDs don’t burn their image into anything. CRTs used to burn the image right into the screen, making it impossible to fix without physically replacing the “glass” (really the phosphor coating).

    What happens is that the OLED burns out unevenly, causing some areas to be weaker than others. That clearly shows when you try to display all the colors at once (white), because some areas can no longer get as bright as their neighbors. The effect is reminiscent of CRT burn-in. LCDs just have one big backlight (or several if they have zones), so unevenness from burnout is rarely seen on LCDs, though it’s still a thing.

    So OLED manufacturers do things to keep areas from burning out from staying lit too long, like pixel shifting, reducing the refresh rate, or dimming areas that don’t change for a long time (like logos).
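
    As a rough sketch of what those mitigations amount to (my own simplified toy version, not any manufacturer’s actual firmware):

    ```python
    import numpy as np

    # The whole frame is nudged by a pixel or two on a slow cycle, so static edges
    # don't sit on exactly the same subpixels for hours on end.
    SHIFT_PATTERN = [(0, 0), (1, 0), (1, 1), (0, 1)]  # small orbit, stepped slowly

    def shifted_frame(frame: np.ndarray, step: int) -> np.ndarray:
        dx, dy = SHIFT_PATTERN[step % len(SHIFT_PATTERN)]
        # np.roll wraps the edge rows/columns; real panels hide this in overscan.
        return np.roll(frame, shift=(dy, dx), axis=(0, 1))

    def dim_static_regions(frame: np.ndarray, static_mask: np.ndarray, factor: float = 0.9) -> np.ndarray:
        # Logo/HUD dimming: drive pixels flagged as long-term static a bit less hard.
        out = frame.astype(np.float32)
        out[static_mask] *= factor
        return out.astype(frame.dtype)
    ```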

    There is a secondary issue that looks like burn-in, which comes down to the panel’s ability to track how long each pixel has been lit. If it can’t track that properly, it won’t produce an even image. This is corrected every once in a while with “compensation cycles”, but some panels are notorious for not running them (Samsung); once you do run one, it removes most of the commonly seen “burn-in”.

    You’d have to really, really leave the same image on your screen for months for it to have any noticeable effect in real-world usage, at least with modern OLED TVs. You would normally worry more about the panel dimming too much over a long period of time, but I don’t believe the lifetime is any worse than a standard LCD’s.

    TL;DR: Watch RTings explain it

  • warm@kbin.earth · 2 months ago

    Most people replace their phones before it becomes an issue. Phones are often used for small amounts of time as people pick them up to reply to a message, browse the web for a bit or watch a video. Monitors are on for long durations and have more stationary UI elements, so they will suffer from burn-in much sooner.

    • henfredemars@infosec.pub · 2 months ago

      Smartphones also benefit from vertical integration. Your iPhone, for example, knows it’s an iPhone, knows that the display uses OLED, and knows exactly what its properties are, so it can subtly vary the exact location of UI elements to help reduce the effect. Your desktop PC could do this in principle, but it doesn’t necessarily know the display technology with such certainty, so mitigations for one specific display technology haven’t been a priority on that platform.
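
      As a hedged sketch of that kind of mitigation (function and variable names here are hypothetical, not any vendor’s actual API): the OS redraws a persistent element at a slightly different offset every so often, spreading the wear over neighbouring pixels.

      ```python
      import random

      JITTER_RADIUS_PX = 2  # how far the element is allowed to drift, in pixels

      def jittered_position(base_x: int, base_y: int) -> tuple[int, int]:
          """Return the base position nudged by a small random offset."""
          dx = random.randint(-JITTER_RADIUS_PX, JITTER_RADIUS_PX)
          dy = random.randint(-JITTER_RADIUS_PX, JITTER_RADIUS_PX)
          return base_x + dx, base_y + dy

      # e.g. when the status bar is redrawn once a minute:
      # x, y = jittered_position(status_bar_x, status_bar_y)   # hypothetical coordinates
      # draw_status_bar(x, y)                                  # hypothetical drawing call
      ```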