Title almost says it all. OLED monitors are getting more and more affordable, but burn-in from toolbars and HUD elements almost takes them out of the picture when buying a monitor. I don’t understand why monitors “burn in”: when I use my LED flashlight or some LED xmas lights, they don’t just keep emitting the same light after I turn them off. I know it’s a dumb comparison, but still, what actually happens?
The other thing I don’t understand is that I’ve never seen any signs of burn-in on anyone’s phone. Alright, technically that’s a lie: I did see some on a work phone (or two) that had only some chat app open, seemingly for ages, and the name bar was a bit burned in, or something like that; as you’d guess, I also didn’t interact with that phone a lot. But as I said above, “but still”: I’ve had my phone for a while now, so have my family and friends, some of us even doomscroll, and I’ve never seen any signs of burn-in on any (actually used) phone.
So, I can watch my background all day, but if I open my browser every three hours or so and press F11 twice, I’m safe? If I’m away, just let the screensaver save my screen? In that case, why would anyone ever worry about burn-in? You’d almost have to do it intentionally. But if it’s really dangerous, say I immerse myself in a YouTube video that has the YouTuber’s pfp in the bottom right (does YouTube still do that?), and it was hbomberguy’s, am I just done, toasted, burned in?
Burn-in isn’t light being emitted when the screen is off; it’s light being dimmer when it’s on.
An LED works by passing current between two different semiconductors. When an electron jumps over the “gap” between those two semiconductors, it releases a photon of a particular color (determined by the size of the gap). But over time, as an LED is used, the gap can be damaged (by heat, by vibration, etc.); when this happens, fewer electrons can jump the gap, and so fewer photons are produced. Or the properties of the gap change so that it emits photons of a slightly different wavelength. So if you leave a particular set of pixels on, producing light, for an extended time, those LEDs will degrade more than the rest of the screen, leaving that area discolored or dimmer. This is burn-in.
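If it helps to put rough numbers on “degrade”, here’s a toy sketch in Python. The stretched-exponential shape and the tau/beta values are assumptions I picked purely for illustration, not figures from any real panel’s datasheet:

```python
# Toy model: how a single LED's brightness falls off with cumulative on-time.
# The curve shape and the tau/beta values are illustrative assumptions only.
import math

def relative_luminance(hours_on, tau=30_000.0, beta=0.7):
    """Fraction of original brightness left after `hours_on` hours of drive time."""
    return math.exp(-((hours_on / tau) ** beta))

for hours in (1_000, 5_000, 10_000, 20_000):
    print(f"{hours:>6} h of on-time -> {relative_luminance(hours):.1%} of original brightness")
```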
Most of the time, that’s fine, because the LEDs on your screen wear in a more-or-less uniform pattern. Your phone is somewhat less susceptible to this, since (1) you tend to have your phone screen off most of the time, (2) there aren’t as many persistent HUD elements even when it’s on, since every app has its own configuration of controls and UI elements, and (3) you tend to replace a phone more often than a monitor. When you replace your phone, it’s probably more-or-less evenly dimmer overall than it was when you bought it, but since you don’t have anything to compare it to, you won’t notice; with burn-in, though, the comparison is right next to the burned-in pixels.
By contrast, a computer monitor will typically be on for 8+ hours at a time, and persistent display elements are a part of every major operating system. If you’re not using the LEDs in a panel more-or-less evenly, you’ll end up with a persistent image.
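To make the uneven-wear part concrete, here’s the same toy decay model applied to a panel where a taskbar-shaped strip is lit the whole time the screen is on while the rest averages a lower duty cycle. The hours and duty cycles are made up; the point is the difference between the two regions, which is what you see as a ghost:

```python
# Back-of-the-envelope sketch of how uneven use becomes a visible ghost image.
# Hours, duty cycles, and the decay model are made-up illustrative numbers.
import math

def relative_luminance(hours_on, tau=30_000.0, beta=0.7):
    # Same toy decay curve as the single-LED sketch above.
    return math.exp(-((hours_on / tau) ** beta))

SCREEN_ON_HOURS = 8 * 365 * 3   # roughly three years of 8-hour days
TASKBAR_DUTY = 1.0              # taskbar pixels are lit whenever the screen is on
REST_DUTY = 0.5                 # the rest of the panel averages ~50% drive across varied content

taskbar = relative_luminance(SCREEN_ON_HOURS * TASKBAR_DUTY)
rest = relative_luminance(SCREEN_ON_HOURS * REST_DUTY)

print(f"taskbar region: {taskbar:.1%} of original brightness")
print(f"rest of panel : {rest:.1%} of original brightness")
print(f"ghost contrast: {rest - taskbar:.1%}")
```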
Thanks, that makes sense. But why don’t monitors have an “emergency” protocol to let the LEDs rest for a while, if we know the maximum stress they can handle?
So instead of it burning in, I’d get a pop-up saying I should do something, or it would lower the brightness in that area, or something like that.
I have five guesses:
(1) That would require more diagnostics than an LED in a monitor can provide at a reasonable cost.
(2) If you’re leaving the monitor on in a situation where burn-in is likely, you’re probably not at the monitor when it matters.
(3) Monitors are a mission-critical piece of hardware, so having them turn themselves off (or even just turn off certain pixels) at random is not a great idea.
(4) It’s probably the OS’s job to decide when to turn off the monitor, since the OS has the context to know what’s important and what isn’t, and how long it’s been since you interacted with the device.
(5) It’s in the monitor manufacturer’s best interest for your monitor to get burn-in, so that you have to replace it more often.
The actual answer is probably a combination of several of these, but those are my guesses.
Honestly, setting a screen timeout (or even a screen saver!) is the solution to this problem. So the problem was more or less solved in the early 80s.
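And if you’re curious how little is involved, here’s a minimal sketch of the same idea on Linux/X11, assuming the xprintidle and xset utilities are installed; your OS’s built-in display-sleep setting does the same thing more robustly:

```python
# Minimal do-it-yourself screen timeout for Linux/X11.
# Assumes the `xprintidle` and `xset` utilities are installed.
import subprocess
import time

IDLE_LIMIT_MS = 10 * 60 * 1000  # blank the display after 10 minutes without input

while True:
    # xprintidle reports milliseconds since the last keyboard/mouse input
    idle_ms = int(subprocess.check_output(["xprintidle"]).decode().strip())
    if idle_ms >= IDLE_LIMIT_MS:
        # Put the display into DPMS "off" mode; any input wakes it again
        subprocess.run(["xset", "dpms", "force", "off"], check=True)
    time.sleep(30)  # poll every 30 seconds
```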
very good points, thanks.
Bold assumption
Fair point, I don’t know you. The average phone user, then. Most people use their phone about 4½ hours a day.
My retort was in fact a joke. I am terminally online, and even I only average about 6 hours. But thank you.
Edit: Just checked, my average screen time this week was about 7 hours per day.