Title almost says it all. OLED monitors are getting more and more affordable, but the burn-in risk from toolbars and HUD elements almost rules them out when buying a monitor. I don’t understand why monitors “burn in”; when I shine my LED flashlight or run some LED xmas lights, they don’t just keep emitting the same light after I turn them off. I know it’s a dumb comparison, but still, what happens?
The other thing I don’t understand is that I’ve never seen any signs of burn-in on anyone’s phone. Alright, technically that’s a lie, I did see some on a work phone (or two) that only had some chat app open, seemingly for ages, and the namebar was a bit burned in, or something like that; as you’d guess, I also didn’t interact with that phone a lot. But as I said above, “but still”: I’ve had my phone for a while now, so have my family and friends, some of us even doomscroll, and I’ve never seen any signs of burn-in on any (actually used) phone.
So, I can watch my background all day, but I should open my browser every three hours or so, press F11 twice, and I’m safe? If I’m away, just let the screensaver save my screen? In that case, why would anyone ever worry about burn-in; you almost have to do it intentionally. But if it’s really dangerous, like if I immerse myself in a YouTube video, but it has the youtuber’s pfp in the bottom right (does YouTube still do that?), and it was hbomberguy’s, am I just done, toasted, burnt in?
First of all, LED is not the same as OLED. The O stands for organic. They’re more sensitive and sort of break down over time (maybe a really crappy explanation, someone with more knowledge please help), especially the blue color.
LEDs and OLEDs work the same way, the only difference is their composition. Standard LEDs use metals, OLEDs use organic compounds (which, yes, are more sensitive to breakdown over time, but come with the advantage of being smaller, lighter, more flexible, etc).
And actually, it’s that size and flexibility that makes an OLED panel possible. An “LED” display is really just a color LCD display with a white LED backlight; you need OLED to have the individual pixels generate their own light. Burn-in on a non-organic LED display would be a completely different thing (and is possible, but rare).
It’s a nitpick, but since we’ve got the important details out of the way: technically they’re semimetals, or simple compounds with semimetallic properties.
An actual metal doesn’t have the separation between electron bands necessary to support multiple different conduction regimes (i.e. the magic). Again, a nitpick.
Interesting. I knew they were semiconductors, but I didn’t know they were also semimetals. Thanks for the details!
Yeah, there’s a lot of overlap there. TBH I suspect semimetal is just what we called things in between the two electronic structures before we had quantum mechanics to explain it, but that’s a guess.
Like I mentioned in my own reply, silicon is fairly metal-like physically, but it’s hard and brittle like diamond above it on the periodic table, as opposed to being ductile like every true metal is to a degree (AFAIK).
It’s only marginally more detailed on the chemistry, but conventional (inorganic) semiconductors are roughly like metal. Silicon is literally a shiny, meltable element, although it’s more brittle than a true metal. Like metal, unless it reacts with something, as in rusting, it’s probably going to stay the same kind of material. There are exceptions if you really abuse a piece of metal, but a layman probably hasn’t thought much about tempering and differences in crystal structure.
Organic semiconductors are organic compounds, like a dye. Just as a dye can bleach or change over time, they can chemically change with a lot less outside help. This makes them prone to not working the same way after prolonged use. Blue and violet are the most energetic colours, so it makes sense the components emitting them would break down first. UV OLEDs would be even worse.
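To put a number on “most energetic”: a photon’s energy is E = hc/λ, so shorter (bluer) wavelengths pack more energy into the emitter with every photon. A quick back-of-the-envelope check (the wavelengths are just typical peak values, not figures for any specific panel):

```python
# Photon energy comparison: why blue sub-pixels take the most abuse.
# E = h * c / wavelength, converted to electron-volts.
PLANCK = 6.626e-34  # Planck constant, J*s
C = 2.998e8         # speed of light, m/s
EV = 1.602e-19      # joules per electron-volt

wavelengths_nm = {"red": 630, "green": 530, "blue": 460}

for color, wl in wavelengths_nm.items():
    energy_ev = PLANCK * C / (wl * 1e-9) / EV
    print(f"{color}: ~{energy_ev:.2f} eV per photon")
# red comes out around 2.0 eV, blue around 2.7 eV
```

So each blue photon carries roughly a third more energy than a red one, which tracks with blue emitters degrading first (and UV, at even shorter wavelengths, being worse still).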
The reason we don’t use arrays of inorganic LEDs as screens (yet) is that they’re really picky and exacting to grow. The standard way to do it is to grow a wafer, which is about screen size, and then cut it up into tiny sections, only some of which will work, and each of which is worth something significant on its own. OLED can be grown in less crazy conditions, more like just printing with ink, and MicroLED gets around the wafer problem by using a precise robot arm to handle, test, and connect each sub-pixel individually.
Cool video I saw a few days ago about the challenges of making true microLED displays. They would become the unchallenged king of displays if it became economical to make them, and they wouldn’t have any of these issues.
https://www.youtube.com/watch?v=8_2KcB8JkfE
Just to tack on and expand on your first point: LED monitors are normally LCD displays but with LED backlighting, allowing for more zone control and better efficiency in both space and energy usage.
For TVs, burn-in is becoming less of an issue due to software in newer models and improvements in the tech. The same goes for phones. Older OLED phones, like the Pixel 2 I think, had issues with burn-in.
Rtings is actually doing a long-term torture test as we write. They have also included some PC monitors for good measure.
In general, the reason it’s still not perfect for PC is that all office/daily use keeps a static image on a large portion of your screen. Imagine a browser, Excel, or a program with a big static toolbar. This will cause issues even with pixel shift and refresh cycles. You can only move pixels so much without it affecting your experience.
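A toy model of why pixel shift only helps so much (this is an illustrative sketch with made-up numbers, not how any real panel firmware works): shifting spreads a static toolbar’s wear over a few neighboring pixels instead of one, so the peak wear drops but doesn’t disappear.

```python
# Toy wear model: a toolbar edge lit for 3000 hours, with and without
# pixel shift. Each entry is cumulative "on time" for one pixel column.
WIDTH = 10
HOURS = 3000
shift_cycle = [0, 1, 2, 1]  # the edge wanders across ~3 columns

wear_static = [0] * WIDTH
wear_shifted = [0] * WIDTH

for h in range(HOURS):
    wear_static[0] += 1                    # no shift: same column every hour
    wear_shifted[shift_cycle[h % 4]] += 1  # shift: wear spread over 3 columns

print(max(wear_static), max(wear_shifted))  # 3000 vs 1500
```

The total wear is the same either way; shifting just flattens the peak, which is why a big static toolbar still eventually shows, and why you can’t shift far enough to fix it without visibly moving the image.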
If you were to only game or watch movies on it, it would likely never show signs of burn-in.
Hope this made sense
Yes, but our phones are OLED and most TVs sold are plain old LED.