
While the headline implies it's talking about LCD displays, it actually refers to LCD panel technology. The innovation is in the backlight technology behind the LCD panel, so LCD displays still have an exciting future ahead of them.


To be blunt: an exciting future in the mass market and the mid-to-high tiers. Which is perfectly fine.

For the premium tier, the cost of elevating picture quality to a competitive level (with elaborate backlight tech, multiple layers of LCD, etc.) is simply too high.

Samsung was only riding LCD in premium TVs for so long because they didn't have the technology for large-size OLED. So they spent a ton of money to establish the "QLED" brand as a new name for "LCD with yet another backlight" and started telling people that brightness (not black-level or contrast) is the most important thing in a TV (because a uniform LCD backlight can be brighter than per-pixel OLED).


The thing is: can I buy a big OLED and use it for whatever, without doing anything special to take care of it, and not have it suffer from burn-in 10 years from now?


"burn-in" on OLED is actually the degradation of brightness of LEDs based on their operation hours. The impact is just more noticable on OLED because the pixels degrade individually (instead of LED-LCDs, where both brightness and color degrades uniformly across the whole zone of the backlight)

There are lots of methods now to make this degradation more uniform on OLED, like slightly shifting the image to distribute the load onto multiple subpixels, "cleaning" the panel when it's switched off, down to measuring the operating hours of individual pixels and adjusting the brightness of surrounding pixels to blend the degradation.
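For illustration, here's a toy sketch of the first and last of those ideas (pixel orbiting plus per-pixel wear tracking with compensation). The decay model and numbers are made up, and real TV firmware does this in dedicated hardware, so treat it as a sketch of the concept only:

    import numpy as np

    H, W = 1080, 1920
    wear = np.zeros((H, W))  # accumulated per-pixel "on time"

    def orbit(frame, t):
        # Shift the image by a pixel on a slow cycle so static content
        # doesn't always load the exact same subpixels.
        dx, dy = [(0, 0), (1, 0), (1, 1), (0, 1)][t % 4]
        return np.roll(np.roll(frame, dy, axis=0), dx, axis=1)

    def display(frame, t):
        # frame: HxW grayscale luminance in [0, 1]
        global wear
        shifted = orbit(frame, t)
        wear += shifted  # brighter pixels age faster
        # Drive aged pixels slightly harder so perceived luminance
        # stays uniform (works until brightness headroom runs out).
        remaining = 1.0 - 1e-9 * wear  # toy linear decay model
        return np.clip(shifted / remaining, 0.0, 1.0)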

Overall you can choose what you want to degrade over those 10 years: the LED brightness and color accuracy of a whole zone / backlight unit (LED-LCD), or have the OLED panel actively compensate its aging (which has gotten a LOT better in the past years).

Either way, I wouldn't put my money on QLED (which is not a technology but a brand). QLED is actually far worse than OLED in lifetime [1], but since it was driven by the marketing powerhouse of Samsung this information is barely known. Considering how expensive it is to produce, and that it was basically only made by Samsung to buy time until they could produce OLED as well, I wouldn't expect it to ever become a mature technology. It's more likely that they will replace the Quantum Dot backlight with something cheaper and still keep the brand "QLED" (if they haven't done that already).

AH-IPS LCD panels with a zoned LED backlight probably still provide the most stable picture quality over long periods of use, compromising on image quality compared to OLED. But the last really good TV with such a panel was produced quite a few years ago already...

[1] https://www.nature.com/articles/s41467-019-08749-2


I don't care much about average panel degradation.

I do care about seeing ghost letters or shapes on top of the latest TV series I'm watching.

Edit: The QLED article is from 2019, that's quite a few years ago for cutting-edge tech. I imagine the tech is constantly improving.


You might be interested in the RTings longevity test, which is exactly about that. The current round is playing CNN for 20 hours a day for 2 years, and they already have some results at the 6-month mark.

https://www.rtings.com/tv/tests/longevity-burn-in-test-updat...


Thanks!

This is a great overview demo of current panels! You can already see some very slight degradation in some of LG's OLEDs, and more in many of the Sony/Vizio (and Samsung) OLED panels.

The others mostly look like standard LCD and DDI/Flex defects rather than burn-in so far.


You'll be fine with any of the existing technologies of today. In any case you will see luminance degradation over time; in the case of LED-LCD or QLED, also color shifts of backlight zones.

But none of this is new. In reality, I barely heard people complain about their LCDs losing brightness and becoming color-tinted (usually yellow) each year. And yet, all of this happened to 100% of LED-LCD panels.

> Edit: The QLED article is from 2019, that's quite a few years ago for cutting-edge tech. I imagine the tech is constantly improving.

Yeah, there's a lot of ongoing research to improve QD lifetime as a foundation for new LED technology. Here's one paper from Dec. 2022 [1], and one from 2023 [2], both aimed at solving the performance stability of the technology to get it on par (!) with common LEDs.

[1] https://pubs.acs.org/doi/full/10.1021/acsaelm.2c01351

[2] https://pubmed.ncbi.nlm.nih.gov/36990060/


You're missing the point:

> I don't care much about average panel degradation.

> I do care about seeing ghost letters or shapes on top of the latest TV series I'm watching.

Nobody cares about "luminance degradation", they just don't want ghosting.


I think the point you're trying to make is that you can calibrate away LCD backlight luminance degradation because it's uniform across the panel. And you don't notice it because your eyes adapt to the whole screen.

With OLED, not so much. OLED does have tech to compensate based on total pixel usage. You might notice it if you tried to swap a panel without its electronics. I'm not aware of how widespread it is, but I know iPhones can do it.
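To make the asymmetry concrete, a toy comparison (illustrative numbers only, not real calibration code): uniform backlight aging can be undone with a single measured gain, while OLED compensation needs a wear value per subpixel:

    import numpy as np

    # LED-LCD: backlight dims uniformly -> one scalar fixes the panel.
    measured_luminance = 0.85           # e.g. 15% loss after some years
    backlight_gain = 1.0 / measured_luminance

    # OLED: each subpixel ages by its own history -> a full wear map.
    wear_map = np.zeros((2160, 3840, 3))  # ~25 million values to track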


No I did not. Check the first sentence of my reply: "You'll be fine with any of the existing technologies of today."

And "ghosting" refers to artifacts on fast movement due to lag in the panel refresh; it's something entirely different.


> You'll be fine with any of the existing technologies of today

I think you're going too far with your statement. OLED/QD-OLED, and to a lesser degree WOLED, all suffer from significant burn-in, which makes them pretty bad for desktop use.

Previous LCD technologies will degrade over the years by losing luminance and color accuracy, but that's a pretty minor issue compared to OLED as far as I'm concerned.


From context, you know the post is talking about ghosting from image retention, not ghosting from motion blur. Please don't be pedantic. We're not all versed in the terminology.


I'm pretty sure that on HN, being pedantic is basically required by law.


Sorry, but I've been in that industry (LCD and OLED) for over 10 years and this doesn't match my understanding. QLED is basically an LCD technology with QuantumDot on top. It is just Samsung's marketing.

Almost all mid-tier LCD TVs currently use QD (whether in the backlight or the color-filter layers). They don't use blue QD (only the longer-lifetime red/green), but crystalline InGaN for the blue backlight. Other lower-end LCDs use R/G phosphor instead of QD to produce white, which also has long-term burn-in issues, but as you mention they are not individual to pixels. Color shift of (global backlight) screens over the years can be compensated, but since environmental lighting has a bigger effect on perceived color, I'm not sure why you'd bother (mini-LED can be worse).

Even the very best currently shipping OLEDs (not yet phosphorescent blue) have visible burn-in at ~2000 hr, even with image-shifting and de-contrast techniques. Early 2014 QD (red/green) panels have visible burn-in at ~7 khr and modern ones at about 20 khr. They're about 10x better than OLEDs, but you could still see it several years in (rather than months in for OLED).

I've got no great love for QLED, but R/G QD just doesn't have the same level of burn-in problems that fluorescent Blue OLED does, and neither do Blue InGaN LEDs.

https://en.tab-tv.com/samsung-has-released-recommendations-f...

I'm including this link, because the SID.org papers are all paywalled.
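As back-of-envelope arithmetic on those hour figures (the thresholds are the rough ones I quoted above; the viewing habits are just assumptions):

    thresholds_hr = {"fluorescent-blue OLED": 2_000,
                     "early (2014) R/G QD": 7_000,
                     "modern R/G QD": 20_000}
    for tech, hours in thresholds_hr.items():
        for hours_per_day in (4, 20):  # casual viewing vs. torture test
            years = hours / (hours_per_day * 365)
            print(f"{tech}: {hours} h at {hours_per_day} h/day ~ {years:.1f} yr")

At a casual 4 h/day even the 2000 hr threshold is over a year out; at RTings-style 20 h/day it's roughly 100 days, which matches the "months in" framing above.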


Same here, working with and in that industry for ~17 years on technology strategy. I'm no evangelist either, neither for LCD nor OLED.

I frankly don't know how you technically define "visible burn-in" in the context of QD (do you mean actual pure QD tech, or Samsung's mixed use of the tech in QLED TVs?).

If we can agree that: 1.) The jargon "burn-in" on OLED describes the luminance degradation of individual sub-pixels (i.e. a weaker red because the operating hours of a red sub-pixel were much higher than its neighbours').

2.) Samsung's "QLED" is a brand for an LCD panel with the backlight created using R/G QD and blue LEDs (instead of blue/yellow LEDs). The backlight on a standard QLED is separated into ~700 zones across the display, so you have huge clusters of pixels using the same backlight zone.

Then: what is the equivalent meaning of "burn-in" in the context of QLED? The luminance degradation of a whole zone?

Or is it the "legacy" meaning of burn-in on LCD: the occurrence of weaker/dead pixels due to issues in the TFT layer (broken/weak transistors no longer properly rotating the crystals)?

Either way, I can't confirm and don't agree that OLED TVs will see visible "burn-in" after ~2000 hrs of normal use. First-gen models, yes, but definitely not currently shipping models.


I suspect we agree more than disagree. I was on the driver/TCON development side. I remember the QD guys demoing new OLED TVs at DisplayWeek '17 that showed visible degradation from extreme content (B&W bars and logos) at 64 sRGB gray (10 min B/W, 1 min gray) after ~72-96 hrs during the show. At show setup it was invisible, and by the end of the show it looked terrible. You could swap the cables between TVs for comparison.

Similar to standard definitions of Mura, a method of defining visible burn-in would have to define the burn image (16 px B&W block or CNN logo?), the detection image (64 gray?), viewing distance (30 deg FoV?), and visible contrast (+/-4 MND?). I guarantee you all the OEMs have a spec.
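Something like this, encoding the parenthetical examples above as a hypothetical spec (the field values are just my suggestions from this comment, not any real OEM's numbers):

    from dataclasses import dataclass

    @dataclass
    class VisibleBurnInSpec:
        burn_image: str = "16px B&W blocks"       # or a CNN-style logo
        detection_image: str = "uniform 64 gray"  # field that reveals ghosts
        viewing_fov_deg: float = 30.0             # distance as field of view
        visible_contrast: str = "+/-4 MND"        # detectability threshold

    spec = VisibleBurnInSpec()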

Here's a set of modern 20 hr/day tests at 6 months (posted above) where OLED has visible degradation (mostly Sony/Vizio but also Samsung and LG). The new ones are definitely better!

https://www.rtings.com/tv/tests/longevity-burn-in-test-updat...


As far as I can tell, the QLED screens on the market are not based on the QLED technology that article is about, and the lifetimes given in that article are obviously far too low to build a usable television set with. They claim non-uniform degradation between the different colours, with blue in particular dropping to half its initial output after just 20 hours - that's nothing compared to the life of a TV set. Samsung QLED TVs seem to use standard LEDs in the backlight with a separate quantum-dot layer that improves colour accuracy and gamut.


It is the same technology. Samsung is just compensating for the deficiency of blue QD by using conventional blue LEDs in their QLED TVs (creating white light with green and red QD, as you wrote). The point is that the degradation of the QD material still exists, and along with the usual degradation of the blue LED, the QD material will also degrade and shift over time.

The research paper describes the lifetime and degradation in a uniform setup, as it's research on foundational QD tech, not on TV panels.

There's a lot of ongoing research to improve QD lifetime as a foundation for new LED technology. Here's one paper from Dec. 2022 [1], and one from 2023 [2], both aimed at solving the performance stability of the technology to get it on par (!) with common LEDs.

[1] https://pubs.acs.org/doi/full/10.1021/acsaelm.2c01351

[2] https://pubmed.ncbi.nlm.nih.gov/36990060/


There is certainly a difference in either technology or implementation. I've seen LG OLED TVs get noticeable lettering burn-in after a couple hundred hours, and on the other end of the spectrum there's the guy who left an unmoving bright-dark pattern on his OLED Switch for over a year straight before seeing any.


> The impact is just more noticeable on OLED because the pixels degrade individually (unlike LED-LCDs, where both brightness and color degrade uniformly across the whole zone of the backlight)

Don't organic LEDs also degrade more than their "normal" counterparts?

> Overall you can choose what you want to degrade over those 10 years: the LED brightness and color accuracy of a whole zone / backlight unit (LED-LCD), or have the OLED panel actively compensate its aging (which has gotten a LOT better in the past years)

I'd imagine it would be FAR easier to compensate the backlight, as you just need to re-calibrate one string of LEDs that all see the same load over the lifetime of the monitor, vs. somehow tracking per-subpixel usage of each OLED.


> Don't organic LEDs also degrade more than their "normal" counterparts?

Yeah, amplified by their much smaller size, having less surface area for heat dissipation, and also depending on the pixel color in RGB-OLED setups. On TVs, WOLED panels are used instead of RGB-OLED, which allows for better heat distribution and a uniform lifetime regardless of pixel color.

> I'd imagine it would be FAR easier to compensate the backlight, as you just need to re-calibrate one string of LEDs that all see the same load over the lifetime of the monitor, vs. somehow tracking per-subpixel usage of each OLED

For luminance there is nothing to compensate, as it simply degrades uniformly. The peak brightness simply decreases over time.

Color shifts cannot be re-calibrated that well because the shift happens due to the chemical decomposition of the phosphor in each LED, which causes the color of the light going INTO the color filter to shift (the base light that is filtered into RGB is no longer white).

Once it starts it's not a binary process: the color keeps shifting. Manual recalibration often has no real benefit then, as the color shift continues and is not uniform across the whole display.


> Color shifts cannot be re-calibrated that well because the shift happens due to the chemical decomposition of the phosphor in each LED, which causes the color of the light going INTO the color filter to shift (the base light that is filtered into RGB is no longer white).

But you can change the white balance of most monitors, right? So they at the very least need cold-white and warm-white LEDs; just change the balance of those. Using RGB LEDs in the backlight could also allow for that correction.


> It's more likely that they will replace the Quantum Dot backlight with something cheaper and still keep the brand "QLED" (if they haven't done that already).

Or something more expensive? Not sure if OLED is more expensive or not, but they are already using it as a backlight at the high end. Specifically blue, which seems to be the issue for QLEDs if the paper is anything to go by.


My point was that "QLED" is actually a registered brand and doesn't represent a specific technology, so Samsung Display could change the underlying tech, adjust the meaning of "QLED" and keep going.

If they replace it with something more expensive they usually give it a new name to sell the higher price, like "Neo QLED"


The (tinfoil hat?) thing about the thing is: Does Samsung actually want their TVs to survive a whole 10 years?


Considering that most of them are Internet-connected now, they can just stop patching the firmware to obsolete them.

TV takes 10 minutes to load Netflix because it's part of a botnet? Better buy a new one.


Oh, Samsung are already "accidentally" sending out broken patches to TVs that happen to brick them. Coincidentally this happens just a few months after their warranty expires. Whoopsie!

It happened to me and it happened to a friend of mine, just a few months apart.

Never ever connect a Samsung TV to the Internet under any circumstances.


I'll take it one step further: never plug in a Samsung TV. Or better yet, never let one into your home.


This might backfire in EU countries since EU consumer protection laws require extended warranty for devices expected to last longer. For TVs it's 5 years I believe.

So if they're bricked after two years you could take it to the retailer.


With smartphones this would be a problem. With TVs you do still always have the option to use it disconnected/dumb with your own computer, chromecast, TV box, console, whatever.


If a corporation like Samsung became evil, I wonder if they could make a bunch of money running a DDoS botnet and attacking casinos or something.


Do consumers want to pay the premium for such a TV? And would those that do choose Samsung?


The problem is Trust.

Consumers don't trust manufacturers. If I pay a 2x premium I have zero trust that the product is better made or will last longer. My $300 laptop worked for 10 years; my $3000 laptop had issues after 2 years.

We have a Toshiba TV that has lasted 15 years. Kinda getting to the end of its useful life - it's 720p.

On the other hand, I don't expect image technology to keep advancing at the same rate; we are at the useful limit of resolution, for example. So maybe if I was buying today, I would actually keep the TV for 20 years.


> Consumers don't trust manufacturers. If I pay a 2x premium I have zero trust that the product is better made or will last longer. My $300 laptop worked for 10 years; my $3000 laptop had issues after 2 years.

I'm confused — why are you talking about trusting "manufacturers" as a whole? Isn't the whole point of your anecdote that one manufacturer is trustworthy and the other is not?


As far as I am concerned, none of them can be trusted. For me there is no way to know which product is designed to last - I bought a few that are, by random chance alone.

Well, maybe there are a handful, in some markets, but Miele does not make TVs.


I don't really know much about TVs, but in my experience, Apple certainly makes long-lasting products - my 2008 and 2012 MacBooks and 2014 phone are still in good shape, albeit with dead batteries. In cars, from second-hand experience (I don't drive and hope to never start), Volvo still has great quality.

Personally, I know more about clothes. Once you switch from fast fashion (Zara, Bershka, Uniqlo) to quality brands (for me, it's Rick Owens, Margiela, Thom Krom and McQueen; adjust for your personal style), they do last a lot longer - I still wear T-shirts that I bought in 2016. However, it means that I shop much less often, own far fewer clothes, and they're all the same style (and the same colour - full black). And I see that people complain about degrading quality, but what the majority of them _actually_ want is a regular dopamine hit from shopping and a wide variety of garments.

You can eat fast food every day, or spend the same budget eating at a Michelin-starred place once every couple of weeks (1 star; for 2 or 3 stars it would be once a month or rarer). People tend to choose the former, but then blame "the industry" for their own choices.


That's the thing about the thing about the thing, I have a 10 year old Samsung LCD that's just fine.


I have 7 LN46A's now and left over parts from others that had panel defects.


Is it actually just fine, or have you not noticed the slow and gradual degradation of its color accuracy and brightness?


I haven't noticed it. So it's not a problem.

I DO notice OLED burn-in, so that *is* a problem.


Rtings are doing a major test which seems to indicate it's not a major issue for the majority of modern OLEDs: https://www.rtings.com/tv/tests/longevity-burn-in-test-updat...


Some of these look pretty rough. I would be pissed if my TV had the burn-in I see with some of those Sonys, for example.


There is a mix of newer and older OLED and LED TVs in there, and that is 6 months, 20 hours a day of watching CNN. That said, it doesn't look quite as good as the last time I checked on the results. The new QD-OLEDs seem to do worse than standard OLEDs.


If anything I am impressed. They are running these at max brightness with static images on screen for 20 hours a day. I would have guessed the burn in would have been way worse.


There are no control images there to tell you how much it actually degraded. The panels might not be uniform when brand new either.


At the top of the table, you can choose to view month 0, which would be the day they started the test. There's your control.


The real answer is that it doesn't matter: the vast majority of people buying bleeding-edge OLED TVs (yours truly included) will not use the same TV for 10 years.

And for exactly the same reason most people buying high-end computers and phones won't — the tech will have moved on in 5 years to such a degree that newer products will be significantly better.


I bought a 43" Sanyo LCD TV (720p) in 2004 that still works today. It's good enough for movies.

I have a 15" Samsung CRT TV from the 90's that still works great. I use it for retrogaming.

> the vast majority of people buying bleeding-edge OLED TVs (yours truly included) will not use the same TV for 10 years.

IMHO people mostly upgrade their TV because a bigger one comes out. At some point we will reach the maximum practical size for a home TV (kinda like CD drives maxing out at 52x) and then I think people will buy them less often. TVs are already kinda cheap and boring, and if they don't keep getting bigger, the average consumer will eventually treat them as a commodity item (and maybe they'll be priced that way too).


I agree. I am a moderate TV user. I bought a new one about 3 years ago and went with a 65". I could have fit an 80" on the wall, but it didn't seem necessary. And I'm glad I didn't. The 65" is already too big if you get up off the couch and stand nearer to it. It's kind of disorienting to watch from any closer than the opposite side of the room. And it was a mid-range TV at the time and still looks plenty good to me. I don't foresee upgrading anytime soon, unless something goes wrong. Honestly, the thing I like least about it has nothing to do with the display: it's the annoying Android software.


Sure, but then by your description you're not someone who's interested in buying bleeding-edge tech (at least not for TVs).

I'm not saying those people don't exist — they clearly do!

But those are not the people who buy the most expensive/best TVs on the market.


You should certainly be prepared to buy a new TV/monitor after 10 years of regular use, with OLED. If not burn-in or color drift, brightness is likely to decrease noticeably.


Is burn in still a thing? I was under the impression burn in rate had been reduced to a reasonably negligible value in modern OLEDs.


Yes, burn-in still exists on new flagship OLED panels.

But you're very unlikely to experience it compared to past technologies, even if you are displaying a static image for extremely long periods


OLED panels will always suffer from burn-in in some form. What I meant by "is it a thing" is "will users ever experience it to a detectable degree". I guess the answer here is "no" then.

I've seen the video where the guy leaves a bright high-contrast static image on his Nintendo Switch OLED for 10,000 hours before finally seeing a small amount of burn-in, but that's not a level that should concern average users. I was just curious if typical large screens fared any worse than the new Switch.


I just want the tech to get to where it's like consumer SSDs: they do have limited wear endurance, but practically no home user will ever get close to it on high-capacity modern SSDs.


Plus, the TVs have good default settings to prevent it even if you're negligent.


> even if you're negligent.

It's a TV!!! There shouldn't be any "negligent" :-)

Some people are fine with tiptoeing around the problem; I imagine most people just don't do anything and mess up their TV.

I shouldn't have to configure FVWM for 20 days to be able to operate the appliance correctly.


CRTs had the same problem for decades, so while I agree it shouldn’t I’m certainly not unfamiliar.


You wouldn't have to configure anything, even on older TVs. But there are things you can do that will eventually harm it, like leave the same bright image on the TV for hours. That's negligence, especially if the manual tells you not to.

The newer TVs can detect that scenario and turn the TV off automatically, unless you specifically disable that setting.

TVs are a luxury good, and there will always be ways to be negligent with anything. Leaving them out in the rain is negligent. Some cleaners will harm the screen, I'm sure. Using them in a too-hot or too-cold environment will harm them. I don't expect my TV to be battle-hardened, and I'm certainly not going to pay the price for that. Instead, I'll take care of it.


CRT TVs automatically degaussed every time they were turned on...


I have an LG... if you leave your Nintendo Switch paused on a static image, it's going to put a screensaver on after a while. It knows the image is static.


Reduced yes, negligible not really.


Which was always so silly. No amount of screen brightness can make up for loss of detail in a dark scene, and even OLED maximum brightness is enough to hurt your eyes.


Small-format OLEDs are great, but large-format ones aren't there yet.

1. Large-format OLED maximum brightness is very low. LG C2 peak SDR brightness is some 425 nits. A Samsung Neo-QLED QN90B does 1200 nits, and an iPhone 14 (small OLED) does 800 nits. In my experience, a room with windows facing the sun through thin curtains requires 1000-1500 nits of SDR brightness to be comfortably legible. 2000 would be nice to have, but that's not available to the masses yet.

2. That SDR brightness is only maintained for a 10% window, and quickly drops off for larger bright areas. At 50%, it's only 350 nits. The QN90B does 850 nits sustained at 50%.

3. The same panel does a peak of 600-700 nits in HDR, well below the 1000-4000 nits generally needed for HDR content. An iPhone 14 does 1200 nits; the QN90B does around 1700 nits. The high peak is needed to give the proper "blinding" effect to e.g. skies or lights, as opposed to just looking like SDR-style bright white surfaces, especially outside a dark cinema room.

4. WOLED, the dominant panel technology for large high-end OLED panels, loses color saturation quickly as it approaches its (already low) maximum brightness, due to relying on a white subpixel for raw luminance.

5. Mini-LED backlights for LCD with 10k+ dimming zones - while a stopgap - push dimming artifacts far enough down to be competitive on contrast, while destroying OLED on brightness and color saturation.

Large-panel OLED will beat LCD eventually, but right now it has many shortcomings. QD-OLED shows significant promise, with the latest generations finally starting to see some decent brightness, but it's early (and rtings reports both thermal throttling and some quite unfortunate burn-in behavior on those).

And yes, 1000+ nits for SDR is retina-searing once the sun sets, but for those who have forgotten how sunlight feels, I can report that it is quite bright and requires effort to outshine. Something like ambient light control is necessary to adjust throughout the day so you don't accidentally go blind.
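For a quick sanity check of the figures above against the 1000-1500 nit sunlit-room estimate (the nit numbers are the ones quoted in this comment; the target is my own estimate, not a standard):

    sdr_nits = {
        "LG C2 (peak SDR)": 425,
        "LG C2 (50% window)": 350,
        "Samsung QN90B (peak SDR)": 1200,
        "Samsung QN90B (50% sustained)": 850,
        "iPhone 14 (small OLED)": 800,
    }
    target = 1000  # lower end of the sunlit-room range above
    for panel, nits in sdr_nits.items():
        ok = "legible" if nits >= target else f"{target - nits} nits short"
        print(f"{panel}: {nits} nits -> {ok}")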


Personally, 500 nits full screen white at 3m away is enough to hurt my eyes. I’m grateful OLEDs exceed that only for very small areas and briefly.

Also, no amount of screen brightness can make dark scenes visible in a bright room. You have to make the room darker anyway, so why bother with a brighter screen?


Because lots of people like a bright living room and want to watch Youtube without pulling the curtains?

Sure, to get the full cinema experience you should darken the room, or just watch at night. But having a bright and legible screen without having to close the curtains is still worth a lot.


Same. Sometimes I even have to dial it down if I'm watching in a dark(er) room. I also try to have the lights on in the room/hallway next to my living room just so my eyes don't bleed when a bright scene appears.


This is just plain false. Sufficient screen brightness works with current-generation LCDs; OLED specifically is just incapable of delivering it yet. 500 nits is only bright in a darker room.

Darkening rooms in the daytime for non-cinema watching is just a workaround, which is both impractical and not necessarily preferred. 1000-1500 nits is not perceived as bright in daylight, but is perfectly comfortable and legible. A little more would be nice, but not much.

(If you are light sensitive and it hurts because of that, then this is a different topic - but that is not the norm)


Glare from windows, or reflected off light walls/furniture, can easily exceed that, so any dark scenes (or a dark editor theme) are entirely illegible. No amount of screen brightness can make blacks darker under glare, which will show up as the brightest white the screen can produce.

Darkening the environment is the only option for displaying any dark colours and it happens to make high screen brightness unnecessary.


I bought a monitor with 600 nits so that in the summer I can work at home without drawing the curtains. I can have sun in the room and still read the display without issues.


The current top-of-the-line LG OLEDs hit ~1500 nits in HDR: https://www.rtings.com/tv/reviews/lg/g3-oled

The QD-OLEDs from Samsung, which use a similar trick to their "Quantum-Dot" LCDs, also hit similar levels: https://www.rtings.com/tv/reviews/samsung/s95c-oled#page-tes...


Both of those tests report real SDR brightness of 500 nits, or 600 for 10% sustained. The QN90B test reports 1200 nits and 2000 (!) nits respectively: https://www.rtings.com/tv/reviews/samsung/qn90b-qled

Real scene SDR brightness is all that matters for use in a bright room, as it represents the overall brightness of the screen when playing normal content. HDR brightness on the other hand is only observed in certain scenes of particular HDR content mastered to use it - not to mention that such HDR content is more likely to be consumed in a dark room to enjoy the darker details not present in SDR.

I think the limiting factor right now on those new OLED sets is thermal throttling and power limits. But all is not bad - it shows great promise of a near future where self-emissive panels have good brightness.


Peak small-window nits is not the proper Rtings metric for viewing in a bright room. It's Sustained 100%, which is only ~300 nits on the brightest, most expensive OLED panel on the market. A bright LCD panel puts out 600-800 nits here. This is the closest metric to something like watching sports in a well-lit room.


rtings "real scene" brightness is a better measure than 100% sustained - I think some brightness limiting is acceptable for a fully white screen.

Still, that's 500 for those OLED sets vs 1200 for the QN90B.


When I was choosing a new TV I was a bit concerned about brightness. 2000 nits is so much better (on paper) than 400 nits. But I wasn't planning on putting my TV outdoors and got a C2.

My current brightness settings are at 30% when watching TV at night. The "bright room" setting bumps pixel brightness to 45%, which is good enough for a room with windows on 3 out of 4 walls. It is much brighter than the 10-year-old LCD it replaced. And I had no problem using the old one until I put the two screens side by side.


I also know people who are happy with their OLEDs, but if you use your OLED at anything less than 100% brightness throughout the day, the only reasonable explanation is that your living room is much darker than mine. Which is fine, it makes it easier for you to enjoy your TV, so good for you - but if someone suggested that an iPhone never needed more than 30% screen brightness (which, if the scales match, would be brighter than your TV setting), I'm sure everyone would agree that they were wrong, even if it is sometimes enough.

For reference, I have my QN95A on max brightness (with auto-dim to not be killed at night), and it's not quite as bright as I want, but bright enough to be comfortable - and I don't think I'm being unreasonable. I do have windows both left and right and can look straight at the sun from my couch position twice a day (sometimes both ways due to neighboring reflections), and for WAF reasons my curtains are the thin kind.


This simply hasn’t been my experience with a 43” OLED TV, which I use as both a TV and my main monitor when working. It’s facing away from the window, which I’m sure helps, but I’ve never had to draw the curtains to see what I’m doing.

Maybe if your TV is facing a window that's more of a problem, but if so I'd suggest not doing that. You're never going to get a good experience from a TV facing a light source, because if nothing else it will reflect off the screen.


A brighter TV is less expensive than rearchitecting your living room.

I don't have any walls that don't face huge windows in my living room, so I had to splurge on a Samsung Q90. It does well for the most part. On sunny days I still wish for even more brightness.


You seem to have great experience in this field. What do you think about MLA OLED, and which of these various technologies has the best chance of getting price-competitive with normal direct-lit LEDs at large screen sizes?


OLEDs really struggle in daylight, but are certainly fine in dimly lit rooms.


Our 77" LG C2 looks fine, even when we have all our windows open.


I have a recent QLED; my parents-in-law have a recent OLED (LG). They seem about the same, though I'd say the QLED is easier to watch during daylight with sun in the room.


Having a Neo-QLED myself, the brightness is definitely a plus - still a bit dimmer than I'd like in daylight, but certainly better than having, you know, half the brightness with a regular OLED...

The main downside I can note is that the dimming zones are way too large - especially noticeable with local dimming set to High and credits scrolling. Not noticeable with normal content though. A mini-LED backlight would have made it indistinguishable from most OLEDs in anything but the darkest of cinema rooms...


What generation do you have? 2022 or 2023? Dimming zones get smaller each generation. Neo-QLED branding started in 2022?


2021, QN95A75QE, an impulse buy while they were massively discounted some time before the 2022 release. Mine has ~700-ish dimming zones IIRC.

Moving bright content on a dark background reveals the dimming-zone resolution as the zones chase the content - credits are the main thing that comes to mind, where the text brightness fluctuates as it transitions between zones. A circular loading indicator can also trip this, but the TV recognizes it after a rotation and then evenly illuminates the path.
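Rough arithmetic on why ~700 zones is visible on credits, assuming a 4K panel and a roughly 16:9 zone grid (the grid layout is my guess, not a published spec):

    import math

    panel_w, panel_h, zones = 3840, 2160, 700
    cols = math.isqrt(zones * 16 // 9)   # ~35 columns at a 16:9-ish aspect
    rows = zones // cols                 # ~20 rows
    print(f"grid: {cols} x {rows}")
    print(f"zone size: {panel_w // cols} x {panel_h // rows} px")
    # ~109 x 108 px per zone - far coarser than a line of credits text,
    # so the backlight visibly "chases" bright text across zones.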

Turning down the dimming from max would likely help, but some may be surprised to hear that credits are not my main content priority.

The brightness is great for my bright living room though, and worlds apart from the OLED sets I've seen in similar settings.

As an aside, I'm super happy I got the 2021 - a friend got the S95B, and the newer Tizen UI is way more intrusive and ad-filled.


Thanks!


I think all Neo-QLED TVs are mini-LED - something like 1000 dimming zones?


IIRC, 500-750-ish. Good mini-LEDs are in the several thousands.


You can also use a ton of small LEDs for the backlight, like the display Apple calls the 'Liquid Retina XDR', and there's probably a world of controller software to improve on for years.


Which is what the LCD-TV industry did for years. It's no longer economical, except maybe in specific high-margin industries (like medical displays).

But even in those areas I'm confident that it's existing tech; I doubt there's a lot of investment going on.

WOLED panels are on a trajectory from premium to high tier, with production volume increasing each year, and RGB-OLED is increasing in panel size each year as well. The window for investing in R&D for a large-volume LCD panel which outperforms OLED in accuracy at a lower price has already closed.


The generic term for this is mini-LED, which is not to be confused with micro-LED. Apple is expected to start using MicroLED displays within a couple of years.


I find it quite astonishing their newer monitors run iOS.


I believe the point is that the main research is now on OLED and micro-LED. Those are truly LCD-less: instead of having a backlight that is variably dimmed by an LCD panel, they have self-emissive pixels.

Micro-LED truly aims to have just a single LED per (sub)pixel.
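Which is also why it's hard to manufacture; at 4K that means placing tens of millions of working emitters (simple arithmetic, assuming an RGB layout):

    width, height, subpixels = 3840, 2160, 3  # 4K panel, RGB subpixels
    print(f"{width * height * subpixels:,} individual LEDs")  # 24,883,200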


Micro-led seems to me to be one of those technologies that's always just around the corner.

Perhaps I'm a pessimist, but since the launch of the first mainstream OLED TVs several years ago, I've been hearing "OLED is just a stopgap, MicroLED is where it's at, and that's not far off now".

Ten years later... OLED is still where it's at, or LCD for a lot of people. I know these things take time, but the anticipation of this tech seems to have lasted at least a decade so far.


Maybe you're not old enough to remember, but OLED was also touted as the future for many years before it actually became available - I remember hearing about OLED when flat-panel LCD monitors were starting to become widespread in the early-to-mid 2000s, but it still took some years for OLED to appear in phones and then, much later, in TVs. Actually, the first device with an OLED display that I had was a cheapo Chinese USB-stick-with-integrated-MP3-player, which I bought more out of curiosity for the OLED display and never actually used because I already had a phone that could play MP3s (this one: https://en.wikipedia.org/wiki/Sony_Ericsson_K800i).


My dad had a textbook that said something like 'when current is applied, liquid crystals glow; although this effect is interesting, it is completely useless.' The first LCD watches came out as he was finishing school.


That's not how LCD displays work, though.


The book was thrown away a few decades ago. I remember seeing it as a kid, but floods got it since. I'm not sure what the exact quote was.


They do if you power them wrong enough


The first commercial OLEDs launched in 2007, I think; I had an OLED phone in 2012 (AMOLED?), and then the TVs got 'big' commercially in, what, 2017ish?

My point isn't necessarily that MicroLED is endlessly delayed, so much as that people have been endlessly talking about it for years, and it's still years away from even high-end consumer models. I don't remember the same for OLED; in fact I remember people getting very hyped over LCD vs Plasma, and OLED not really getting a mention as a TV tech, even when it was in phones. Perhaps it's just my memory.


> Micro-led seems to me to be one of those technologies that's always just around the corner.

You can actually buy a MicroLED TV today, but they're huge and expensive:

https://www.samsung.com/us/televisions-home-theater/tvs/micr...


How funny that they're selling a $150,000 display and still feel the need to inflate its size from 109.2" to 110".


Once it's powered on for 10 minutes, the heat causes the materials to expand.

Soon thereafter, the thrust from the cooling fans causes the TV to leave forever, but for those first few minutes it is a viewing experience beyond any other. Just last week, one was spotted cruising past the freeway at low altitude, out to sea. Some say it is still going.

Mind you, the display technology doesn't produce much excess heat. That would be the smart-TV CPU as it loads up a barrage of ads with which to torment watchers. Skip ad 14:56s


They are so massive that the difference is caused by the curvature of spacetime.


Huh? MicroLED is pretty new. I think the first research demonstration of a microLED display was in 2009 or 2010. You can buy microLED TVs right now. A bit more than 10 years to go from research to product is totally normal.


It’s a bit of a stretch to say they are available now. Yes, there are very low volumes of a few very expensive models being sold, but that’s not really what I’m talking about.


I thought MicroLED was about per pixel backlight for good old LCD.

Gonna hit wiki.

Edit: it's not LCD, hehe. I had memorized some incorrect info.


That's "mini-LED". MicroLED has small enough LEDs to use the LEDs directly as the pixels.


You seem to be right. I don't have them too confused, though. There have been MiniLED displays on the market for years, and they're all LCD.


Manufacturers blur this distinction on purpose to sell cheaper items


I believe mini-LED/zone dimming was a development expected to be followed by MicroLED, yet longevity and yield continued to be problems for Micro, which left the "Mini" branding hanging in the air. IIRC.


Yeah, LCD tech isn't going anywhere; unless OLED is or becomes cheaper (long term, it probably will), they'll be churning out LED panels for decades to come yet.


There are companies still selling EL display panels.


Where are the old plasma displays? I suspect those, at least, are not being made any more?


Panasonic ended their Viera line about 10 years ago, unfortunately. AFAIK they were the last and/or the best.

Mine still works great, although it still draws ~200 watts of power.



