this post was submitted on 22 Apr 2024
337 points (98.8% liked)
Technology
That's fucking insane. HDR 1400 displays are at least 1,400 nits. 614,000 nits seems like you'd be staring at the fucking sun.
Tbh the burn-in issue is the reason I don't like OLEDs as computer monitors. I know phones and TVs don't tend to have major burn-in problems, but the fact that it exists at all sucks. TVs show a variable enough image that long-term use isn't an issue imo, and even the most thrifty person will probably end up replacing their phone every 4 to 6 years. Computer monitors, though, I'm used to keeping long term; my last one lasted about 10 years before it died.
Aw, that's disappointing. At the same time, though, even if commercial units only managed 10% of the 614,000 nits, that's still about 61,400 nits, so they'd have to shed a huge amount of brightness just to dim down to current display levels. Hopefully running that far below the maximum buys back some lifetime.
Yeah, I hope so too.
So the formula for nits to lumens (for a flat Lambertian emitter) is below: lumens = nits × area in m² × π.
Bruh...
Direct sunlight is about 127,000 lumens per m². This TV is at most 2 m². It'd certainly be the last thing you ever saw.
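Plugging the numbers into that conversion (a rough back-of-the-envelope, assuming an ideal flat Lambertian panel and the 2 m² screen area above):

$$
\Phi = \pi L A = \pi \times 614{,}000\ \mathrm{cd/m^2} \times 2\ \mathrm{m^2} \approx 3.9 \times 10^{6}\ \mathrm{lm}
$$

That works out to roughly 1.9 million lumens per square metre, about 15 times the ~127,000 lm/m² of direct sunlight.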
I checked the linked paper, and sadly this brightness reduced the cell lifetime from over 5,000 hours at 100 nits to just around 5 hours.
So unless they find some magical, even better chemistry, a TV as bright as the sun isn't going to happen.