this post was submitted on 13 Dec 2023
591 points (98.8% liked)
Technology
you are viewing a single comment's thread
Google chooses codecs based on what it guesses your hardware can decode (iPhones get HEVC, Android gets VP9, etc.). They just didn’t put much thought into ARM-based home devices outside of a specific few like the Shield.
Why wouldn't my browser ask for the codecs it prefers, instead of the website trying to guess my computer's hardware?
Lots of hardware lies about its useful capabilities.
Can you run 4K? Of course. But can you run more than 4 frames a second?
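This "decodable, but only at a few frames a second" gap is exactly what the browser's Media Capabilities API (`navigator.mediaCapabilities.decodingInfo()`) is meant to surface: it resolves to `{ supported, smooth, powerEfficient }` rather than a bare yes/no. A minimal sketch of that distinction, using a hypothetical lookup table in place of the real browser API so it runs anywhere:

```javascript
// Sketch of the supported-vs-smooth distinction. In a browser this data
// would come from navigator.mediaCapabilities.decodingInfo(); the table
// below is a made-up stand-in for illustration.
const fakeDecodingInfo = {
  // 1080p VP9: the hardware decoder handles it comfortably.
  'vp9@1080p': { supported: true, smooth: true, powerEfficient: true },
  // 4K VP9: technically decodable in software, but only a few fps.
  'vp9@2160p': { supported: true, smooth: false, powerEfficient: false },
};

// A sensible client rejects anything supported but not smooth.
function usable(key) {
  const info = fakeDecodingInfo[key];
  return Boolean(info && info.supported && info.smooth);
}

console.log(usable('vp9@1080p')); // true
console.log(usable('vp9@2160p')); // false: "4 frames a second" territory
```

The point is that "can you run 4K?" and "should you be served 4K?" are different questions, and only the second one is useful for codec selection.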
The browser can lie all it wants; at the end of the day the user has the final word if they want to change things.
My by-now rather ancient RK3399 board can hardware-decode both at 4K 60 Hz. That has nothing to do with it being aarch64, but with Rockchip including a beast of a VPU (it was originally designed for set-top boxes).
How about, dunno, asking the browser what kind of media it would prefer?
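That negotiation already exists in principle: the server offers candidates in preference order and the client's probe (in a real browser, `MediaSource.isTypeSupported()` or the Media Capabilities API) picks the first one it can handle. A minimal sketch, with a hypothetical probe standing in for the browser so the example is self-contained:

```javascript
// Server-preferred candidate list: best compression first.
const CANDIDATES = [
  'video/mp4; codecs="av01.0.08M.08"',   // AV1
  'video/webm; codecs="vp9"',            // VP9
  'video/mp4; codecs="hvc1.1.6.L93.B0"', // HEVC
  'video/mp4; codecs="avc1.42E01E"',     // H.264 baseline fallback
];

// pickCodec: return the first candidate the client's probe accepts.
// In a browser, `isSupported` would be MediaSource.isTypeSupported.
function pickCodec(candidates, isSupported) {
  for (const type of candidates) {
    if (isSupported(type)) return type;
  }
  return null;
}

// Hypothetical probe for a device with only VP9 and H.264 hardware decode.
const probe = (type) => type.includes('vp9') || type.includes('avc1');

console.log(pickCodec(CANDIDATES, probe));
// prints the VP9 MIME type, since AV1 and HEVC are rejected first
```

The catch the thread identifies is that Google's side often skips this handshake and guesses from the device class instead.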