this post was submitted on 10 May 2024

Technology


A research team at Stanford is developing a new AI-assisted holographic imaging technology it claims is thinner, lighter, and higher quality than anything its researchers have seen.

The Stanford tech is currently just a prototype.

top 16 comments
[–] [email protected] 39 points 6 months ago (2 children)

Any time a news headline asks a question, the answer is almost always "no"

[–] [email protected] 10 points 6 months ago* (last edited 6 months ago) (1 children)

After reading the article, this might be an exception.

[–] [email protected] 6 points 6 months ago

Agreed. The form factor is right. AR technology will only become viable for mass adoption when it can fit into the existing eyeglasses form factor.

[–] [email protected] 4 points 6 months ago

This is the best summary I could come up with:


But Stanford’s Computational Imaging Lab has an entire page with visual aid after visual aid that suggests it could be onto something special: a thinner stack of holographic components that could nearly fit into standard glasses frames, and be trained to project realistic, full-color, moving 3D images that appear at varying depths.

Like other AR eyeglasses, they use waveguides: components that guide light through the glasses and into the wearer's eyes.

But researchers say they’ve developed a unique “nanophotonic metasurface waveguide” that can “eliminate the need for bulky collimation optics,” and a “learned physical waveguide model” that uses AI algorithms to drastically improve image quality.
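To make the "learned physical model" idea concrete: the general approach is to calibrate a model of the optics from measurements, then pre-distort the displayed image so the distortion cancels out. This toy numpy sketch is not Stanford's method; it assumes the optics behave like an unknown blur kernel and "learns" it from a single calibration pair in the Fourier domain:

```python
import numpy as np

rng = np.random.default_rng(0)

# True (unknown) optical distortion: a small blur kernel, standing in
# for whatever the waveguide does to the light.
true_kernel = np.array([[0.05, 0.1, 0.05],
                        [0.1,  0.4, 0.1],
                        [0.05, 0.1, 0.05]])

def propagate(img, kernel):
    """Simulate light passing through the optics (circular convolution)."""
    K = np.fft.fft2(kernel, s=img.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * K))

# Calibration step: display a known test pattern, observe distorted output.
test_pattern = rng.random((32, 32))
observed = propagate(test_pattern, true_kernel)

# "Learn" the physical model: estimate the optical transfer function
# from the (input, output) calibration pair.
H_est = np.fft.fft2(observed) / np.fft.fft2(test_pattern)

# Pre-compensate a target image so it looks correct after the optics.
target = rng.random((32, 32))
compensated = np.real(np.fft.ifft2(np.fft.fft2(target) / H_est))
result = propagate(compensated, true_kernel)

print(np.max(np.abs(result - target)))  # tiny: distortion pre-cancelled
```

The real system learns a far richer model (a neural network fit to camera-in-the-loop measurements of a nanophotonic waveguide, driving a phase-only hologram), but the control loop is the same shape: measure, fit a forward model, invert it when rendering.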

Although the Stanford tech is currently just a prototype, with working models that appear to be attached to a bench and 3D-printed frames, the researchers are looking to disrupt the current spatial computing market that also includes bulky passthrough mixed reality headsets like Apple’s Vision Pro, Meta’s Quest 3, and others.

Postdoctoral researcher Gun-Yeal Lee, who helped write the paper published in Nature, says there’s no other AR system that compares both in capability and compactness.

Companies like Meta have spent billions buying and building AR glasses technology, in the hopes of eventually producing a “holy grail” product the size and shape of normal glasses.


The original article contains 319 words, the summary contains 212 words. Saved 34%. I'm a bot and I'm open source!