this post was submitted on 12 Sep 2023
84 points (95.7% liked)


Researchers are experimenting with robots to help speed up the restoration of coral reefs.
...
Their researchers have been training an artificial intelligence to control collaborative robots (cobots), which work closely alongside humans.

"Some of these processes in coral propagation are just repetitive pick-and-place tasks, and they're ideally suited to robotic automation," says Ms Foster.

A robotic arm can graft or glue coral fragments to the seed plugs. Another places them in the base, using vision systems to decide how to grab each fragment.

"Every piece of coral is different, even within the same species, so the robots need to recognise coral fragments and how to handle them," says Nic Carey, senior principal research scientist at Autodesk.

"So far, they're very good at handling the variability in coral shapes."

[–] [email protected] 4 points 1 year ago (1 children)

Oh, man, I just realized the machine learning verbiage like saying “trained” instead of “programmed” is being embraced by tech companies in part to avoid responsibility for human choices.

Not that the coral reef robot scientists are doing that. They’re obviously using machine learning. I just found it odd to read they “trained” a non-living thing.

[–] [email protected] 6 points 1 year ago

There's a difference. When you program a machine it follows rigid logic. It's predictable.

When you train a machine it does not. It can make its own inferences and operate outside of strict parameters. It can also make bad inferences, which is what we call AI hallucinations.
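The distinction can be sketched in a few lines of Python. The grip-selection task and every name below are hypothetical, invented only to contrast a hand-written rule (fully predictable) with a model whose rule is fitted from examples:

```python
# Hypothetical task: decide whether a coral fragment needs a
# "gentle" or a "firm" grip from two measurements, size and fragility.

def programmed_grip(size, fragility):
    """Programmed: an explicit, hand-written rule. Fully predictable."""
    return "gentle" if fragility > 0.5 else "firm"

def train_grip_model(examples, epochs=100, lr=0.1):
    """Trained: a tiny perceptron whose weights are fitted to labeled
    examples. Its decision rule is inferred from data, not written by hand."""
    w = [0.0, 0.0]  # one weight per input feature
    b = 0.0         # bias term
    for _ in range(epochs):
        for (size, fragility), label in examples:
            pred = 1 if size * w[0] + fragility * w[1] + b > 0 else 0
            err = label - pred  # perceptron update: nudge weights on mistakes
            w[0] += lr * err * size
            w[1] += lr * err * fragility
            b += lr * err
    def model(size, fragility):
        return "gentle" if size * w[0] + fragility * w[1] + b > 0 else "firm"
    return model

# Labeled examples: ((size, fragility), 1 = needs gentle grip, 0 = firm)
examples = [
    ((0.2, 0.9), 1), ((0.3, 0.8), 1),
    ((0.9, 0.1), 0), ((0.8, 0.2), 0),
]
model = train_grip_model(examples)
```

The programmed rule behaves identically on every input forever; the trained model's behavior depends entirely on what it saw during training, which is exactly why it can generalize to fragments it never saw, and also why it can infer badly when the data misleads it.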

I don't know that you're wrong about avoiding responsibility, but "programmed" is not the right word for what's basically a genie in a bottle. And we still hold accountable the member of the tribe that lets the genie out, or should, anyway.