this post was submitted on 01 Nov 2023
145 points (88.4% liked)
Technology
you are viewing a single comment's thread
Explaining what happens in a neural net is trivial. All they do is approximate (generally) nonlinear functions with a long series of multiplications and some rectification operations.
That isn't the hard part; you can track all of the math at each step.
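For instance, the forward pass of a tiny two-layer net is nothing but multiply-accumulates and a rectification. Here's a minimal sketch in plain Python; the weights are made up purely for illustration, and every intermediate value can be printed and checked by hand:

```python
def relu(v):
    # rectification: clip negative activations to zero
    return [max(0.0, x) for x in v]

def matvec(W, x, b):
    # one layer = multiply-accumulate each row of W with x, then add bias
    return [sum(w * xi for w, xi in zip(row, x)) + bi
            for row, bi in zip(W, b)]

# hypothetical weights for a 2-input, 3-hidden, 1-output net
W1 = [[0.5, -1.0], [1.5, 0.2], [-0.3, 0.8]]
b1 = [0.1, -0.2, 0.0]
W2 = [[1.0, -0.5, 0.7]]
b2 = [0.05]

x = [2.0, -1.0]            # input
h = relu(matvec(W1, x, b1))  # hidden layer: multiply, add, rectify
y = matvec(W2, h, b2)        # output layer: multiply and add again

print(h, y)  # every step of the arithmetic is right there to inspect
```

Nothing in there is mysterious arithmetic-wise; the interpretability problem is entirely about what role each of those numbers plays, not how they're computed.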
The hard part is stating a simple explanation for the semantic meaning of each operation.
When a human solves a problem, we like to think that it occurs in discrete steps with simple goals: "First I will draw a diagram and put in the known information, then I will write the governing equations, then simplify them for the physics of the problem", and so on.
Neural nets don't appear to solve problems that way; each atomic operation does not have that semantic meaning. That is the root of all the reporting about how they are such 'black boxes' and how researchers 'don't understand' how they work.
I wonder how our brain even comes to formulate these steps in a way we can comprehend; the sheer number of neurons and zones firing on all cylinders seems tiring to imagine.
Yeah, but most people don't know this and have never looked. It seems way more complex to the layman than it is, because instinctively we assume that anything that accomplishes great feats must be incredibly intricate.