We're going to have generations of people unable to think analytically or creatively, and, just as bad, people entering fields that require real, detailed knowledge of the subject without actually having it. We're going to see a lot of fuck ups in engineering, medicine, etc. because of people faking it.
I teach at a community college. I see a lot of AI nonsense in my assignments.
So much so that I’m considering blue book exams for the fall.
For anyone who is also not from the US:
A blue book exam is a type of test administered at many post-secondary schools in the United States. Blue book exams typically include one or more essays or short-answer questions. Sometimes the instructor will provide students with a list of possible essay topics prior to the test itself and will then choose one or let the student choose from two or more topics that appear on the test.
EDIT, as an extra to solve the mystery:
Butler University in Indianapolis was the first to introduce exam blue books, which first appeared in the late 1920s.[1] They were given a blue color because Butler's school colors are blue and white; therefore they were named "blue books".
Importantly it is hand written, no computers.
Biggest issue is that kids’ handwriting often sucks. That’s not a new problem but it’s a problem with handwritten work.
Unfortunately, I think many kids could easily approach AI the same way older generations thought of math, calculators, and the infamous "you won't have a calculator with you everywhere." If I were a kid today and knew I didn't have to learn everything because I could just look it up instantly, I too would become quite lazy. Even if AI can't do it now, kids are smart enough to figure that AI in 10 years will. I'm not saying this is right, but I can see how many kids would end up there.
> know AI in 10 years will.
That's kind of the main problem: there is no indication that it will. I know one thing: the way current LLMs work, the chances that the problems of "lying" and "hallucinations" will ever be solved are slim to none. There could be some mechanism that works in tandem with the bullshit-generator machine to keep it in check, but it doesn't exist yet.
So most likely, either we collectively learn this fact and stop relying on this bullshit, which means a generation of kids essentially skipped a learning phase, or we don't learn it, and we get a society of mindless zombies fed lies and random bullshit on a second-to-second basis.
Both cases are bleak, but the second one is nightmarish.
This could be complete bullshit because I'm not an expert, but I sometimes think that without testing and nurturing people's critical thinking skills, we could end up with a future full of people who don't know how to construct a rational argument, assess the information they're given for accuracy and authenticity, or recognize when they're being deceived by malicious actors.
English writing assignments as simple as a book report require you to take different views and angles on something to understand it better, along with the nuances of the whole. But tell an LLM to write it for you, and you are not developing the part of your own mind where you learn to see the whole story above the noise of individual events, see things from others' perspectives and feelings, and understand alternate worldviews. These are critical for having empathy for others and understanding the world around you.
And that is just one small example I came up with.
We are already there. Just look at the state of society right now and observe the critical thinking and media literacy skills of the average person.
In the words of cyberpunk author William Gibson: "The future is already here – it's just not very evenly distributed."
The cynical view of America’s educational system—that it is merely a means by which privileged co-eds can make the right connections, build “social capital,” and get laid—is obviously on full display here.
Cynical? I call that realistic. That's what privileged co-eds have been using it for over the past 100 years.
If we decide to ban smartphones from schools, we should ban them from work too. I'm supposed to be writing an article right now, and instead I'm here. Then we should ban them from streets, so that people have to pay attention to where they are going and the things going on around them. At that point we'd have something like functioning human beings again instead of mindless zombies. We could still have terminals for plugging into the Machine, but our time with it should be regulated (like it already is with research clusters) so that we don't waste energy. There, the whole problem is solved, and all it takes is a global Butlerian Jihad.
I'm thinking the only way people will be able to do schoolwork without cheating now is going to be to make them sit in a monitored room and finish it there.
Honest question: how do we measure critical thinking and creativity in students?
If we're going to claim that education is being destroyed (and show we're better than our great^n grandparents complaining about the printing press), I think we should try to have actual data instead of these think-pieces and anecdata from teachers. Every other technology that the kids were using had think-pieces and anecdata.
As far as I can tell, the strongest data is on literacy and numeracy, and both of those are dropping in line with downward trends that predate AI, am I wrong? We're also still seeing kids who went through lockdown, which seems like a much more obvious "oh, that's a problem" than the AI stuff.