this post was submitted on 09 Apr 2024
181 points (96.4% liked)
Technology
I've actually started to recognize the pattern of whether something was written by AI.
It's hard to describe, but it's like an uncanny valley of quality, like if someone uses flowery SAT words to zhuzh up their paper's word count, but somehow even more.
It's like the writing will occasionally pause to comment on itself and the dramatic effect it's trying to achieve.
Yeah it's called bullshitting. It's the way lots of people are encouraged to write in high school when the goal is to see if the student can write a large amount of prose with minimal grammatical errors.
But once you get to post-secondary, your writing is expected to actually have content and to express that content fairly concisely. And AI falls on its face trying to do that.
Yeah, this is true! It likes to summarize things at the end in a stereotypical format
It's not a bad format either. AIs seem to enjoy the five paragraph assay format above all others, even for casual conversations.
Yes, it could be worse, but I'm stealing this and from now on calling the crappy AI essay format an "assay."
The LLM isn't really thinking; it's autocomplete, trained so that the average person would be fooled into thinking the text was produced by another human.
I'm not surprised it has flaws like that.
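To make the "autocomplete" point concrete, here's a toy sketch (entirely made-up counts, not a real LLM): at each step the model just picks the most likely next word given the one before it. Real LLMs do this over tokens with a huge neural network instead of a lookup table, but the generation loop is conceptually similar.

```python
# Toy "autocomplete": greedily pick the most frequent next word
# from hand-made bigram counts (hypothetical data for illustration).
bigram_counts = {
    "the": {"cat": 3, "dog": 1},
    "cat": {"sat": 2, "ran": 1},
    "sat": {"down": 4},
}

def complete(word, steps=3):
    out = [word]
    for _ in range(steps):
        options = bigram_counts.get(out[-1])
        if not options:
            break  # no known continuation, stop early
        # greedy decoding: always take the most frequent continuation
        out.append(max(options, key=options.get))
    return " ".join(out)

print(complete("the"))  # the cat sat down
```

Nothing in there "knows" what a cat is; it just emits whatever continuation was most common in its training data, which is exactly why the output sounds plausible while the flaws sneak through.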
BTW, here on Lemmy there are communities with AI pictures. Someone created a similar community, but with art created by humans.
While the AI results are very good, when you start comparing them with non-AI art, you see that even though each AI piece is unique, it's still a cookie-cutter result.
Yep, AI art is just getting through its irrational exuberance phase. It was (and sometimes is) impressive to create art in a style most of us can't draw or paint in. But AI models tend to produce very similar results unless very specifically prompted. AI art creators are also using a lot of other tools (like ControlNet, which allows you to replicate composition elements from another work) to break out of the "default AI model" look.
All of that points to an immediate future where AI art is seen as low-quality and instantly identifiable, except where AI art creators have spent a fair amount of time customizing and tailoring their image. Kind of like...real artists using pre-AI modern tools like Photoshop, filters, etc.
I have an issue with using AI to write my resume. I just want it to clean up my grammar and maybe rephrase a few things in a way I wouldn't, because I don't do the words real good. But I always end up with something that reads like I paid some influencer manager to write it. I write 90% of it myself, so it's all accurate and doesn't have AI errors. But it's just so obviously too good.
You are putting yourself down unnecessarily. You want your resume to talk you up. Whoever reads it is going to imagine that you embellished anyway. So if you just write it basically, they'll think you're unqualified or just don't understand how to write a resume.
“While the thing you entered in the prompt, it’s important to consult this other source on your prompt. In summary, your prompt.”
Writing papers is archaic and needs to go. College education needs to move with the times. Useful in doctorate work but everything below it can be skipped.
Learning to write is how a person begins to organize their thoughts, be persuasive, and evaluate conflicting sources.
It's maybe the most important thing someone can learn.
The trouble is that if it's skipped at lower levels doctorate students won't know how to do it anymore.
Are they going to know how to do it now if they're all just ChatGPT-ing it?
Clearly we need some alternative way to demonstrate mastery of subject matter. I've seen some folks suggesting we go back to pen-and-paper writing, but part of me wonders if the right approach is to lean in and start teaching students what they should be querying and how to check the output for correctness. Honestly, though, that still requires being able to check whether someone is handing in something they worked on themself at all, or whether they just had something spit out their work for them.
My mind goes to the oral defense: have students answer questions about what they've submitted, to see if they familiarized themselves with the subject matter before cooking up what they turned in. But that feels too unfair to students with stage anxiety, even if you limit these kinds of papers to only once a year per class or something. Maybe something more like an interview, with accommodation for socially panickable students?
I'm in software engineering. One would think English would be a useless class for my major, yet at work I still have to write a lot of documents: preparing new features, explaining existing ones, writing instructions for others, etc.
BTW: when using AI to write essays, you generally have a well-known subject that many people have written about similarly, and all of that was used to train it.
With technical writing, you are generally describing something brand new and very unique, so you won't be able to make AI write it for you.
When I come across a solid dev who is also a solid writer, it's like they have superpowers. Being able to write effectively is so important.
You can't have kids go through school never writing papers and then expect them to churn out long, well-written papers once they get to graduate school.