this post was submitted on 23 Feb 2024
347 points (96.8% liked)
College student put on academic probation for using Grammarly: ‘AI violation’

Marley Stevens, a junior at the University of North Georgia, says she was wrongly accused of cheating.

[–] [email protected] -3 points 8 months ago (1 children)

A calculator (in most cases) can't just do a problem for you, and when it can, those calculators are banned (the reason you can't use a TI-84 on gen chem exams in college, or a TI-89 in a calc 1 class). Such a tool means you really don't have to understand anything to get the answer. To me, your comment reads as though getting the answer to a problem by typing it into Wolfram Alpha is the same as working through the problem on your own, as long as you understand how WA got there. I wholeheartedly disagree: somebody who uses Wolfram Alpha to get all of their answers doesn't actually know jack shit about math, kinda like how anybody using generative AI for writing doesn't have to know jack shit about the subject and can just give a semi-specific prompt based on a small amount of prior research.

It's very easy for me to type into a GPT bot "write a paper on the social and political factors that led to the Haitian revolution." It's a completely different experience to sift through documents, actually learn what happened, and then write about that. I'm fairly confident I could "write" a solid paper using AI without doing almost any research, as long as it's a topic I know literally anything about. E.g.: I don't know very much about the physics of cars, but I could definitely get generative AI to give you a decent paper on how and why increases in engine size can lead to an increase in efficiency, just by knowing that fact to be true and proofreading the mess the AI throws together for me. The fact that you consider these tools the same as a calculator (which, I might add, we still often restrict the use of, e.g. no Wolfram Alpha on your multivariable final) is astounding to me, tbh.

[–] [email protected] 3 points 8 months ago

My point is that the tool is out there and you can't definitively prove that someone used AI. So we'd better figure out how to use it and test with the assumption that someone's using it. ChatGPT is a fucking inaccurate mess. If you as a professor can't keep up with that, you're not doing your job, and using these AI detection tools is stupid and doesn't fix the problem. So what do we do now?