How do you write this article and not once reference I/O Psychology or the literature that examines how well various tests predict job performance? (e.g. Schmidt and Hunter, 1998)
I swear this isn't witchcraft. You just analyze the job, determine the knowledge and skills that are important, required at entry, and can't be picked up in a 15-minute orientation, and then hire based on those things. It takes a few hours' worth of meetings. I've done it dozens of times.
But really what all that boils down to is get someone knowledgeable about the role and have them write any questions and design the exercises. Don't let some dingleberry MBA ask people how to move Mt. Fuji or whatever dumb trendy thing they're teaching in business school these days.
That's a 74-page article. Do you care to summarize it or point to a specific section?
Thanks for the reference. Interesting.
The cool thing about it is that the core of it is really just one page.
There's a page in there with a list of types of tests and their respective r values. An r value is a number between zero and one that indicates how well a given type of test predicts job performance, based on the gigantic meta-analysis the researchers ran. Zero means there's no relationship between the test and job performance, and one means the test predicts job performance perfectly.
Generally you want something better than .3 for high-stakes things like jobs. Education and experience sit at around .11 or so, which is pretty bad. By contrast, skills tests do really well; depending on the type they can go over .4. That's a pretty big benefit if you're hiring lots of people.
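If it helps to see what those numbers mean concretely, here's a quick Python sketch. The data is entirely simulated and the names are made up for illustration (these are not figures from the paper); the point is just that a validity coefficient is the correlation between something you measure at hiring time and how people actually perform later.

```python
# Sketch of what a validity coefficient (r value) is: the correlation
# between a predictor measured at hiring time and later job performance.
# All data below is simulated for illustration, not taken from the paper.
import numpy as np

rng = np.random.default_rng(42)
n = 500  # hypothetical number of hires

# Pretend this is each hire's eventual job performance.
performance = rng.normal(size=n)

# A skills test that tracks performance reasonably well (signal plus noise).
skills_test = performance + rng.normal(scale=2.0, size=n)

# Years of education/experience, only weakly related to performance.
experience = 0.11 * performance + rng.normal(size=n)

def validity(predictor, outcome):
    """Pearson r between a hiring predictor and the performance outcome."""
    return np.corrcoef(predictor, outcome)[0, 1]

print(f"skills test r ~ {validity(skills_test, performance):.2f}")  # lands around .4
print(f"experience  r ~ {validity(experience, performance):.2f}")   # lands around .1
```

A higher r just means the thing you screened on actually lines up with performance on the job, which is why the gap between ~.1 and ~.4 matters so much at scale.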
That said, it can be very hard to convince people that "just having a conversation with someone" isn't all that predictive at scale. Industry calls that an "unstructured interview," and those are terrible vectors for unconscious or conscious bias. "Hey, you went to the same school as me..." and now that person is viewed favorably.
Seriously, this stuff is WELL STUDIED, but for some reason the MBA lizards never care. It's maddening.
For anyone who's interested, there's a copy of the study here: https://home.ubalt.edu/tmitch/645/session%204/Schmidt%20&%20Oh%20validity%20and%20util%20100%20yrs%20of%20research%20Wk%20PPR%202016.pdf
A problem in any meritocratic system is accounting for personal bias. It's very hard for some people to watch someone do something differently and accept that they might be as good as or better than themselves.
THEY went to college, or THEY didn't, and all of the personal reasons behind that decision get projected onto everyone else.
THEY didn't get where they are by being hired through a well-studied hiring mechanism, so why would they think it needs to change? The old system works for them.
Same issue with election reforms.