TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

For actually-good tech, you want our NotAwfulTech community


archive link

https://archive.ph/n3Ffq

Judge Mehta’s Google decision is likely to be appealed. “Regardless of who wins or loses, this case probably has a date with the Supreme Court,” Mr. Kovacic said.

ah well


as I was reading through this one, the quotes I wanted to pull kept growing in size until it was just the whole article, so fuck it, this one’s pretty damning

here’s a thin sample of what you can expect, but it gets much worse from here:

Internal conversations at Nvidia viewed by 404 Media show when employees working on the project raised questions about potential legal issues surrounding the use of datasets compiled by academics for research purposes and YouTube videos, managers told them they had clearance to use that content from the highest levels of the company.

A former Nvidia employee, whom 404 Media granted anonymity to speak about internal Nvidia processes, said that employees were asked to scrape videos from Netflix, YouTube, and other sources to train an AI model for Nvidia’s Omniverse 3D world generator, self-driving car systems, and “digital human” products. The project, internally named Cosmos (but different from the company’s existing Cosmos deep learning product), has not yet been released to the public.


Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.


Feedback welcome! Here's the TL;DR list

  1. Listen more to more Black people
  2. Post less – and think before you post
  3. Call in, call out, and/or report anti-Blackness when you see it
  4. Support Black people and Black-led instances and projects

Other suggestions?


The internet has run out of training data - cadence's weblog (personal blog) https://cadence.moe/blog/2024-05-28-the-internet-has-run-out-of-training-data #AI #AIHype @techtakes @fuck_ai



"subreddit rules. Speak pro-ai thoughts freely."

DefendingAIArt is a subreddit run by mod "Trippy-Worlds," who also runs the debate sister subreddit AIWars. Some poking around made clear that AIWars is perfectly fine with having overt Nazis around, for example a guy with heil hitler in his name who accuses others of lying because they are "spiritually jewish." So we're off to a great start.

the first thing that drew my eye was this post from a would-be employer:

My hobby is making games. Every artist I have spoken to regarding my current project has rejected currency in exchange for referencing AI-made images.

not really clear what the title means, but this person seems to have had a string of encounters with the most based artists of all time.

Has anyone experienced this? They see AI work and lose their mind, some even have the nads to expect to get a pay multiplier to “compensate” for the “theft” like my surname is fucking Altman. Like, bro, I can barely afford your highly-accomplished and talented ass and would be doing it for myself if I had your skillset, yet you reject my money with prejudice because I pushed my shitty programmer art a bit further with a piece of software which I can’t even use to a fraction of its full potential? That’s a greeeeeeeeaaat way to convince me to keep your artstation username out of my prompts to public models, even if I believe that particular spirit of behavior should be illegal

also claims to have been called "racial and gender slurs" for using ai art and that he was "kicked out of 20 groups" and some other things. idk what to tell this guy, it legitimately does suck that wealthy people have the money to pay for lots of art and the rest of us don't

Could we Ban the “No AI” Symbol? Someone proposed an idea to me: why not gather evidence and present it to the authorities who prohibited the display of the Swastika and other hate symbols? I was impressed by this suggestion. After researching, I found out that there are organizations that can categorize it as illegal if we can show evidence of the harm it has caused. I believe we can unite people, including artists who have suffered due to false accusations by anti-AI rioters, to support this cause. If we all sign a petition, we can ban the symbol, which would prevent its misuse on platforms like DeviantArt and stop the spread of misinformation. Would you support this initiative? Would you sign to end ignorance and compel them to advocate for fair regulations for AI, ensuring that nobody has to encounter this symbol and that those who use it for malicious purposes find no refuge? Or is it just not possible? Let’s discuss.

I really enjoyed browsing around this subreddit, and a big part of that was seeing how much the stigma around AI gets to people who want to use it. pouring contempt on this stuff is good for the world

the above guy would like to know what combination of buttons to press to counter the "that just sounds like stealing from artists" attack. a commenter leaps in to help and immediately impales himself:

“just block and move on”

“these are my real life friends”

“oh...”

you hate to see it. another commenter points out that well ... maybe these people just aren't your friends

'antis will always just stab you in the back'

to close out, an example of fearmongering:

So I made a post on a sub with a rule against AI art and the Auto-mod does this... I'm assuming it's fearmongering right? automod: Your comments and posts are being sold by Reddit to Google to train AI. You cannot opt out.
