assassin_aragorn

joined 1 year ago
[–] [email protected] 4 points 1 week ago

Well put.

I'm sure plenty of people would be happy to be a personal assistant for searching, summarizing, and compiling information, as long as they were adequately paid for it.

[–] [email protected] 12 points 1 week ago (3 children)

There is an easy answer to this, but AI companies aren't pursuing it because it would make them less money, even though it would be the entirely ethical route.

Make all LLM models free to use, regardless of sophistication, and collaborate on sharing the algorithms. They don't have to be open to everyone, but the companies could review requests and grant them on merit without charging for it.

So how do they make money? How does Google search make money? Advertisements. If you have a good, free product, advertisement space will follow. If it's impossible to make an AI product while also properly compensating people for training material, then don't make it a sold product. Use copyrighted training material freely to offer a free product with no premiums.

[–] [email protected] 2 points 1 week ago (2 children)

Copyright is a lesser evil compared to taking human labor and creativity for free to sell a product.

[–] [email protected] 1 points 1 week ago

Psi is used a lot in engineering. But honestly, pressure units are a bit of a mess. The metric unit is the Pascal, which is fundamentally defined as a Newton per square meter – unsurprisingly, that is an incredibly small quantity of pressure. Standard atmospheric pressure is roughly 101,325 Pascals. You'll typically see pressure written in kPa, MPa, or bars (1E5 Pascals) within a metric framework. For perspective, an atmosphere is 14.7 psi (lbs per square inch).

And personally, I think all of these are pretty silly when we could be using 1 atm instead, which is literally defined as standard atmospheric pressure. It’s a much easier way to visualize and intuitively grasp pressures.

BTU is another fun one. It's the energy needed to raise 1 lb of water by 1 degF. A calorie is the energy to raise 1 g of water by 1 degC. Both are very pragmatic definitions and have a degree of intuition. Then there's the metric unit, the Joule, which suffers from the same issue as the Pascal: it's the work done by a 1 Newton force pushing an object 1 meter. Once again, pretty small.
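To put rough numbers side by side, here's a quick Python sketch (the constants are the standard conversion factors, written from memory, so double-check before relying on them):

```python
# Quick sanity check on how the pressure and energy units relate.
ATM_PA = 101_325.0   # 1 standard atmosphere in pascals (exact by definition)
PSI_PA = 6_894.757   # 1 psi in pascals
BAR_PA = 100_000.0   # 1 bar in pascals
BTU_J  = 1_055.06    # 1 BTU in joules (approximate)
CAL_J  = 4.184       # 1 thermochemical calorie in joules

print(f"1 atm = {ATM_PA/1000:.3f} kPa = {ATM_PA/PSI_PA:.2f} psi = {ATM_PA/BAR_PA:.4f} bar")
print(f"1 BTU = {BTU_J:.0f} J = {BTU_J/CAL_J:.0f} cal")
```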

[–] [email protected] 2 points 2 weeks ago (1 children)

In some cases I'd argue, as an engineer, that having no calculator makes students better at advanced math and problem solving. It forces you to work with the variables and understand how to do the derivation. You learn a lot more manipulating the ideal gas formula as variables and then plugging in numbers at the end, versus plugging numbers in from the start. You start to implicitly understand the direct and inverse relationships between the variables.

Plus, learning to directly use variables is very helpful for coding. And it makes problem solving much more of a focus. I once didn't have enough time left in an exam to come to a final numerical answer, so I instead wrote out exactly what steps I would take to get the answer -- which included doing some graphical solutions on a graphing calculator. I wrote how to use all the results, and I ended up with full credit for the question.

To me, that is the ultimate goal of math and problem solving education. The student should be able to describe how to solve the problem even without the tools to find the exact answer.
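As a small illustration of the "variables first, numbers last" approach, here's a sketch using sympy on the ideal gas law (the values plugged in at the end are just an example, not from any particular problem):

```python
# "Variables first, numbers last" with the ideal gas law PV = nRT.
import sympy as sp

P, V, n, R, T = sp.symbols("P V n R T", positive=True)
ideal_gas = sp.Eq(P * V, n * R * T)

# Rearrange symbolically before touching any numbers.
T_expr = sp.solve(ideal_gas, T)[0]   # -> P*V/(n*R)
print(T_expr)

# Only at the end, substitute values: 1 atm, 22.4 L, 1 mol,
# R = 0.08206 L*atm/(mol*K) -- gives roughly 273 K.
print(T_expr.subs({P: 1.0, V: 22.4, n: 1.0, R: 0.08206}))
```

Working with the symbolic form first is what makes the direct and inverse relationships obvious: you can see immediately that T scales with P and V and inversely with n, before any arithmetic happens.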

[–] [email protected] 6 points 2 weeks ago

That's a slippery slope fallacy. We can compensate the person with direct ownership without going through a chain of causality. We already do this when we buy goods and services.

I think the key thing in what you're saying about AI is "fully open source... locally execute it on their own hardware". Because if that's the case, I actually don't have any issues with how it uses IP or copyright. If it's an open source and free to use model without any strings attached, I'm all for it using copyrighted material and ignoring IP restrictions.

My issue is with how OpenAI and other companies do it. If you're going to sell a trained proprietary model, you don't get to ignore copyright. That model only exists because it used the labor and creativity of other people -- if the model is going to be sold, the people whose efforts went into it should get adequately compensated.

In the end, what will generative AI be -- a free, open source tool, or a paid corporate product? That determines how copyrighted training material should be treated. Free and open source, it's like a library. It's a boon to the public. But paid and corporate, it's just making undeserved money.

Funny enough, I think when we're aligned on the nature and monetization of the AI model, we're in agreement on copyright. Taking a picture of my turnips for yourself, or to create a larger creative project you sell? Sure. Taking a picture of my turnips to use in a corporation to churn out a product and charge for it? Give me my damn share.

[–] [email protected] 0 points 2 weeks ago (1 children)

Google doesn't sell the search engine as a product.

[–] [email protected] 3 points 2 weeks ago

Not to mention, a lot of museums have no photography rules.

[–] [email protected] 3 points 2 weeks ago

AI is the capitalist dream. Exploit the labor and creativity of others without paying them a cent.

[–] [email protected] 9 points 2 weeks ago (2 children)

They're someone else's turnips though, not yours. If you're going to make money selling pictures of them, don't you think the person who grew the turnips deserves a fair share of the proceeds?

Or from another perspective, if the person who grew them requests payment in return for you to take pictures of them, and you don't want to pay it -- why don't you go find other turnips? Or grow your own?

These LLMs are an end product of capitalism -- exploiting other people's labor and creativity without paying them so you can get rich.

[–] [email protected] 89 points 2 weeks ago

I can't make money without using OpenAI's paid products for free.

Checkmate motherfucker

[–] [email protected] 4 points 4 weeks ago (1 children)

Ah, this makes more sense. I thought I had heard about a recusal.

 

I'm rather curious to see how the EU's privacy laws are going to handle this.

(Original article is from Fortune, but Yahoo Finance doesn't have a paywall)

 

Which of the following sounds more reasonable?

  • I shouldn't have to pay for the content that I use to tune my LLM model and algorithm.

  • We shouldn't have to pay for the content we use to train and teach an AI.

By calling it AI, the corporations are able to advocate for a position that's blatantly pro corporate and anti writer/artist, and trick people into supporting it under the guise of a technological development.

 

There's just something fucking hilarious about laying off employees, mocking them, and being sued for improperly firing them -- and then whining that your competitor hired them and that they have access to Twitter information still.

I believe this fits well under the "fuck around and find out" doctrine.
