this post was submitted on 12 Jun 2024

Memes

[–] [email protected] 35 points 4 months ago (2 children)

Well, most of the requests are handled on device with their own models. If a request is going to ChatGPT for something, it will ask for permission first and then use ChatGPT.

So Apple Intelligence isn’t all ChatGPT. I think this deserves a mention, as a lot of the processing will be on device.

Also, I believe part of the deal is ChatGPT can save nothing and Apple are anonymising the requests too.

[–] [email protected] 1 points 4 months ago (1 children)

ChatGPT won't save anything? Doubtful.

[–] [email protected] 7 points 4 months ago (2 children)

Brother I do not care about your doubts.

I want hard facts here.

Do you think that if you enter into a contract with a company like Apple they’ll just be like, aww shit they weren’t supposed to do that. Anyway let’s carry on.

No. This would open OpenAI up to potential lawsuits.

Even if they did save stuff, it gets anonymised by Apple before it’s even sent to ChatGPT’s servers.

[–] [email protected] 3 points 4 months ago (1 children)

thing is apple doesn't give a shit about your privacy

[–] [email protected] 2 points 4 months ago (1 children)

Finally, a reasonable comment.

I would concede that they want to keep it all for themselves, although a lot of anonymising of data is done.

My point is Apple are not sharing it with every third party on the Earth.

If you’re using Android then you don’t really have a leg to stand on, unless you’re using GrapheneOS and you’ve sandboxed Google services.

I would rather use a device that maybe keeps it all for themselves, rather than one where it is shared with every man and his dog.

Plenty of things you can shit on Apple for, but this isn’t one of them I’m afraid.

[–] [email protected] 1 points 4 months ago

careful, that's a hardcore tankie troll you replied to.

[–] [email protected] 3 points 4 months ago (2 children)

The hard fact is OpenAI is already exposing itself to lawsuits by training on copyrighted material.

So the question here should be "what makes them trustworthy this time?"

[–] [email protected] 3 points 4 months ago* (last edited 4 months ago) (1 children)

Because Apple’s lawyers will go ham.

I don’t want my comments here to be received as shilling for Apple; it’s more that I want them to be based on actual information that is provided, not opinion pieces.

The fact is, if they were caught saving data then Apple would just end the contract. Is it worth it for them to lose out on that cash for the sake of saving it, when they can just use all the other sources where they are allowed to do that?

Anyway, I don’t care what anonymised data they may or may not save. It won’t be tied to me.

Edit: Do you have some information on these existing lawsuits and the contracts they broke?

[–] [email protected] 3 points 4 months ago (1 children)

Because Apple’s lawyers will go ham.

Google pays Apple $20 billion a year to keep their search on Apple devices. The subtext of "search" is Google pays Apple for your search data.

Apple has sold your data for the right price to Google, so there should be no expectation that they won't do the same with other companies.

[–] [email protected] 2 points 4 months ago (1 children)

They sell Google the right to stay the default search engine; that’s not the same as selling data.

Again, point me to some proof of them actually selling data, as to my understanding they pay for the default engine to be Google.

[–] [email protected] 1 points 4 months ago (1 children)

That Google is the search engine means Google gets that valuable search data. So they pay to be the default search engine to get your data.

[–] [email protected] 1 points 4 months ago

Sure, but let’s be honest: even if it wasn’t the default, the majority of people would still be using Google anyway.

I prefer Arc Search myself.

[–] [email protected] 3 points 4 months ago

There's kind of a difference between "we scraped the internet and decided to use copyrighted content anyways because we decided to interpret copyright law as not being applicable to the content we generate using copyrighted content" (omegalul) and "we explicitly agreed to a legally-binding contract with Apple stating we won't do that".

[–] [email protected] 0 points 4 months ago (2 children)

Well, most of the requests are handled on device

Doubt.

Voice recognition, image recognition, yes. But actual questions will go to Apple servers.

[–] [email protected] 13 points 4 months ago* (last edited 4 months ago) (1 children)

Doubt.

Is this conjecture, or can you provide some further reading, in the interest of not spreading misinformation?

Edit: I decided to read the info from Apple.

With Private Cloud Compute, Apple sets a new standard for privacy in AI, with the ability to flex and scale computational capacity between on-device processing, and larger, server-based models that run on dedicated Apple silicon servers. When requests are routed to Private Cloud Compute, data is not stored or made accessible to Apple and is only used to fulfill the user’s requests, and independent experts can verify this privacy.

Additionally, access to ChatGPT is integrated into Siri and systemwide Writing Tools across Apple’s platforms, allowing users to access its expertise — as well as its image- and document-understanding capabilities — without needing to jump between tools.

Say what you will about Apple, but privacy isn’t a concern for me here. Perhaps some independent experts will verify this in time.

[–] [email protected] 3 points 4 months ago (1 children)

Which is exactly what I said. It's not local.

That they are keeping the data you send private is irrelevant to the OP claim that the AI model answering questions is local.

[–] [email protected] 3 points 4 months ago

OP here being me.

Well, most of the requests are handled on device with their own models. If it’s going to ChatGPT for something it will ask for permission and then use ChatGPT.

I feel I was pretty explicit in explaining how some requests will go to ChatGPT.

[–] [email protected] 3 points 4 months ago

Apple has published papers on small LLM models and multimodal models already. I would be surprised if they aren't using them for on-device processing.