this post was submitted on 29 Mar 2024
1016 points (98.0% liked)

Curated Tumblr


For preserving the least toxic and most culturally relevant Tumblr heritage posts.

The best transcribed post each week will be pinned and receive a random bitmap of a trophy superimposed with the author's username and a personalized message. Here are some OCR tools to assist you in your endeavors:

Don't be mean. I promise to do my best to judge that fairly.

founded 2 years ago
[–] [email protected] 73 points 11 months ago (3 children)

Apple, afaik, used to do this on-device rather than in the cloud. Not sure about the situation today.

[–] [email protected] 22 points 11 months ago (1 children)
[–] [email protected] 33 points 11 months ago (1 children)

I don’t. Corps gonna corp, if they can. But I’ve checked this using all the development, networking, and energy monitoring tools at my disposal, and Apple’s E2E and on-device guarantee does appear to hold. For now.

Still, those who can should audit periodically, even if they’re only doing it for the settlement.

[–] [email protected] 8 points 11 months ago (1 children)
[–] [email protected] 5 points 11 months ago* (last edited 11 months ago) (1 children)

Security is in my interest, but yw

[–] [email protected] 6 points 11 months ago (1 children)

They were running CNN inference on a mobile device? I have no clue, but that would be costly, battery-wise at least.

[–] [email protected] 1 points 11 months ago* (last edited 11 months ago) (1 children)

They’ve been doing ML locally on devices for about a decade, since well before the current AI hype. They’ve also had dedicated ML inference cores in their chips for a long time, which helps with the battery life situation.
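To give a sense of why dedicated inference silicon matters for battery life, here's a rough back-of-the-envelope sketch of the multiply-accumulate (MAC) count for a single convolutional layer. The layer dimensions below are illustrative (MobileNet-ish), not taken from any actual Apple model:

```python
# Rough sketch: MAC count for one conv layer, the dominant cost of
# on-device CNN inference. Layer sizes are illustrative only.

def conv_macs(h, w, c_in, c_out, k):
    """MACs for a k x k convolution over an h x w x c_in input producing
    an h x w x c_out output (stride 1, 'same' padding)."""
    return h * w * c_out * (k * k * c_in)

# An early MobileNet-style layer: 112x112 feature map, 3x3 kernel,
# 32 input channels, 64 output channels.
macs = conv_macs(112, 112, 32, 64, 3)
print(f"{macs:,} MACs for one layer")  # 231,211,008
```

One layer is already a couple hundred million MACs, and a full network has dozens of layers, so running this on general-purpose CPU cores costs far more energy per operation than routing it to a dedicated low-power inference unit.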

[–] [email protected] 1 points 11 months ago

It couldn’t quite be a decade; a decade ago we’d only just gotten VGG. But sure, broad strokes, they’ve been doing local stuff, cool.