this post was submitted on 27 Feb 2025
981 points (96.8% liked)

Technology


Per one tech forum this week: “Google has quietly installed an app on all Android devices called ‘Android System SafetyCore’. It claims to be a ‘security’ application, but whilst running in the background, it collects call logs, contacts, location, your microphone, and much more making this application ‘spyware’ and a HUGE privacy concern. It is strongly advised to uninstall this program if you can. To do this, navigate to ‘Settings’ > ‘Apps’, then delete the application.”

[–] [email protected] 14 points 3 hours ago

Google says that SafetyCore “provides on-device infrastructure for securely and privately performing classification to help users detect unwanted content”

Cheers Google but I'm a capable adult, and able to do this myself.

[–] [email protected] 19 points 8 hours ago

Thanks for this, just uninstalled it. Google are arseholes.

[–] [email protected] 39 points 11 hours ago (9 children)
[–] [email protected] 26 points 9 hours ago (3 children)

To quote the most salient post

The app doesn't provide client-side scanning used to report things to Google or anyone else. It provides on-device machine learning models usable by applications to classify content as being spam, scams, malware, etc. This allows apps to check content locally without sharing it with a service and mark it with warnings for users.

Which is a sorely needed feature to tackle problems like SMS scams
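To make the distinction concrete: "on-device classification" just means the check is an ordinary local function call, so nothing has to leave the phone for a message to get flagged. Here is a deliberately toy sketch of that pattern in Python (this is in no way SafetyCore's actual model; the keyword list and threshold are made up for illustration):

```python
# Toy sketch of "on-device" classification: the classifier is a plain
# local function, so checking a message involves no network call at all.
# NOT SafetyCore's model -- hints and threshold are invented.

SPAM_HINTS = {"prize", "urgent", "verify", "gift card", "wire transfer"}

def classify_sms(text: str) -> str:
    """Return 'suspicious' or 'ok' using only local heuristics."""
    lowered = text.lower()
    hits = sum(1 for hint in SPAM_HINTS if hint in lowered)
    return "suspicious" if hits >= 2 else "ok"

print(classify_sms("URGENT: verify your account to claim your prize"))  # suspicious
print(classify_sms("Running late, see you at 7"))                       # ok
```

A real implementation would swap the keyword heuristic for an ML model, but the privacy-relevant property is the same: the content being checked is never sent to a service.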

[–] [email protected] 1 points 2 hours ago (1 children)

You don't need advanced scanning technology, running on every device with access to every single bit of data you've ever seen, to detect scams. You need telco operators to stop forwarding forged message headers and… that's it. Cheap, efficient, and zero risk of invading privacy through a piece of software you did not need but that was put there "for your own good".

[–] [email protected] 3 points 1 hour ago

I will perhaps be nitpicking, but... not exactly, not always. People get their shit hacked all the time due to poor practices. And then those hacked things can send emails and texts and other spam all they want, and it'll not be forged headers, so you still need spam filtering.

[–] [email protected] 8 points 7 hours ago (1 children)

Why do you need machine learning for detecting scams?

Is someone in 2025 trying to help you out of the goodness of their heart? No. Move on.

[–] [email protected] 4 points 6 hours ago (1 children)

If you want to talk money, then it is in businesses' best interest that money from their users is spent on their products, not scammed away through the use of their products.

Secondly, machine learning algorithms can detect patterns in ways a human can't. In some circles I've read that the programmers themselves can't decipher from the code how the end result is produced, just that the inputs guide it. Besides the fact that scammers can circumvent any carefully laid-down antispam, antiscam, or antivirus measure built with traditional software, a learning algorithm will be orders of magnitude harder to bypass. Or easier. Depends on the algorithm.

[–] [email protected] 1 points 5 hours ago

I don't know the point of the first paragraph... scams are bad? Yes? Does anyone not agree? (I guess scammers.)

For the second, we are talking in the wild abstract, so I feel comfortable pointing out that every automated system humanity has come up with so far has pulled in our own biases, and since AI models are trained by us, this should be no different. Second, if the models are fallible, you cannot talk about success without talking about false positives. I don't care if it blocks every scammer out there if it also blocks a message from my doctor. Until we have data on agreement between these new algorithms and the desired outcomes, it's pointless to claim they are better at X.

[–] [email protected] 6 points 7 hours ago (1 children)

If the cellular carriers were forced to verify that caller ID (or its SMS equivalent) was accurate, SMS scams would disappear (or at least be weakened). Google shouldn't have to do the job of the carriers, and if they wanted to implement this anyway, they should let the user choose which service performs the task, similar to how they let the user choose which "Android System WebView" should be used.

[–] [email protected] 4 points 6 hours ago

Carriers don't care. They are selling you data. They don't care how it's used. Google is selling you a phone. Apple held down the market for a long time for being the phone that has some of the best security. As an android user that makes me want to switch phones. Not carriers.

[–] [email protected] 8 points 8 hours ago

If the app did what OP is claiming, then the EU would have a field day fining Google.

[–] TheGoddessAnoia -2 points 4 hours ago (1 children)

True or not, one can avoid the whole issue by using your phone as a phone, maybe to send texts, with location, mike, and camera switched off permanently, and all the other apps deleted or disabled. Sure, Google will still know you called your SO daily and your Mom once a week (NOT ENOUGH!), and that you were supposed to pick up the dry cleaning last night (did you?). Meh. If that's what floats the Surveillance Society's boat, I am not too worried.

[–] [email protected] 5 points 3 hours ago

People can go further than that and install a ROM for their phone that doesn't have any Google apps on it. People can even use applications that normally require Google Play Services by using microG, which spoofs things. You can also root your phone with Magisk and use apps to block anything leaking anything else.

[–] [email protected] 10 points 11 hours ago

laughs in GrapheneOS

[–] [email protected] 2 points 8 hours ago

Great, it'll have to plow through ~30GB of 1080p recordings of darkness and my upstairs neighbors living it up in the AMs. And nothing else.

[–] [email protected] 8 points 12 hours ago (1 children)

More information: It's been rolling out to Android 9+ users since November 2024 as a high priority update. Some users are reporting it installs when on battery and off wifi, unlike most apps.

App description on Play store: SafetyCore is a Google system service for Android 9+ devices. It provides the underlying technology for features like the upcoming Sensitive Content Warnings feature in Google Messages that helps users protect themselves when receiving potentially unwanted content. While SafetyCore started rolling out last year, the Sensitive Content Warnings feature in Google Messages is a separate, optional feature and will begin its gradual rollout in 2025. The processing for the Sensitive Content Warnings feature is done on-device and all of the images or specific results and warnings are private to the user.

Description by Google: Sensitive Content Warnings is an optional feature that blurs images that may contain nudity before viewing, and then prompts with a “speed bump” that contains help-finding resources and options, including to view the content. When the feature is enabled, and an image that may contain nudity is about to be sent or forwarded, it also provides a speed bump to remind users of the risks of sending nude imagery and preventing accidental shares. - https://9to5google.com/android-safetycore-app-what-is-it/

So it looks like something that sends pictures from your messages (at least initially) to Google for an AI to check whether they're "sensitive". The app is 44 MB, so too small to contain a useful AI, and I don't think this could happen on-phone, so it must require sending your on-phone data to Google?

[–] [email protected] 3 points 4 hours ago

I guess the app then downloads the required models.
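That guess matches a common pattern: ship a small app binary and fetch (and cache) the ML models on demand. A minimal sketch of the idea, with entirely made-up file names and a fake "download" standing in for a real model-distribution service:

```python
# Sketch of the "small app, big models" pattern the commenter guesses at:
# the installed package stays small because models are fetched and cached
# on first use rather than bundled. Names and sizes are invented.
import tempfile
from pathlib import Path

MODEL_DIR = Path(tempfile.gettempdir()) / "safetycore_demo_models"

def ensure_model(name: str, fetch) -> Path:
    """Return a cached model file, downloading it only if missing."""
    MODEL_DIR.mkdir(parents=True, exist_ok=True)
    path = MODEL_DIR / name
    if not path.exists():
        path.write_bytes(fetch(name))  # stand-in for a real network download
    return path

# Fake fetch: 1 KiB of zeros instead of a real TFLite model.
model = ensure_model("classifier.tflite", lambda n: b"\x00" * 1024)
print(model.exists(), model.stat().st_size)  # True 1024
```

This also explains why the visible APK size says little about what the app can eventually do once its models are in place.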

[–] [email protected] 6 points 12 hours ago (1 children)

The countdown to Android's slow and painful death has been ticking for a while now.

It has become over-engineered and no longer appealing from a developer's viewpoint.

I still write code for Android because my customers need it - and will for a while - but I've stopped writing code for Apple's i-things, and I'm researching alternatives for Android. Rolling my own environment with FOSS components on top of Raspbian already looks feasible. For robots and automation, I already use it.

[–] [email protected] 1 points 7 hours ago (1 children)

What's over-engineered about it?

[–] [email protected] 1 points 1 hour ago* (last edited 1 hour ago)

In my experience, the API has iteratively made it ever harder for applications to automatically perform previously easy jobs, jobs which are trivial under ordinary Linux (e.g. become an access point, set the SSID, set the IP address, set the PSK, start a VPN connection, go into monitor/inject mode, access a USB device, write files to a directory of your choice, install an APK). Now there's a literal thicket of API calls and declarations to make before you can do some of these things (and some are gone forever).

The obvious reason is that Google tries to protect a billion inexperienced people from scammers and malware.

But it kills the ability to do non-standard things, and the concept of your device being your own.

And a big problem is that so many apps rely on advertising for their income stream. Spying a little has been legitimized and turned into a business under Android. To maintain control, the operating system then has to be restrictive of apps. Which pisses off developers who have a trusting relationship with their customers and want their apps to have freedom to operate.

[–] [email protected] 40 points 18 hours ago

People don't seem to understand the risks presented by normalizing client-side scanning on closed source devices. Think about how image recognition works. It scans image content locally and matches to keywords or tags, describing the person, objects, emotions, and other characteristics. Even the rudimentary open-source model on an immich deployment on a Raspberry Pi can process thousands of images and make all the contents searchable with alarming speed and accuracy.

So once similar image analysis is done on a phone locally, and pre-encryption, it is trivial for Apple or Google to use that for whatever purposes their use terms allow. Forget the iCloud encryption backdoor. The big tech players can already scan content on your device pre-encryption.

And just because someone does a traffic analysis of the process itself (safety core or mediaanalysisd or whatever) and shows it doesn't directly phone home, doesn't mean it is safe. The entire OS is closed source, and it needs only to backchannel small amounts of data in order to fuck you over.
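The "small amounts of data" point is easy to quantify: once classification has happened locally, the labels describing an image are a few dozen bytes, orders of magnitude smaller than the image itself. A quick back-of-the-envelope illustration (the photo size and tag set are hypothetical):

```python
# Why a backchannel only needs "small amounts of data": locally extracted
# labels are tiny compared to the media they describe.
# Numbers below are hypothetical, chosen only to show the magnitudes.
import json

photo_size_bytes = 4 * 1024 * 1024            # a typical ~4 MB photo
tags = {"people": 2, "location_hint": "beach", "document": False}
payload = json.dumps(tags).encode("utf-8")    # all a backchannel would need

print(len(payload))                           # a few dozen bytes
print(photo_size_bytes // len(payload))       # tens of thousands of times smaller
```

Exfiltrating full photos would show up in any traffic analysis; exfiltrating classification results would not.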

Remember the original justification for clientside scanning from Apple was "detecting CSAM". Well they backed away from that line of thinking but they kept all the client side scanning in iOS and Mac OS. It would be trivial for them to flag many other types of content and furnish that data to governments or third parties.

[–] [email protected] 34 points 18 hours ago (2 children)

I didn't have it in my app drawer but once I went to this link, it showed as installed. I un-installed it ASAP.

https://play.google.com/store/apps/details?id=com.google.android.safetycore&hl=en-US

[–] [email protected] 12 points 15 hours ago (2 children)

Thanks. Just uninstalled. What cunts.

[–] [email protected] 6 points 10 hours ago (2 children)

Do we have any proof of it doing anything bad?

Taking Google's description of what it is it seems like a good thing. Of course we should absolutely assume Google is lying and it actually does something nefarious, but we should get some proof before picking up the pitchforks.

[–] [email protected] 2 points 3 hours ago (1 children)

Whether the people at Google who did this know they are evil or think they are not evil doesn't really even matter. Having a phone app that automatically scans all your photos should scare the shit out of you. At the very least it wastes your battery and slows down your phone.

[–] [email protected] 1 points 2 hours ago

If it provided a feature to automatically block incoming dick pics (which Google claims it's for), was fully local, and only scanned incoming messages, not my own gallery (which is what Google claims), I would likely find it useful. There is nothing wrong with the idea in general.

At the very least it wastes your battery

Again, if it's an optional feature that you can choose to turn on or off, there is nothing wrong with that.

[–] [email protected] 8 points 9 hours ago* (last edited 9 hours ago) (3 children)

Google is always 100% lying.
There are too many instances to list and I'm not spending 5 hours collecting examples for you.
They removed don't be evil long time ago

[–] [email protected] 12 points 7 hours ago* (last edited 7 hours ago)

They removed don’t be evil long time ago

See, this is why I like proof. If you go to Google's Code of Conduct today, or any archived version, you can see for yourself that it was never removed. Yet everyone believed the clickbait articles claiming so. What happened is they moved it from the header to the footer; clickbait media reported that as "removed" and everyone ran with it, even though anyone can easily see it's not true, and it takes 30 seconds to verify, not 5 hours.

Years later you are still repeating something that was made up just because you heard it a lot.

Of course Google is absolutely evil, and the phrase was always meaningless whether it's there or not, but we can't just make up facts because they fit our world view. And we have to be aware of confirmation bias. Google removing "don't be evil" sounds about right for them, right? It makes perfect sense. But it just plain didn't happen.

[–] [email protected] 1 points 6 hours ago

Maybe you should, given that your closing sentence is incorrect and just bolsters the fact that we shouldn't blindly take everything we see at face value.

[–] [email protected] 2 points 8 hours ago

Why check any sources first when you can just blindly rage and assume the worst?

https://grapheneos.social/@GrapheneOS/113969399311251057

[–] [email protected] 11 points 13 hours ago

I uninstalled it, and a couple of days later, it reappeared on my phone.
