this post was submitted on 26 Feb 2026
67 points (94.7% liked)

Privacy


New study shows smart chatbots can figure out who you really are from just a few posts... and it only costs a couple of dollars.

all 22 comments
[–] homesweethomeMrL@lemmy.world 45 points 3 weeks ago (3 children)

Step 2: Search the whole internet: It quietly checks LinkedIn, Google, other Reddit accounts, etc., to find possible real people who match those clues.

Oh. Whew.
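The quoted "Step 2" is the core of the attack the article describes: pull coarse attributes out of someone's posts, then score public profiles against those clues. A minimal sketch of that idea, with simple regexes standing in for the LLM extraction step and a hardcoded candidate list standing in for a web search (all names and data below are made up for illustration):

```python
# Toy version of the deanonymization pipeline: (1) extract clues from
# posts, (2) rank candidate profiles by how many clues they match.
# A real attack would use an LLM for step 1 and web search for step 2.
import re

def extract_clues(posts: list[str]) -> dict[str, str]:
    """Pull age/city/job hints out of free-text posts with crude patterns."""
    clues = {}
    for post in posts:
        if m := re.search(r"\bI(?:'m| am) (\d{2}) years old\b", post):
            clues["age"] = m.group(1)
        if m := re.search(r"\bI live in (\w+)\b", post):
            clues["city"] = m.group(1)
        if m := re.search(r"\bwork as an? (\w+)\b", post):
            clues["job"] = m.group(1)
    return clues

def best_match(clues: dict[str, str], candidates: list[dict]) -> dict:
    """Return the candidate profile matching the most extracted clues."""
    return max(candidates, key=lambda c: sum(c.get(k) == v for k, v in clues.items()))

posts = ["I'm 34 years old and I live in Berlin", "I work as a nurse there"]
candidates = [
    {"name": "A", "age": "34", "city": "Berlin", "job": "nurse"},
    {"name": "B", "age": "34", "city": "Munich", "job": "teacher"},
]
print(best_match(extract_clues(posts), candidates)["name"])  # A
```

The unsettling part the study highlights is not this matching logic, which is trivial, but that LLMs make the extraction step cheap and general instead of needing hand-built patterns like these.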

[–] gigachad@piefed.social 24 points 3 weeks ago

To be honest, internet search has gotten so shitty that soon it'll be a genuinely impressive skill to search the internet efficiently

[–] floquant@lemmy.dbzer0.com 7 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

I thought Lemmy comments might be indexed anyway, but neither Kagi nor DDG turned up anything for my username. Wonder if it's different for other instances?

Edit: Kagi actually does show some threads I've participated in by enabling the "Fediverse Forums" filter

[–] Draconic_NEO@lemmy.dbzer0.com 3 points 3 weeks ago

Not surprising, considering those challenges against AI scrapers likely also affect search engine crawlers. Stuff can still get through if it's federated to other servers that don't have such measures, but if you don't participate in communities on those instances, it's less likely.

[–] Mac@mander.xyz 2 points 3 weeks ago* (last edited 3 weeks ago)

Searched "lemmy floquant" on DDG and one of your comments is the second result.

First result was the user profile of the same name but different instance.

[–] Draconic_NEO@lemmy.dbzer0.com 4 points 3 weeks ago* (last edited 3 weeks ago)

Yeah, call this article what it is: clickbait fearmongering.

[–] x550@lemmy.dbzer0.com 30 points 3 weeks ago (2 children)

Nothing of substance here. Stylometric analysis was already a thing. It's easy enough to defeat, and with correct opsec you can avoid it: burn accounts regularly, use accounts for specific topics or share accounts with others, and don't post personal details online. On the internet, everyone is a cat.
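For readers who haven't met stylometric analysis: the classic approach fingerprints an author by the relative frequency of common function words, which people use unconsciously and consistently. A toy sketch (the word list and sample posts are invented; real attribution uses far richer features):

```python
# Toy stylometric fingerprint: represent each text by the relative
# frequency of common function words, then compare with cosine similarity.
from collections import Counter
import math

FUNCTION_WORDS = ["the", "and", "of", "to", "a", "in", "that", "is",
                  "i", "it", "you", "but", "so", "not", "can"]

def fingerprint(text: str) -> list[float]:
    """Relative frequency of each function word in the text."""
    words = text.lower().split()
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

post_a = "so i think the point is that you can not really hide it"
post_b = "so you know i think it is the same and you can see that"
post_c = "Regrettably, empirical evidence contradicts aforementioned claims."

sim_same = cosine(fingerprint(post_a), fingerprint(post_b))
sim_diff = cosine(fingerprint(post_a), fingerprint(post_c))
print(sim_same > sim_diff)  # the two chatty posts look more alike
```

This is also why the opsec advice above works: burning accounts and splitting topics shrinks the text sample per identity, and sharing accounts muddies the fingerprint itself.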

[–] other_cat@piefed.zip 28 points 3 weeks ago

I certainly am!

[–] umbrella@lemmy.ml 3 points 3 weeks ago

what kind of opsec can defeat it?

[–] floquant@lemmy.dbzer0.com 20 points 3 weeks ago (1 children)

I don't like having to be vague about my age, nationality, job, etc., because I'd rather be honest and relate to others online, but sadly it's a necessity in the modern landscape

[–] CucumberFetish@lemmy.dbzer0.com 10 points 3 weeks ago (1 children)

That's why you have multiple accounts: some for above-board things, which can have your personal details, and others for eating the rich

[–] GamingChairModel@lemmy.world 7 points 3 weeks ago

One account that can be correlated to place/city, willing to discuss local news and issues.

One account that can be correlated to family status, willing to mention details about relationships.

One account that can be correlated to career, willing to mention details about educational background, industry news, the job market, the workplace, etc.

One account that can be correlated to each distinct hobby or interest. Some interests can correlate among themselves (like an all sports account that discusses multiple sports) and are safe to discuss on a single account. Like my current account that is tech oriented, including some stuff about games or Linux or networking or even the tech industry. But keep the different interests on separate accounts.

Then different accounts for topics that you consider controversial or private.

And, preferably, spread all those accounts across multiple instances so that instance admins can't link accounts from metadata (client, OS, IP address, email verification), use completely unique usernames, and avoid unique markers like esoteric phrases, unique autocorrect errors, etc.

Even if an adversary can link two accounts, they probably can't link all of them.

[–] CucumberFetish@lemmy.dbzer0.com 13 points 3 weeks ago (2 children)

Looks like the LLM can be used to cross-reference data from your pseudo-private account against your public account. What a surprise

[–] pivot_root@lemmy.world 11 points 3 weeks ago (1 children)

It's a good thing that I work for Dick's Fish & Chips located on the main street of a bustling city in Antarctica. I wouldn't want the LLM to get it wrong.

The same Dick's Fish & Chips where in 1998, The Undertaker threw Mankind off Hell In A Cell, and plummeted 16 ft through an announcer's table?

That Dick's Fish & Chips?

[–] AmbitiousProcess@piefed.social 5 points 3 weeks ago

Yup. The only difference between this and what any individual could already do is time and scale.

Data brokers and government surveillance organizations have already had specialized tools to do this sort of thing for a while now, it's just that LLMs reduce the complexity and specialization needed to actually make an implementation that works well as an individual person.

[–] HubertManne@piefed.social 11 points 3 weeks ago

And if they're wrong, they still get the couple of dollars, so win-win. I can unmask anyone you want online from just a few posts and a name randomizer.

[–] Willoughby@piefed.world 8 points 3 weeks ago

Oh no, who's making them use all those terrible services that whore them out like that?

[–] corsicanguppy 3 points 3 weeks ago

upto

If the 'journalist' can't use a spell-check, I don't trust his opinions on automation.