this post was submitted on 21 Feb 2025
73 points (89.2% liked)

[–] [email protected] 16 points 1 week ago* (last edited 1 week ago) (18 children)

This is the web chat client/app, just like OpenAI sharing data with Microsoft, or Copilot doing the same. If you self-host these LLMs, your data stays within your LAN.
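A minimal sketch of what "stays within your LAN" can look like in practice, assuming an Ollama server running an R1 distill on a machine at 192.168.1.50 (the host, port, and model tag here are placeholders for whatever you actually run):

```python
# Sketch: send a prompt to a model hosted on your own hardware.
# Nothing here touches an external API; the request goes to a box on the LAN.
import requests

OLLAMA_URL = "http://192.168.1.50:11434/api/generate"  # assumed local Ollama endpoint

resp = requests.post(
    OLLAMA_URL,
    json={
        "model": "deepseek-r1:32b",  # assumed local model tag
        "prompt": "Explain the difference between TLS 1.2 and TLS 1.3.",
        "stream": False,             # ask for a single JSON response
    },
    timeout=600,
)
resp.raise_for_status()
print(resp.json()["response"])  # generated text never leaves your network
```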

[–] [email protected] 6 points 1 week ago* (last edited 1 week ago) (13 children)

You can't practically self-host DeepSeek R1: the full model is a 671B-parameter MoE that simply doesn't fit on consumer hardware.

Look, I use the 32B distill on my 3090 every day, but it is not the same thing as full R1, and people need to stop conflating the two.

And (theoretically) API usage through one of many R1 providers is private.
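For comparison, a hedged sketch of the provider route: an OpenAI-compatible R1 endpoint called with the official openai client. The base URL, model name, and environment variable are hypothetical placeholders, and whether this is actually "private" depends entirely on the provider's retention policy, which is exactly what the replies below question.

```python
# Sketch: use R1 through a hosted provider instead of self-hosting.
# Placeholder endpoint and key; any OpenAI-compatible R1 provider works the same way.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://api.example-r1-provider.com/v1",  # hypothetical provider endpoint
    api_key=os.environ["R1_PROVIDER_API_KEY"],          # hypothetical key variable
)

completion = client.chat.completions.create(
    model="deepseek-r1",  # model identifier varies per provider
    messages=[{"role": "user", "content": "Explain CVSS scoring in two sentences."}],
)
print(completion.choices[0].message.content)
```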

[–] [email protected] 6 points 1 week ago* (last edited 1 week ago) (2 children)

Do you know of a provider that is actually private? The few privacy policies I checked all had something like "We might keep some of your data for some time for anti-abuse or other reasons"...

[–] [email protected] 1 points 1 week ago

I mean, not with certainty. If the risk of your input leaking is that great, you can just host your own VM with the 32B to be more certain.

[–] [email protected] 0 points 1 week ago

Trust me bro, they are private
