i read that as training spicy autocomplete on wolfram alpha instead of another spicy autocomplete
Forced closure of that pipeline would probably impact Hungary the most, and maybe Serbia
also, relying on spicy autocomplete when trying to put together a deadly device sounds like cyberpunk-flavored darwin award material
i don't understand the "safety" angle here. if chatgpt can output authoritative-looking, sentence-shaped strings about pipe bombs, it's only because similar content about pipe bombs is already available on the wide-open internet. if the model is closed, then at worst they'd have to monitor its use (it's not like google blocks any similar information from showing up). if the model is open, then no safeguards make sense in the first place. i guess it's more about legal liability for openai? now they can ignore it, with all these "ai safety" bills gone (for now)
you enjoy it? suspicious
many such cases...
CCP is out after their precious ~~bodily fluids~~ internet points
nuclear winter is not a thing that can possibly happen, from what i understand
sagan et al overstated the amount of soot lofted into the upper atmosphere by over 10x, for no particular reason other than trying to make a point
notice how no one talked about it after desert storm? the oilfield fires provided negative evidence
This only works well for fast things tho
the word you're looking for is "jedzie"
strictly speaking, polish (and i think all slavic languages?) doesn't have a generic verb like "go". you have to specify: you can ride, drive, walk, sail, swim, roll and so on, but you can't just "go". that verb (which means "[there it] drives") would usually be used for land vehicles; for boats we'd use "płynie" ("swims", "sails", or "flows" depending on context)
that's a tiny amount of harm reduction if there are other ways to get there
it can go the opposite way: some segment of promptfondlers specifically went after one open-source, locally run model because it was "uncensored" (i think it was mistral). the logic was: there's no search query going out, so you can "look up" anything and no one would be any wiser. this extremely charitably assumes that llm training does a kind of lossy compression on all the data it devours, and since they took everything, it's basically almost like a worse google search
if there are steps like "put a thing in pipe. make sure to weld ends shut" then it's also harm reduction, just for everyone else. imagine getting eldest son'd by a bot, pathetic