cypherpunks

joined 3 years ago
[–] [email protected] 9 points 6 hours ago (2 children)

I often see Rust mentioned at the same time as MIT-type licenses. Is it just a cultural thing that people who write Rust dislike Libre licenses?

The word "libre" in the context of licensing exists to clarify the ambiguity of the word "free", to emphasize that it means "free as in freedom" rather than "free as in beer" (aka no cost, or gratis) as the FSF explains here.

The MIT license is a "libre" license, because it does meet the Free Software Definition.

I think the word you are looking for here is copyleft: the MIT license is a permissive license, meaning it is not a copyleft license.

I don't know enough about the Rust community to say why, but from a distance my impression is that yes they do appear to have a cultural preference for permissive licenses.

[–] [email protected] 10 points 18 hours ago

fyi: GNU coreutils are licensed GPL, not AGPL.

there is so much other confusion in this thread, i can't even 🤦

[–] [email protected] 9 points 18 hours ago* (last edited 17 hours ago)

imo the pejorative connotation of that word, and homophobia generally, is ultimately rooted in misogyny

"always has been" meme with "wait, it's all about maintaining the patriarchy?"

46
2meirl4meirl (leminal.space)
 
[–] [email protected] 1 points 1 day ago (3 children)

They were not "passing the time" (any more than people on the ISS always are... 🤡).

The decision was made in August that they would join the Expedition 71/72 crew.

As a result, Stephanie Wilson and Zena Cardman did not get to fly on Crew-9.

Yes they could go back to Earth in case of an emergency and would have to take the other passengers with them prior to the end of their missions.

It would have been prior to the end of their own missions as well, since they became members of Expedition 72.

They couldn’t return to Earth simply because they decided they wanted to go back.

Do you think astronauts ever go to space with the expectation that they can "return to Earth simply because they decided they wanted to go back"? 😂

[–] [email protected] 5 points 1 day ago (5 children)

The fact is they were stranded.

The astronauts said "we don't feel abandoned, we don't feel stuck, we don't feel stranded", because they were prepared for this contingency. And they object to this nonsense framing too: "Help us change the rhetoric, help us change the narrative. Let’s change it to ‘prepared and committed.’"

They couldn’t go back to Earth when initially planned, nor at any time before the following mission ended.

You are simply mistaken. At all times there were multiple options available for them to go back early if it ever became necessary.

Changing the planned duration of a mission is not the same as being "stranded".

[–] [email protected] 31 points 1 day ago (1 children)

Apple makes the source code to all their core utilities available

Apple makes the source code for many open source things they distribute available, but often only long after they have shipped binaries. And many parts of their OS that they developed in-house, which could also be called "core utilities", are not open source at all.

Every Linux distro uses CUPS for printing. Apple wrote that and gave it away as free software.

Apple did not write CUPS. It was created by Michael R. Sweet in 1997, and was GPL-licensed and used on Linux distros before Mac OS X existed. Apple didn't want to be bound by the GPL so they purchased a different license for it in 2002.

Later, in 2007 they bought the source code and hired msweet to continue its development, and at some point the license of the FOSS version was changed to "GNU General Public License ("GPL") and GNU Library General Public License ("LGPL"), Version 2, with an exception for Apple operating systems."

In 2017 it was relicensed under Apache 2.0.

Finally, "In December 2019, Michael left Apple to start Lakeside Robotics. In September 2020 he teamed up with the OpenPrinting developers to fork Apple CUPS to continue its development. Today Apple CUPS is the version of CUPS that is provided with macOS® and iOS® while OpenPrinting CUPS is the version of CUPS being further developed by OpenPrinting for all operating systems."

[–] [email protected] 12 points 1 day ago

As others have said, it depends on the city but probably yes.

For example, in San Mateo, California, it would be an offense under title 11 of the Municipal Code:

11.12.050 EXCESSIVE ACCELERATION OF MOTOR VEHICLES.

It is unlawful for any person operating a motor vehicle within the City to so accelerate the same as to cause audible noise by tire friction on pavement or to cause the tires of said vehicle to leave skid marks upon the pavement, except when such acceleration is reasonably necessary to avoid a collision. Any person violating this section shall be guilty of a public offense.

[–] [email protected] 24 points 1 day ago* (last edited 1 day ago) (11 children)

stranded astronauts from the ISS

The Facts Behind the Delayed Return of U.S. Astronauts

In response to Musk’s claims, several astronauts took to X to refute the idea that the astronauts were purposefully abandoned. Andreas Mogensen, a former SpaceX astronaut from Denmark, posted: “What a lie. And from someone who complains about lack of honesty from the mainstream media.” In response to Mogensen, Elon replied: “You are fully rеtarded. SpaceX could have brought them back several months ago. I OFFERED THIS DIRECTLY to the Biden administration and they refused. Return WAS pushed back for political reasons. Idiot.”

Mogensen responded by stating, “Elon, I have long admired you and what you have accomplished, especially at SpaceX and Tesla. You know as well as I do, that Butch and Suni are returning with Crew-9, as has been the plan since last September. Even now, you are not sending up a rescue ship to bring them home. They are returning on the Dragon capsule that has been on ISS since last September.”

Steve Stich, the program manager for NASA’s Commercial Crew Program, said after the determination was made that Williams and Wilmore should not return on the Boeing Starliner, NASA officials met with SpaceX officials and considered “a wide range of options” and ultimately decided to attach the astronauts to the previously scheduled Crew-9 mission.

“When we looked at the situation at the time, we had a Crew-9 launch in front of us, it made sense to take the opportunity to bring Crew-9 up with just two seats and have Butch and Suni fill in and do the rest of the long duration mission,” said Dana Weigel, manager of NASA’s International Space Station Program.

“We thought the plan that we came up with made a lot of sense, and that, especially for Butch and Suni we know they’re experienced astronauts, they’re great in space,” Bowersox said. “We knew they’d be great additions to the crew and we knew that for most astronauts, spending extra time on orbit’s really a gift. And we thought they’d probably enjoy their time there. So we thought it was a good way to go … for a lot of reasons.”

see also: NASA astronauts — from space — discredit Trump claims they’re stranded

@[email protected] is any of the above news to you, or were you aware of this already when you posted your "stranded astronauts from the ISS" comments?

15
Cow tools (en.wikipedia.org)
 

Today, we’re excited to announce AI Labyrinth, a new mitigation approach that uses AI-generated content to slow down, confuse, and waste the resources of AI Crawlers and other bots that don’t respect “no crawl” directives. When you opt in, Cloudflare will automatically deploy an AI-generated set of linked pages when we detect inappropriate bot activity, without the need for customers to create any custom rules.

And it's "free"! (visibility into all of that traffic is more than sufficient payment for them 🤑)

Here are some perhaps-contradictory highlights from their blog post (emphasis mine), which I'm pretty sure was itself written with LLM assistance:

No real human would go four links deep into a maze of AI-generated nonsense.

When these links are followed, we know with high confidence that it's automated crawler activity, as human visitors and legitimate browsers would never see or click them. This provides us with a powerful identification mechanism, generating valuable data that feeds into our machine learning models. By analyzing which crawlers are following these hidden pathways, we can identify new bot patterns and signatures that might otherwise go undetected.

But as bots have evolved, they now proactively look for honeypot techniques like hidden links, making this approach less effective.

AI Labyrinth won’t simply add invisible links, but will eventually create whole networks of linked URLs that are much more realistic, and not trivial for automated programs to spot. The content on the pages is obviously content no human would spend time consuming, but AI bots are programmed to crawl rather deeply to harvest as much data as possible. When bots hit these URLs, we can be confident they aren’t actual humans, and this information is recorded and automatically fed to our machine learning models to help improve our bot identification. This creates a beneficial feedback loop where each scraping attempt helps protect all Cloudflare customers.

This is only the first iteration of using generative AI to thwart bots for us. Currently, while the content we generate is convincingly human, it won’t conform to the existing structure of every website. In the future, we’ll continue to work to make these links harder to spot and make them fit seamlessly into the existing structure of the website they’re embedded in. You can help us by opting in now.
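
For anyone unfamiliar with the "honeypot techniques like hidden links" the post refers to, here is a minimal sketch of the traditional version of the trick (not Cloudflare's implementation, which isn't public). The /trap/ path, file names, and log location are made up for illustration; it assumes a static site behind nginx with the default combined log format:

# a link that humans won't see or click: robots.txt forbids crawling it,
# and the anchor itself is hidden from visual browsers
cat >> index.html <<'EOF'
<a href="/trap/do-not-follow.html" style="display:none" rel="nofollow">trap</a>
EOF

cat >> robots.txt <<'EOF'
User-agent: *
Disallow: /trap/
EOF

# anything requesting /trap/ has ignored robots.txt *and* followed an
# invisible link, so it is almost certainly a bot; list those clients
# ($7 is the request path and $1 the client address in the combined log format)
awk '$7 ~ /^\/trap\// {print $1}' /var/log/nginx/access.log | sort | uniq -c | sort -rn

The twist described above is that instead of a static trap page, the crawler gets fed an endless maze of AI-generated pages to waste its time on, and the hits feed Cloudflare's bot-detection models.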

 

[–] [email protected] 6 points 4 days ago (2 children)

for example, on a linux distro, we could modify the desktop environment and make it waaaaay lighter by getting rid of jpg or png icons and just using pure svg on it.

this has largely happened; if you're on a dpkg-based distro try running this command:

dpkg -S svg | grep svg$ | sort

...and you'll see that your distro includes thousands of SVG files :)

explanation of that pipeline:

  • dpkg -S svg - this searches for files installed by the package manager which contain "svg" in their path
  • grep svg$ - this filters the output to only show paths which end with svg; that is, the actual svg files. the argument to grep is a regular expression, where $ means "end of line". you can invert the match (to see the paths dpkg -S svg found which only contain "svg" in the middle of the path) by writing grep -v svg$ instead.
  • the sort command does what it says on the tin, and makes the output easier to read

you can run man dpkg, man grep, and man sort to read more about each of these commands.
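
and a couple of variations on the same pipeline, in case you want numbers rather than a wall of paths (a rough sketch, still assuming a dpkg-based distro):

# just count the svg files dpkg knows about
dpkg -S svg | grep -c 'svg$'

# see which packages ship the most of them
dpkg -S svg | grep 'svg$' | cut -d: -f1 | sort | uniq -c | sort -rn | head

the cut -d: -f1 keeps just the package name (the part before the colon in dpkg -S output), and uniq -c | sort -rn turns the list into a per-package count with the biggest shippers of SVGs first.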
