CodeMonkey

joined 2 years ago
[–] CodeMonkey@programming.dev 40 points 4 months ago (7 children)

Maybe it is just my experience, but in the last decade, employers stopped trying to recruit and retain top developers.

I have been a full-time software engineer for more than a decade. In the 2010s, the mindset at tech giants seemed to be that they had to hire the best developers and do everything they could to keep them. The easiest way to do both was to be the best employer around. For example, Google had 20% time, many companies offered paid sabbaticals after a certain number of years, and every office had catering once a week (if not a free cafeteria). That way, employees would tell all of their friends how great it was to work for you, and if they decided to look for other work, they would have to give up their cushy benefits.

Then, a few years before the pandemic, my employer switched to a different health insurance company and got the expected wave of complaints (the price of this drug went up, my doctor is not covered). HR responded with "our benefits package is above industry averages". That is a refrain I have been hearing ever since, even after switching employers. The company is not trying to be the best employer that everyone wants to work at; they just want to be above average. They are saying "go ahead and look for another employer, but they are probably going to be just as bad".

Obviously, this is just my view, so it is very possible that I have just been unlucky with my employers.

[–] CodeMonkey@programming.dev 4 points 5 months ago (5 children)

pip is a perfectly usable package manager and is included in most Python distributions now. Is it perfect? No, but it is good enough for every team I have been on.

[–] CodeMonkey@programming.dev 2 points 6 months ago

I had a worse experience. My first internship was doing web development in ColdFusion. Why that language? Because when the company was first starting, none of the founders wanted to learn Linux/Apache administration, and CF ran on Windows.

Also, the front end development team did not have version control but shared code via a file server.

[–] CodeMonkey@programming.dev 9 points 6 months ago* (last edited 6 months ago) (5 children)

C does exactly what you tell it to, no more. Why waste cycles setting a variable to a zero state when a correct program will set it to whatever initial state it expects? It is not user-friendly, but it is performant.
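
To illustrate (a minimal sketch; the variable names are made up): C only zero-initializes static storage, while automatic locals are left indeterminate until the program assigns them.

```c
#include <stdio.h>

int zeroed_by_the_standard;   /* static storage: the standard guarantees it starts at 0 */

int main(void) {
    int state;                /* automatic storage: indeterminate until assigned; reading
                                 it before assignment is undefined behavior */
    int counter = 0;          /* pay for initialization only where you actually need it */

    state = 42;               /* a correct program sets the initial state it expects */

    printf("static:  %d\n", zeroed_by_the_standard);  /* always 0 */
    printf("counter: %d\n", counter);
    printf("state:   %d\n", state);
    return 0;
}
```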

[–] CodeMonkey@programming.dev 16 points 7 months ago (1 children)

Are we really doing fine? 4% Linux market share? Windows is the default?

I suspect that the issue hindering adoption is GNU and other userland projects, not the Linux kernel. Plenty of people use devices that pair a Linux kernel with an easy-to-use UI and popular software (see Android and Chromebook).

Many people would happily switch to a Linux-based OS that had the exact same GUI as their current OS and ran the exact same software, but that is not a realistic requirement in practice.

It is possible that Linux would see more adoption if more money were invested in drivers for a wider range of hardware, but having Linux kernel developers write drivers instead of hardware vendors is not a strategy that scales well.

[–] CodeMonkey@programming.dev 83 points 8 months ago (8 children)

Senior developer tip: squash the evidence.

[–] CodeMonkey@programming.dev 11 points 8 months ago (2 children)

This is not a supply chain attack; it is sudden, extreme enshittification. According to the article, the attacker also bought the GitHub repo, so all releases should be considered tainted. The community will have to find a fork from before the acquisition and hope that no pre-purchase favors were smuggled in.

[–] CodeMonkey@programming.dev 8 points 10 months ago (2 children)

In the early days of the Internet, there was a cottage industry of burning Linux ISOs to CDs and selling them.

[–] CodeMonkey@programming.dev 4 points 11 months ago

I work in Java, Golang, and Python, with Helm, CircleCI, bash scripts, Makefiles, Terraform, and Terragrunt for testing and deployment. There are other teams handling the C++ and SQL (plus whatever dark magic QA uses).

[–] CodeMonkey@programming.dev 9 points 11 months ago (1 children)

I am well aware of how learning works, but people tend to learn through comprehension and understanding. Completing phrases without understanding the language (or the concept of language) is the realm of LLMs and Scrabble players.

[–] CodeMonkey@programming.dev 58 points 11 months ago (7 children)

About 10 years ago, I read a paper that suggested mitigating a rubber hose attack by priming your sys admins with subconscious biases. I think this may have been it: https://www.usenix.org/system/files/conference/usenixsecurity12/sec12-final25.pdf

Essentially, you turn your user into an LLM for a nonsense language. You train them by having them read nonsense text. You then test them by giving them a sequence of text to complete and recording how quickly and accurately they respond. Repeat until the accuracy is at an acceptable level.
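
For a rough idea of what that test loop could look like (a toy sketch, not the scheme from the paper; the prompts and completions are made-up placeholders):

```c
/* Prompt the user with sequence stems, time the typed completion, and track accuracy. */
#define _POSIX_C_SOURCE 199309L
#include <stdio.h>
#include <string.h>
#include <time.h>

struct trial { const char *stem; const char *completion; };

int main(void) {
    const struct trial trials[] = {              /* hypothetical nonsense-language items */
        { "fro bla zim ", "kel" },
        { "zim kel fro ", "bla" },
    };
    const size_t n = sizeof trials / sizeof trials[0];
    size_t correct = 0;
    char answer[64];

    for (size_t i = 0; i < n; i++) {
        printf("Complete: %s", trials[i].stem);
        fflush(stdout);

        struct timespec start, end;
        clock_gettime(CLOCK_MONOTONIC, &start);
        if (!fgets(answer, sizeof answer, stdin)) break;
        clock_gettime(CLOCK_MONOTONIC, &end);

        answer[strcspn(answer, "\n")] = '\0';    /* strip the trailing newline */
        double seconds = (end.tv_sec - start.tv_sec)
                       + (end.tv_nsec - start.tv_nsec) / 1e9;

        int ok = strcmp(answer, trials[i].completion) == 0;
        correct += ok;
        printf("%s in %.2fs\n", ok ? "correct" : "wrong", seconds);
    }

    printf("accuracy: %zu/%zu\n", correct, n);   /* keep training until this clears a threshold */
    return 0;
}
```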

Even if an attacker kidnaps the user and sends in a body double with your user's ID, security key, and means of biometric identification, they will still not succeed. Your user cannot teach their doppelganger the pattern, and if the attacker tries to get the real user on a video call, the time it takes the user to read the prompt and dictate the response should introduce a detectable amount of lag.

The only remaining avenue the attacker has is, after dumping the body of the original user, to kidnap the family of another user and force that user to carry out the attack. The paper does not bother to cover this scenario, since the mitigation is obvious: your user conditioning should include a second module teaching users to value the security of your corporate assets above the lives of their loved ones.
