If you actually have deep knowledge in a specialty, then you describe yourself as that specialty. ‘Full stack engineer’ conveys that you don’t have a specialty/are a master of nothing/your skills are _-shaped.
Experience != expertise or skill. I have never met someone who was actually good at both. Maybe if your backend is just some SQL queries. I am a backend engineer and I’m adequate at front end but I’d never hire someone whose skills were merely adequate unless I thought they had the potential to reach ‘good’.
Scripting languages being languages that are traditionally source distributed.
- Source distributed means you can read the source if it hasn't been obfuscated. OTOH, it is trivial to decompile Java and C# so this isn't a real difference for those languages (which happen to be compiled languages). So it's only relevant for languages specifically compiled to machine code.
- Source distributed means the recipient needs to install something. OTOH, Java and C#, again.
So those are really the only two ways the distribution mechanism could matter, and even those points fall apart once you account for Java and C#.
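To see how thin the "you can read the source" distinction is, try the JDK's own disassembler on any compiled class (sketch; Main.class is a hypothetical class file):

    $ javap -c Main.class

javap prints readable bytecode for every method, and full decompilers like CFR or Fernflower go further and reconstruct near-original Java source.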
They tend to be much easier to write
I'm assuming you are not saying "real" languages should be hard to write...
run slower
Objective-C and Go run slower than C, and they're all compiled languages. Sure, an interpreter will be slower than a compiled language, but modern languages aren't simply interpreted (e.g. JIT compilation).
often but not always dynamically typed, and operate at a higher level
There are dynamically typed compiled languages, and high level compiled languages.
It’s not a demeaning separation, just a useful categorization IMO.
Calling one class of languages "real" and another class something else is inherently demeaning. I wouldn't have cared enough to type this if you used "compiled vs scripting" instead of "real vs scripting". Though I disagree with using "scripting" at all to describe a language since that's an assertion of how you use the language, not of the language itself. "Interpreted" on the other hand is a descriptor of the language itself.
As someone who loves C there are lots of languages that seem too limiting and high level, doesn’t mean they aren’t useful tho.
I personally can't stand Java because the language designers decided to remove 'dangerous' features like pointers and unsigned integers because apparently programmers are children who are incapable of handling the risk. On the other hand I love Go. It's high level enough to be enjoyable and easy to write, but if you want to get into the weeds you can.
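As a concrete (if contrived) sketch of what I mean by getting into the weeds, Go gives you unsigned integers as first-class citizens and, via unsafe, raw pointer reinterpretation when you accept the risk:

    package main

    import (
        "fmt"
        "unsafe"
    )

    func main() {
        // Unsigned arithmetic, no workarounds required.
        var checksum uint32 = 0xDEADBEEF
        checksum ^= checksum >> 16
        fmt.Printf("%#x\n", checksum)

        // Getting into the weeds: reinterpret a float64's bits in place.
        f := 3.14
        bits := *(*uint64)(unsafe.Pointer(&f))
        fmt.Printf("%#x\n", bits)
    }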
That line is blurring to the point where it barely exists any more. Compiled languages are becoming increasingly dynamic (e.g. JIT compilation, code generation at runtime) and interpreted languages are getting compiled. JavaScript is a great example: V8 identifies hot functions and compiles them to optimized machine code at runtime.
IMO the only definition of “real” programming language that makes any sense is a (Turing complete) language you can realistically build production systems with. Anything else is pointlessly pedantic or gatekeeping.
I’d rather use a language that doesn’t treat me like an incompetent child, removing unsigned ints because “they’re a source of bugs”.
Or use a statically typed language that’s actually modern instead of C
Why? In my experience using a real debugger is always the superior choice. The only time I don’t is when I can’t.
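In Go, for instance, spinning up Delve takes seconds. A typical session looks something like this (sketch; ./cmd/server and cfg are hypothetical):

    $ dlv debug ./cmd/server
    (dlv) break main.main
    (dlv) continue
    (dlv) print cfg
    (dlv) next

Breakpoints, stepping, and inspecting live values beat sprinkling print statements everywhere.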
Huh? Main file? Do you mean main package? A module can contain an arbitrary number of main packages but I don’t see how that has anything to do with this post. Also are you saying modules are equivalent to classes? That may be the strangest take I’ve ever heard about Go.
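For anyone unfamiliar, a single Go module carrying several main packages just looks like this (sketch; all names are hypothetical):

    example.com/mymodule/
        go.mod                      module example.com/mymodule
        cmd/server/main.go          package main
        cmd/worker/main.go          package main
        internal/store/store.go     package store, shared by both binaries

go build ./cmd/server and go build ./cmd/worker each produce a separate binary from the same module. None of this has anything to do with classes.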
The person who uses the shitty tool is a moron. The person who makes the shitty tool is an asshole. At least in this case where the shitty tool is actively promoting shitty PRs.
If the ask is, "Why was the hardware like that in the first place?", the answer is because it can’t be fully validated.
But that's not the question. There are two questions: Who should be responsible for patching hardware vulnerabilities? And if the answer is "the kernel" then should speculative but never demonstrated vulnerabilities be patched? Linus' answer is the hardware manufacturer, and no.
Is this really the hardware vendor’s problem though? It’s the consumer’s problem.
Maybe we're running into the ambiguity of language. If you mean to say, "Who does it cause a problem for? The consumer." then sure. On the other hand what I mean, and what I think Linus means, is "Who's responsible for the vulnerability existing? Hardware vendors. Who should fix it? Hardware vendors."
If the ask is why should a speculative fix go into the Kernel [...]
Depends on what you/we/they mean by "speculative". IMO, we need to do something (microcode, kernel patches, whatever) to patch Spectre and Meltdown. Those have been demonstrated to be real vulnerabilities, even if no one has exploited them yet. But "speculative" can mean something else. I'm not going to read all the LKML emails so maybe they're talking about something else. But I've seen plenty of, "Well if X, Y, and Z happen then that could be a vulnerability." For that kind of speculative vulnerability, one that has not been demonstrated to be a real vulnerability, I am sympathetic to Linus' position.
If your job is to make websites and you make sites that don’t work on a browser that has over 100 million users you’re not doing your job.