Marzepansion

joined 2 years ago
[–] Marzepansion@programming.dev 32 points 2 years ago* (last edited 2 years ago) (1 children)

I don't agree with what this proposal is aiming to do (it also goes against prior EU-related privacy rulings), but unfettered free speech isn't as "free" as the average American thinks it is. Besides, the EU already doesn't have absolute free speech: many countries ban Nazi-related speech for obvious historical reasons.

I'd reconsider using America's "free speech" as a model, as they barely practice what they preach. Sure, they have free speech, but they lack privacy protection mechanisms, which lets their police skirt the rules and obtain evidence using tools that completely breach the veil of privacy, something many EU countries (including my own) have ruled can never be used. The scope of intel gathering their intelligence community is already capable of is at a level where privacy no longer exists and all you're left with is the illusion of it.

What I'm saying is: sure, this proposal is bad, but what we need isn't free speech, it's protected privacy. That's something the EU is already having decent success with (compared to the US, where this is conveniently forgotten as technology improves; see the earlier police argument for where that leads). Speech isn't going to be the only problem, either, as cameras gain the ability to do facial recognition and track you everywhere (something I know the EU has banned, see the "AI Act"), and more technology allows for other types of tracking.

[–] Marzepansion@programming.dev 5 points 2 years ago* (last edited 2 years ago)

You raised an issue that the other bullet point has the solution for; I really don't see how these are "key differences".

> In Rust there is always only one owner, while in C++ you can leak ownership if you are using shared_ptr.

That's what unique_ptr is for. If you don't want to leak ownership, a unique pointer is exactly what you're looking for.

> In Rust you can borrow references you do not own safely, and in C++ there is no guarantee a unique_ptr can be shared safely.

Well yeah, because that's what shared_ptr is for. If you need to borrow references, then it's a shared lifetime. If the code doesn't participate in the lifetime, then of course you can pass a reference safely, even to whatever a unique_ptr points to.

The last bullet point, sure, that's a key difference, but it's partially incorrect. I deal with performance (and write Rust code professionally); this set of optimizations isn't that impactful in an average large codebase. There's no magical optimization that makes objects get destroyed faster, but what you can optimize away are aliasing issues, which languages like C++ and C have trouble with (which is why vendor-specific keywords like __restrict exist). This can have a profound impact in very small segments of your codebase, though the average programmer will rarely run into that case.

[–] Marzepansion@programming.dev 8 points 2 years ago (1 children)

I participated in this, and have to say it was fun. It's something I've said for years could make (at least) linear algebra lessons more interesting to young people. Shaders are the epitome of "imagery through math", and if something like this had been included in my linear algebra classes I would have paid much more attention in school.

Funny that this is now my day job. I'm definitely looking forward to the video IQ is making about this event.

To explain some of the error pixels: the way you got a pixel on the board was by elaborately writing down all operations in detail (yes, this included even simple multiplications). The goal wasn't whether the pixel was correct or not, and depending on the location of your pixel the calculation could be a bit more complex; what mattered was that you had written down the steps to get your result in as much detail as possible.

More than likely, simple mistakes in some of these people's calculations made them take a wrong branch when dealing with conditionals. Hopefully the post-mortem video will shed some light on these.

[–] Marzepansion@programming.dev 4 points 2 years ago

He's making a post-mortem video about this experiment, so the submissions might still be released. But I can see why it would be better not to share them (aside from privacy/legal concerns, as there was no such release agreement): some of the contributors used their real names, and I may be one of them. It could be a bit shameful to see a wrong pixel attached to your real name. Some people might have submitted their initial draft and then, due to circumstances, been unable to update their results within the several-hour window that was afforded to you.

Luckily my pixels look correct though.

[–] Marzepansion@programming.dev 5 points 2 years ago* (last edited 2 years ago)

It's perhaps for the best that patch notes are written by programmers and not linguists. Incorrectly using a (harmless) phrase is perfectly okay; it doesn't detract from the important bits of the announcement at all.

edit: damn, that's a big reaction to an accidental mistake someone made in a patch notes highlight article.

[–] Marzepansion@programming.dev 1 points 2 years ago

Ah, I see now what you meant. I thought you were being sarcastic due to the italics, my bad!

[–] Marzepansion@programming.dev 3 points 2 years ago (2 children)

If you read the post linked there, you'll see it's about devs discussing Serde's actions in the GH issue. How are they not related?

[–] Marzepansion@programming.dev 5 points 2 years ago* (last edited 2 years ago)

Hey, game dev here (well, currently working for a company that works with many dev studios), graphics programmer in particular. It depends on what you want to do: is your primary usage going to be programming? You can get away with an integrated graphics card as long as you stick to programmer-art quality levels of environment detail (which you would normally do anyway to test code).

You can get pretty far into the dev process with minimal need for a detailed 3D env.

There will be a performance hit with an external GPU simply because of the physics involved (proximity and type of connection matter a lot in computers; this is why your CPU has L-caches on its cores).

I actually always have a crap GPU lying around, because it's also the best way to test out performance issues. Nothing drives you to improve perf like a choppy framerate ;)

Most of my colleagues at my previous company were rocking 960s or worse until last year, myself included. And we were a team of graphics programmers working on GPU-driver-like software.

I'd say try it out: download Godot and an example project, run it, and see how well it performs. If the perf looks fine, congrats, it's a good idea. If the performance is bad, look at the quality of the example project and ask yourself "will I make anything visually more complex?" If not, congrats, everything is good. Otherwise, consider an external GPU if you think that's best.

I'd suggest getting a desktop, though, if you ever decide to keep going down the game dev line, just to keep upgrade costs low. I operate on a roughly five-year cadence for parts, alternating mainly between my CPU and GPU. So I don't replace the entire thing, but in two years I'll be upgrading my CPU and in four it'll be my GPU. I also have a crap laptop for when I'm on the road, and use the desktop for my actual work. I can always remote-desktop into it if I need more power to compile or render.

To put your hardware in perspective: it would have beaten my desktop of 15 years ago, and I was already doing game dev back then just fine. So you could definitely do game dev with it; the big question is what type of game dev.

(sorry for the chaotic nature of this response, hope you got something helpful out of it)

[–] Marzepansion@programming.dev 21 points 2 years ago* (last edited 2 years ago) (2 children)

Besides, some countries in the EU already have electronic ID systems. X could just contact those to verify I am who I claim to be, without this weird "we need a picture of you, and a look through your webcam" step. Banks don't need to do this to verify who I am, so I don't see why "X" needs this privacy-invading process.

Thankfully I don't care about X (lol), and with more and more of my industry moving to Mastodon, I'm quite happy that I need it less and less to keep up with papers and articles.

[–] Marzepansion@programming.dev 42 points 2 years ago (1 children)

As with all jokes, it matters who the audience is. My friends can make off-colour jokes with me, and I can reciprocate. But I would never do this with people who aren't fully aware of my actual opinions. This also applies to clearly misogynistic jokes.

My closest female friends would be fine with it; they've known me for years, I've supported them at their lowest, and they know I would never mean a horrible thing I say. They'll happily reciprocate with toxic-male jokes, or gay jokes. That said, even when I make them, they are clearly intended as jokes, and if my friends ever looked uncomfortable it would be my guilt to bear, because in the end, as the audience, they're meant to enjoy the joke, not be saddened or hurt by it.

Making them to strangers is a big no-no, and if strangers are in the room with you at the time (like at a party), you also have to "match the energy" of your friend. That means don't randomly do something misogynistic that your friends would understand to be a joke but strangers would not. I think this is the hardest part for most people, as they don't consider that strangers witnessing the joke are also accidental audiences.

[–] Marzepansion@programming.dev 9 points 2 years ago

> we should be shooting the millionaires who hire the poachers

Damn, I was looking forward to eating them. :(

But you're entirely right. Obviously the poachers do the hunting, but there are people rich enough out there to put a price on rhinos to begin with; they are the real problem. The rhinos wouldn't be hunted if there were no incentive.
