fushuan

joined 2 years ago
[–] [email protected] 4 points 3 hours ago

Maybe that's why bro brought girls.

[–] [email protected] 2 points 1 day ago (1 children)

I don't want to generalise but... Yeah... That's a big reason why a lot of manhuas just suck.

[–] [email protected] 1 points 1 day ago

Thanks for the recommendation! I liked Feng Shen Ji too to be honest. It's one of my favorites haha.

[–] [email protected] 7 points 1 day ago

It's just... So deserved, you know? Sometimes you can't help but laugh in the face of such karma and fucking irony.

[–] [email protected] 4 points 1 day ago

Not CUDA, but a lower-level Nvidia proprietary API. Your point still stands, though.

[–] [email protected] 11 points 1 day ago (1 children)

They replaced it with a lower level nvidia exclusive proprietary API though.

People are really misunderstanding what has happened.

[–] [email protected] 3 points 1 day ago (2 children)

That's the problem, I love wuxia, I love stupid power levels and all that, I love my trash comics. It's the story they tell alongside that that irks me to no end.

Close to zero decent dialogue between people besides shitty villains talking big until the hero flips his eyelash and suddenly everyone is bowing to them. There's no interesting characters, no banter.

To give some examples of wuxia done right (I like the top batch more but the other ones are alright):

Return of the Mount Hua Sect
Infinite Levelling: Murim
Return of the Crazy Demon
Nano Machine
Absolute Sword Sense
Star Instructor or something

I Reincarnated as the Crazed Heir
Path of the Shaman

Half wuxia: Murim login

All of these have an MC with personality, there's decent banter between characters, there's an interesting plot, and plot changes make sense...

There's also the "dungeon" theme with portals that has grown in popularity since Solo Leveling (around then), which is basically the same trash plot with another power-level vector.

[–] [email protected] 5 points 1 day ago (8 children)

It do be like that with manga/manhua tbh. I still prefer Korean webcomics because I like my trash but... It's been ages since I read a decent Chinese webcomic or manhua. They are so fucking trash. I'm sure there are some good ones but... Yeah.

[–] [email protected] 1 points 1 day ago

Not enough for it to make the results diverge. Randomness is added to avoid getting stuck in local maxima during optimization; you should still end up at the same global maximum. Models usually run until their optimization converges.

As stated, if the randomness is big enough that multiple reruns end up with different weights, i.e. optimized for different maxima, the randomization is trash. Anything worth its salt won't have randomization that big.

So, going back to my initial point, we need the training data to validate the weights. There are ways to check the performance of a model (quite literally, the same algorithm that is used to evaluate weights during training is then used to evaluate the trained weights after training), and the performance should be identical up to a very small rounding error if a rerun uses the same data and parameters.
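To make the point above concrete, here's a minimal sketch with a toy gradient-descent loop in plain Python (not a real PyTorch model — the data, function names, and hyperparameters are all made up for illustration). The "randomness" in init and sampling comes from a seeded generator, so two reruns with the same data and parameters produce bit-identical weights:

```python
import random

def train(data, seed, steps=200, lr=0.1):
    """Fit y = w * x by gradient descent on squared error.
    The seed fixes the weight init and the sampling order, so a rerun
    with the same data and parameters returns the exact same weight."""
    rng = random.Random(seed)
    w = rng.uniform(-1.0, 1.0)        # "random" init, but seeded
    for _ in range(steps):
        x, y = rng.choice(data)       # seeded sampling: same order every rerun
        grad = 2 * (w * x - y) * x    # d/dw of (w*x - y)^2
        w -= lr * grad
    return w

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # samples of y = 2x
w1 = train(data, seed=42)
w2 = train(data, seed=42)
assert w1 == w2  # same data, same parameters, same seed: identical weights
```

The same idea scales up: frameworks expose a global seed for exactly this reason, and if reruns with identical data, parameters, and seed converge to meaningfully different optima, the training setup itself is suspect.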

[–] [email protected] 10 points 1 day ago* (last edited 1 day ago)

Holy shit thanks I wasn't getting it.

[–] [email protected] 1 points 1 day ago (2 children)

Hey, I have trained several models in PyTorch, Darknet, and TensorFlow.

With the same dataset and the same training parameters, the same final iteration of training actually does return the same weights. There's no randomness unless they specifically add random layers, and that's not really a good idea with RNNs (it wasn't when I was working with them, at least). In any case, the weights should converge to a very similar point even if randomness is introduced, or else the RNN is pretty much worthless.

[–] [email protected] 2 points 2 days ago

Normal glasses here are 0.25 liters, more or less. That's around 4 glasses a liter.

37 glasses is 9.25 liters. Round it down to 9 because people don't fill their glasses to the brim, idk.

Your numbers scare me.

 

Thanks to /u/[email protected] for mentioning KDE window rules. In KDE, we can add rules for windows so that they behave in specific ways. One rule that can be added is the position: remember rule, and it's possible to make that rule apply to all windows by removing the match field. This way, closing and reopening windows keeps them where they were.

This is a very typical complaint about Wayland: something that apparently worked natively with X11, and it annoyed me to no end since I had to reposition all my windows every day when logging into my desktop. No more! I hope this helps :)
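For reference, KWin stores these rules in `~/.config/kwinrulesrc`, so an entry might look roughly like the sketch below. The key names and the enum value for "Remember" are from memory and can differ between KDE versions, so treat this as a hypothetical example; the reliable way is to create the rule through System Settings → Window Management → Window Rules and let KDE write the file:

```ini
[General]
count=1

[1]
Description=Remember position for all windows
# No wmclass/match keys, so the rule applies to every window.
# positionrule selects the policy; 4 should be "Remember",
# but the numeric value may vary by KDE version.
position=0,0
positionrule=4
```

With the "Remember" policy, KWin updates the stored position itself whenever you move a window, which is what makes positions survive closing and reopening.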

https://imgur.com/a/zrvbRPI

 

Title.
