@x0rz Mistakes were made xD
@quad Is it apathy?
@orionwl I'm not specifically familiar with C and its intricacies, but shouldn't the language define the concept of 'char' regardless of CPU architecture?
I know some assembly basics, but only for simplified architectures, so I don't know of any opcodes that specifically recognize a char type. Is that a thing on current-day architectures? Or is there something between the C code and the assembly it compiles to (i.e. the compiler) that depends on architecture-specific behaviour?
@orionwl chars can be signed (on intel)? What does the sign bit do?
Thanks @nic for the dice!
This is wrong; it means you're doing mnemonics wrong: https://twitter.com/JuricaBulovic/status/1047690723649343488
It's in response to this thread:
Been trying to explain to people for years: with a small initial commitment, it's possible to remember mnemonic phrases over periods of *years* where you only have to call it back to mind from time to time (like, gaps of months, eventually...). And you have a physical backup of course, but the very fact that you don't have to access that, ever, is powerful.
@quad I forgot you're in IT support. I had a different demographic in mind entirely, i.e. programmers.
My recent biggest "omg linux why can't I get handy around you" moment: I didn't yet know that when an active SELinux is blocking things, it's really obscured in the logs and errors if you don't already know to look for it.
I was trying to set up RDP to connect from a Windows machine. It took me a week before I figured out SELinux could be the cause, and only thanks to a colleague pointing it out.
@quad This might be more true than I was initially willing/able to perceive.
@sir Hah, more strawman arguments. I don't think we're going to agree on this one though. Thanks for the fun talk!
@quad You are right that the reverse is true when that happens. I'm not claiming it doesn't happen.
My comment wasn't really about the why, more about the "which demographic is actually statistically bothered about this"
I can't comment about the code (no pun intended)
The waste of electricity is a strawman argument
GPUs have only been used for shitcoins for years now
But i'm glad we agree it doesn't have a spec 😂 🍻
@sir What is it about Bitcoin that has you calling it that?
@sir whitepaper != specification
@sir lol, point me to this mythical specification you claim to exist.
@quad It's mostly only too hard for people who want to care about using it properly, yet don't know how to go about it, having been spoonfed by Windows.
@verretor Unless they're abandoned AWS nodes nobody cares about 🤔