

I’m failing to understand why we would need decimal units at all. What’s the point of them? And why do the original units have to change name to something as ridiculous as “Gibibyte” while the unnecessary decimal units get the binary units’ old name?
Reject MiB, call it “MB” like it originally was.
Of course it does. OP asked multiple questions; this one was supposed to answer why they used KDE instead of GNOME. I personally think Arch would have the advantage of having the newest drivers, Proton versions, etc. available.
If that’s Bitcoin, then at the current BTC-to-USD exchange rate this would actually give you 0.000000000002557444 USD. If you got that card once every millisecond, you would need 31 years to have $2.50.
I doubt you’d get rich with one card, though lol.
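A quick back-of-the-envelope check of that 31-year figure (the per-card value is the one quoted above; the $2.50 target and the 365-day year are simplifying assumptions):

```python
# Sanity check: one card every millisecond, each worth the
# tiny USD amount quoted in the comment above.
value_per_card = 0.000000000002557444  # USD per card (quoted figure)
cards_per_second = 1000                # one card per millisecond
target = 2.5                           # USD goal

seconds_needed = target / (value_per_card * cards_per_second)
years_needed = seconds_needed / (365 * 24 * 3600)
print(round(years_needed))  # roughly 31 years
```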
Well, thank you for taking the time to write this detailed explanation!
Windows and macOS use the abbreviation “MB” to refer to the binary units, correct? How come these big OSes use a different unit than the one these large international bodies recognize?
On a side note, I’ve always found it weird that HDDs and SSDs are/were sold as 128GB, 256GB, 512GB, etc. when those figures refer to decimal units.
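The gap behind that weirdness is easy to sketch: manufacturers count a GB as 10^9 bytes, while many OSes divide by 2^30 (a GiB) when reporting capacity. A minimal illustration using the drive sizes mentioned above:

```python
# Decimal gigabytes (as printed on the box) vs binary gibibytes
# (as some OSes report capacity).
for gb in (128, 256, 512):
    bytes_total = gb * 10**9      # manufacturer's decimal GB
    gib = bytes_total / 2**30     # binary GiB = 1024**3 bytes
    print(f"{gb} GB = {gib:.1f} GiB")
```

This is why a “512GB” SSD shows up as roughly 476 “GB” in an OS that actually counts in binary units.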