Yep. The reason Windows and macOS are way more accepted than Linux is because they’re essentially idiot proof. Linux is not and that’s not necessarily a good thing if you want the year of the Linux desktop to actually happen one day.
Ok, so arch doesn’t break because it’s unstable, it just breaks anyways. And it doesn’t break more in general, it just breaks worse more often. Got it.
I’ll still stay away from the bleeding edge.
That’s still exactly what I meant? Sure, arch may never break even though it’s unstable, but being unstable heightens the risk that it (or some program) breaks when changing library versions break dependencies.
Dependency issues happen much more rarely on stable systems. That’s why it’s called stable. And I very much prefer a system that isn’t likely to create dependency issues and thus break something when I update anything.
I’d rather have a system that is stable and a few months out of date than a system that is so up to date that it breaks. Because then I cannot, in good conscience, use that system on a device that I need to just work every time I start it.
Second this. Am not a huge fan of ubuntu itself and I have had issues with other debian based distros (OMV for example) but mint has always been rock solid and stable on any of my machines. The ultimate beginners distro imo.
Larger downstream distros like manjaro (and steamOS for that matter) can be stable. I wouldn’t call manjaro a beginners distro though, like mint would be (No Linus, there’s no apt in manjaro) but it’s very daily-driveable.
Although, if you’re most people, just stay away from rolling release distros. There’s so little benefit unless you’re running bleeding edge hardware…
If it’s your first time trying Linux, go with mint. It’s stable and almost every tutorial will work for you. If you already know your way around a terminal, the choice is all yours. I personally like Fedora.
That’s why I recommend mint. You have all the benefits of ubuntu but without the corporate stuff. And flatpak instead of snap.
Wasn’t that one of the main critiques of snap/ubuntu/canonical a few years ago already?
Among my personal dislike for its shade of purple, that has been my primary reason to not recommend ubuntu for a while, at least.
No, HDR can’t make your monitor brighter than it is. But it can take full advantage of the brightness and contrast of modern displays in a way SDR cannot. In almost every case HDR looks better than SDR but brighter and/or more contrasty displays take the most advantage.
In a more technical sense, SDR content is mastered with a peak brightness of 100 nits in mind. HDR is mastered for a peak brightness of 1000 nits, sometimes 2000, with correspondingly higher contrast.
If you don’t watch movies in HDR on a modern TV, you’re not taking full advantage of its capabilities.
That’s incorrect. While HDR content can usually be assumed to support at least 10-bit colour, that isn’t strictly required of either the monitor or the content. The main difference is contrast and brightness. SDR is mastered for a brightness of 100 nits and fairly low contrast. HDR is mastered for brightnesses of usually 1000 or even 2000 nits, since modern displays are brighter and capable of higher contrast, and can thus produce a more lifelike picture from the additional information within HDR.
Of course you need a sufficiently bright and/or contrasty monitor for it to make a difference. An OLED screen or displays with a lot of dimming zones would produce the best results there. But even a 350nit cheap TV can look a bit better in HDR.
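For the curious, HDR10 encodes brightness absolutely via the SMPTE ST 2084 “PQ” transfer function, which maps a normalized signal value to luminance in nits with a theoretical ceiling of 10,000 nits (real displays clip far below that). A rough sketch of the decode direction:

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10: maps a normalized
# signal value E' in [0, 1] to absolute luminance in nits (cd/m^2).
# Constants are taken from the ST 2084 specification.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    """Decode a normalized PQ code value (0.0-1.0) to luminance in nits."""
    e = signal ** (1 / M2)
    return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# Full signal hits the 10,000-nit ceiling; a zero signal is pure black.
print(round(pq_to_nits(1.0)))  # 10000
print(pq_to_nits(0.0))         # 0.0
```

The curve is deliberately non-linear: most of the code values are spent on the darker range where our eyes are most sensitive, which is part of why HDR can look better even on a modestly bright display.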
Well, my internet connection would have to be a lot faster, and they would all need devices that support UHD h.265 and HDR10 playback. But if you have gigabit upload and they all have shields or similar with just as fast connections, you’re good to go without transcoding (as long as no one wants to access it on mobile).
I regularly watch on my server when I’m not home and a few friends of mine also have access to it, so I need the content to be available in SDR and lower bit rates. When I stream from home, I’d like to have access to the full quality and HDR though. So I either need multiple versions of each film or hardware encoding/tonemapping, and a used gtx 1050ti was a lot cheaper than the storage required to keep 4 or 5 versions of every film.
But yes, if you’re only streaming within the same network, hardware transcoding isn’t necessary in the slightest. But then a SMB fileshare might also suffice…
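To make the “gigabit upload” point concrete, here’s a back-of-the-envelope check for whether N simultaneous direct-play streams fit in an upload pipe. The bitrates and the 80% headroom factor are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope: can `viewers` concurrent streams fit in the
# usable portion of an upload connection?

def streams_fit(upload_mbps: float, stream_mbps: float, viewers: int,
                headroom: float = 0.8) -> bool:
    """True if the combined stream bitrate fits in the usable upload."""
    return viewers * stream_mbps <= upload_mbps * headroom

UHD_HDR_REMUX = 60  # Mbps, rough ballpark for a UHD Blu-ray remux
HD_TRANSCODE = 8    # Mbps, a typical 1080p transcode target

print(streams_fit(1000, UHD_HDR_REMUX, 3))  # gigabit upload: True
print(streams_fit(50, UHD_HDR_REMUX, 1))    # typical home upload: False
print(streams_fit(50, HD_TRANSCODE, 3))     # transcoded streams: True
```

This is exactly why remote viewers usually get a transcoded stream while in-home playback can direct-play the full-quality file.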
As I need hardware transcoding, that makes emby immediately non-viable for me. I also usually watch via various apps and on TV, which, if you don’t have emby premiere, are also not free to use.
It’s free and open source. That alone is a big plus. And it works fairly well. What does emby do better, that warrants paying $120 for it?
Yea. I like my MacBook and I like macOS (yes, I know, shame on me). But in a few years, when Apple eventually stops supporting it, I can just put Linux on it and keep using it (or give it to a relative who just needs a working computer). It’s good hardware and in true Apple fashion, it will probably outlast its software. I also have an old Core 2 Duo unibody macbook lying around, and while it is possible to put the latest macOS on it (thanks, hackintosh community), Linux is a much better experience. The MacBook is also sturdier and has a better trackpad and keyboard than most new laptops, even many that are much more expensive.
I wouldn’t recommend Intel CPUs (at least the last two gens) either but if all that matters to you in a GPU is hardware encoding (quality or codec support), like for a Jellyfin server, Intel ARC is unbeatable.
The commenter I responded to originally seemed confused/surprised by it, though.
Here in Germany at least, if you read almost any printed novel, the typeface will include this type of g. It’s so common that I didn’t realise it would look strange to some people.
(Although I do recall seeing a post about a kid that was confused by that weird letter, somewhere a while ago. Probably was still back on r*****)
That’s fairly standard for serif fonts like Times New Roman, Baskerville, etc., although it is uncommon in modern sans serif fonts and/or fonts designed to be viewed on a screen.
Nobara: Has all the gaming features I want on my gaming pc (like gamescope) and is htpc capable. Also, it’s based on Fedora, which I’m familiar with.
Fedora: I like gnome and it’s always fairly up to date and rock solid. Great on my laptop.
Have considered switching to openSUSE though. It’s German (as am I), it was the first Linux distro I ever used (on my granddad’s PC, more than a decade ago) and I’ve heard a lot of good things about Tumbleweed.