• 0 Posts
  • 15 Comments
Joined 1 year ago
Cake day: December 23rd, 2023

  • I see it as the continuation of a very old problem. Old school engineering didn’t have any standards until a bunch of people died over and over and the public demanded change. The railroads, construction tycoons, factory owners, mine operators, etc. all bitterly fought, and still fight, engineering safety requirements. The computer industry has continued that tradition: it opposes public action, hides negative information, and tries to pin blame for conspicuous failures on individuals rather than systemic rot.

    I also think we lack a culture of safety because software catastrophes are relatively less visceral. That’s not to say software errors can’t cause horrific accidents, but the power grid going down and killing a dozen people in the service area is less traumatic than a bridge collapsing and sending a dozen people into an icy river. That’s an extreme example, but my point is that humans undervalue harms that seem less acutely, physically brutal, and software just feels more abstract.

    Most of us aren’t working on the power grid either, so when you start trying to quantify our software’s risks you have to speak to “harms” rather than just crimes like negligence, and then you expose this huge contradiction about how responsibility is allocated socially. Like, not only should engineers, pilots, and doctors have a higher responsibility to prevent harm, but so should cops, journalists, politicians, billionaires, etc.

    So the risks are undervalued and both intentionally and unconsciously minimized. The result is most of us who’ve seen the inside are quietly horrified and that’s the end of it.

    I don’t know what the answer is except unignorable tragedies, because that seems to be the only thing powerful enough to build regulations, and even those are constantly being eroded.



  • It’s the same as learning anything, really. A big part of learning to draw is making thousands of bad drawings. A big part of learning DIY skills is not being afraid to cut a hole in the wall. Plan to screw up. Take your time, be patient with yourself, and read ahead so none of the potential screw-ups hurt you. Don’t be afraid to look foolish, reality is absurd, it’s fine.

    We give children latitude to fail because they have everything to learn. Then, as adults, we don’t give ourselves permission to fail. But why should we be any better than children at new things? Many adults have forgotten how fraught the process of learning a new skill is, and when they fail they get scared and frustrated and quit. That’s just how learning feels. Kids cry a lot. Puttering around on a spare computer is an extremely safe way to become reacquainted with that feeling, and that will serve you well even if you decide you don’t like Linux and never touch it again. Worst case, you fucked up an old laptop that was collecting dust. That is way better than cutting a hole in the wall and hitting a pipe.


  • Kdenlive is improving, but Resolve is still more powerful and mature. That said, Blackmagic’s business model for Resolve seems precarious. It feels like they could, at any moment, enshittify Resolve and force users into a subscription just to maintain access to old edits. For that reason I think Kdenlive is better for almost all users. If you are a professional filmmaker, then the color and VFX workflows of Resolve are probably worth paying for, but in that case it’s probably a Final Cut vs. Resolve question anyway.





  • That’s one kind, and Rust’s “ownership” concept does mean there are built-in compile-time checks to prevent dangling pointers or unreachable memory. But there’s also just never de-allocating stuff you allocated even though it’s still reachable. Like, you could write a loop that allocates memory and never stops; that’s a memory leak, or more generally a “resource leak”, if you prefer.

    Rust is really good at keeping you from holding a reference to something you think is valid when it actually got mutated or dropped way down in some class hierarchy and is now dead, which is how you end up with a dangling pointer or a double free, or whatever. But it can’t stop the case where your code is technically valid and the resource leak is caused by bad “logic” in your design, if that makes sense.
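    To make that concrete, here’s a minimal sketch of the second kind of leak (the cache name and sizes are made up for illustration): it’s entirely safe Rust, the borrow checker is happy, and it still grows without bound because nothing ever frees the entries.

    ```rust
    use std::collections::HashMap;

    fn main() {
        // A per-request "cache" that nothing ever prunes. Every buffer stays
        // reachable through the map, so ownership rules are satisfied...
        let mut cache: HashMap<u64, Vec<u8>> = HashMap::new();

        for request_id in 0u64.. {
            // ...but we keep allocating and never call remove() or clear(),
            // so memory grows until the OS steps in: a leak in valid, safe Rust.
            cache.insert(request_id, vec![0u8; 1024 * 1024]);
        }
    }
    ```

    (Safe APIs like Box::leak and std::mem::forget also leak on purpose, and Rc reference cycles can leak by accident; the point is the compiler has no opinion about any of it.)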


  • MoonMelon@lemmy.ml to Linux@lemmy.ml · Tcl/Tk 9.0 released · 8 months ago

    Back in the day Tcl was used in a few places in Pixar’s RenderMan renderer (called PRMan), and in its connection to Maya. You could write little Tcl scripts within the RenderMan Artist Tools (RAT) that would be evaluated during scene export. I think this still exists in some form inside Tractor, which is their render farm management software.

    It’s been a long time since I used PRMan, but Python has generally replaced everything as the “glue” language, which honestly makes things a lot easier. VFX and game dev used to have a hundred different scripting languages rolling around.



  • In the early 2000s I worked on an animated film. The studio was in the southern part of Orange County, CA, and the final color grading / print (still not totally digital then) was done in LA. It was faster to courier a box of hard drives than to transfer electronically. We had to do it a bunch of times because of various notes/changes/fuck-ups. Then the results got couriered back because the director couldn’t be bothered to travel for the fucking million dollars he was making.


  • Repair forum version:

    • Here are the exact bolts you need to loosen: <dead photobucket link>
    • After that make sure you note this gasket: <another dead photobucket link>
    • The replacement part is very hard to find but they carry it here: <404>
    • You’ll find the torque spec here: <domain sold to online casino advertiser>

    Bonus points if the only schematic you can find is a 256 resolution jpg on Pinterest that leads to a WordPress site where a bot only posts random schematics to farm Pinterest engagement.


  • Man, fuck editing the registry. The duplicate entries, the non-standard locations, the UI of regedit… I had to dig through it so much when I was supporting a corporate launcher application in a Windows facility. Did the Windows dev decide to write their data into multiple registry entries, an INI file, an environment variable… or maybe all of the above? Find out on the next episode of Fuck My Life!


  • Bought a Lemur Pro 9 a few years ago and have used it as a daily driver since. Pop!_OS works great for the most part but, as other people have mentioned, the Pop Shop is slow/buggy and I often just resort to apt instead. My spouse plays a lot of PC games, so when she got sick of Windows I migrated her over, and she’s had very few problems. Every once in a while a game won’t run, but usually that gets figured out in a few weeks by the Proton community.

    A few content-creation Linux apps only officially support Red Hat, so getting them to run is a bit of a pain, but that would be the case with any Debian-based distro. So overall I haven’t seen the need to distro hop to Mint or something similar.