• brbposting@sh.itjust.works

      I ’member when iOS apps used to say more than that in the changelogs.

      Shoutout to Voyager for bucking the trend!

      • ferret@sh.itjust.works

        Almost every iPhone out there at this point has automatic app updates turned on, so no one ever reads the changelog.

        • brbposting@sh.itjust.works

          Great point.

          Would never turn that on myself! I use an app or two that would break upon update - no fault of the devs, but glad to be in the minority who can keep on keeping on (for now).

      • Pantherina@feddit.de

        My Git history is a mess because I use GitHub’s online editor, as I still struggle with the git and SSH setup. Tbh it should be done by now, but I’m too lazy.

        Get on my level XD
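
        (For what it’s worth, a minimal sketch of the SSH setup being described here - the email and <user>/<repo> are placeholders, and GitHub’s own docs cover the details:)

        ```sh
        # Generate an SSH key pair (accept the default path, optionally set a passphrase)
        ssh-keygen -t ed25519 -C "you@example.com"

        # Print the public key, then add it under GitHub -> Settings -> SSH and GPG keys
        cat ~/.ssh/id_ed25519.pub

        # Test the connection; GitHub answers with a short greeting and closes
        ssh -T git@github.com

        # Point an existing clone at the SSH remote instead of HTTPS, then push as usual
        git remote set-url origin git@github.com:<user>/<repo>.git
        git push
        ```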

      • theneverfox@pawb.social

        I used to be, but once I started doing a commit for each feature I got in the habit - it’s great when you fuck something up and need to see how you did it before.
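
        (A few of the git commands that make that “how did I do it before” lookup easy - the hash and search string below are just placeholders:)

        ```sh
        # Skim the history, one line per feature commit
        git log --oneline

        # Show the full diff of one commit
        git show <commit-hash>

        # Find every commit that added or removed a given string ("pickaxe" search), with patches
        git log -S "someFunctionName" -p
        ```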

        On solo personal projects I’m much worse, because I’m not afraid to rip it apart and put it back together on a whim…I usually go in with a goal, but then I might decide “this design isn’t going to work much longer, let’s rewrite this”, and 8 hours later I’ve made a bunch of improvements. Maybe even the one I set out to do

        When that happens, I do like Minecraft - I give it a name.

        And since the people I work with never read commit messages, after I list the changes I remember off the top of my head I sometimes do some creative writing. Sometimes I lay out my next plans, sometimes I write about philosophy, sometimes I go on a rant about specific criticisms of the language or vent about how this was so much harder than it should have been. Occasionally I write a haiku.

        It’s so much easier to keep up with it when you just have fun with it…I just treat it like a reflection exercise

  • TimeSquirrel@kbin.social

    I wish I had known about all these problems with Nvidia’s shit on Linux and gone with a new AMD GPU instead of a 3070 Ti. I HAD been using AMD/Radeon since the late 00s; I don’t know WTF was wrong with me. Nvidia wasn’t this bad in the early 2000s - it was the only way to run hardware-accelerated Unreal Tournament on Linux at the time.

    • Canadian_Cabinet@lemmy.ca

      Two years ago I made the switch to AMD when I needed to replace my ageing 1060 (still on Windows back then), and I’m so glad I did, because I’ve avoided all of the headaches of getting Nvidia to work on Linux.

      • yeehaw@lemmy.ca

        Mine works fine. Never did check which drivers I’m using, though. I like the CUDA cores for DaVinci Resolve, and DLSS for games that struggle on my ultrawide monitor (looking at you, Cyberpunk).

        If it weren’t for those two things I’d be gone next card refresh. Truthfully, the main reason I went Nvidia in the first place was old habits. ATI/AMD traditionally had no driver support for Linux, or at least worse support than Nvidia, which actually had an official driver package. Things have changed, though.

        • Darorad@lemmy.world

          For replacing DLSS, FSR works well, and you can use it in games that don’t support it by running them through Gamescope.

              • Holzkohlen@feddit.de

                If a game comes with FSR out of the box, use that tho. In-game FSR doesn’t scale things like the UI, so you get the UI at native resolution while the game itself is upscaled. If you use Gamescope for scaling, it scales the whole image, and it’s limited to FSR1, while games can ship with FSR2 or even 3 (I think 3 is just added frame generation on top, which is only useful for playing at >60 fps).
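
                (Roughly what that looks like as a Steam launch option - the resolutions are just example values, and the exact flag differs between gamescope versions, so check `gamescope --help`:)

                ```sh
                # Render the game at 2560x1080 and let gamescope upscale it to a
                # 5120x2160 ultrawide with FSR1 (older builds use -U for FSR,
                # newer ones use -F fsr)
                gamescope -w 2560 -h 1080 -W 5120 -H 2160 -F fsr -- %command%
                ```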

    • Miaou@jlai.lu

      I’ve never had issues with Nvidia on Linux, and I expect a high proportion of their customers to run Linux systems. What’s so bad about it?

      • frezik@midwest.social

        End user experience is mostly fine. The issues are in how they interact with kernel developers. Or, like, anyone else who doesn’t work inside the company. They sniff their own bullshit and expect you to agree that it’s a rose.

      • AnUnusualRelic@lemmy.world

        Apparently, in some cases, it will seduce your wife, steal all your money and flee to a remote Pacific island.

        Usually, there are no issues, it just works as expected.

    • Holzkohlen@feddit.de

      You can just get an AMD APU and run your PC in hybrid mode. I did that on Garuda Linux recently and it was great. It allowed me to finally switch to using Wayland.
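
      (If it helps anyone, one way to check which GPU a hybrid setup is actually rendering on - this assumes mesa-utils is installed for glxinfo, and uses Nvidia’s documented PRIME render offload variables for the dGPU case:)

      ```sh
      # The desktop session should report the AMD iGPU here
      glxinfo -B | grep "OpenGL renderer"

      # Offload a single program to the discrete GPU instead (Mesa/AMD dGPU)
      DRI_PRIME=1 glxinfo -B | grep "OpenGL renderer"

      # Same idea with the proprietary Nvidia driver
      __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo -B | grep "OpenGL renderer"
      ```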

    • frezik@midwest.social

      They already have Jensen doing his own sound effects at conference presentations. Do we expect him to sell his leather jacket to keep the company afloat, too?

  • Norgur@kbin.social

    Well, caching more changelogs requires RAM, right? You know Nvidia’s stance on more RAM.

  • GreatAlbatross@feddit.uk

    At least you can roll back the drivers on a computer.
    It’s even more infuriating when a TV manufacturer rolls out an update with “bug fixes and improvements”, and you know full well that if they broke ARC again, there is no going back to the old version.

  • Tarquinn2049@lemmy.world

    When the marketing department is more important to a company than the customer support. Rather than actually helping the customers, they just make sure customer support never says anything bad about their products, including the problems they have/had in the patch notes.

    “These are too many fixes, listing them all will make us look bad.”

    • UnderpantsWeevil@lemmy.world

      > When the marketing department is more important to a company than the customer support.

      The marketing department is easier to integrate with AI. Those stupid customer support folks have to actually think about the problem and determine a working solution, rather than regurgitating a random assembly of buzzwords and spicy graphics.

  • Sprokes@jlai.lu

    I bought an old computer to install Plex. At one point I wanted to try some speech-to-text tool and decided to install the Nvidia drivers to speed up the process. I messed up my system and tried for hours to fix it, but I gave up. Now I don’t have a GUI.
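
    (A rough recovery sketch for the no-GUI state, assuming a Debian/Ubuntu-style system - package names and tools differ on other distros, and if the driver came from Nvidia’s .run installer, `sudo nvidia-uninstall` is the way back instead:)

    ```sh
    # From a text console (Ctrl+Alt+F3), remove the proprietary driver packages
    sudo apt-get purge '^nvidia-.*'
    sudo apt-get autoremove

    # Rebuild the initramfs so the open-source nouveau driver loads again
    sudo update-initramfs -u

    # Make sure the machine boots back into the graphical session
    sudo systemctl set-default graphical.target
    sudo reboot
    ```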