• Norgur@kbin.social · 8 months ago

    Thing is: there is always the “next better thing” around the corner. That’s what progress is about. The only thing you can do is choose the best available option for you when you need new hardware and be done with it until you need another upgrade.

      • wrath_of_grunge@kbin.social · 8 months ago

        Really, my rule of thumb has always been to upgrade when it’s a significant upgrade.

        For a long time I didn’t upgrade until the new part was about a 4x improvement over my old one, with occasional exceptions. Nowadays I’m a bit more opportunistic in my upgrades, but I still seek out ‘meaningful’ ones: a decent jump over the old, typically a 50% improvement in performance, or upgrades I can get really cheap.

      • jmcs@discuss.tchncs.de · edited · 8 months ago

        It depends on what you need. I think you usually get the best bang for your buck by buying the now-previous generation right when the new one is released.

    • AeroLemming@lemm.ee · edited · 8 months ago

      You have a magical button. If you press it now, you will get $100 and it will disappear. Every year you don’t press it, the amount of money you will get if you do press it goes up by 20%. When should you press the button? At any given point in time, waiting just one more year adds an entire 20% to your eventual prize, so it never makes sense to press it, but you have to eventually or you get nothing.

      Same thing with graphics cards.
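
      A quick way to feel that tension is to run the numbers. The snippet below is just a minimal sketch of the thought experiment above (Python, with the $100 start and 20% compound growth from the comment): every single year of waiting looks worth it on paper, yet the money only materializes once you press.

      ```python
      # Button thought experiment: $100 now, +20% compound per year of waiting.
      def payout(years_waited: int, start: float = 100.0, growth: float = 0.20) -> float:
          """Prize for pressing the button after waiting `years_waited` years."""
          return start * (1 + growth) ** years_waited

      for year in range(11):
          now, later = payout(year), payout(year + 1)
          print(f"year {year:2d}: press now ${now:10,.2f} | wait one more year ${later:10,.2f}")
      # Every row says "waiting beats pressing", but the prize is only real once you finally press;
      # the same trap as forever holding out for the next GPU generation.
      ```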

      • Bizarroland@kbin.social · 8 months ago

        Is it compound or straight percentage?

        Cuz if it’s just a straight percentage then it’s $20 a year, whereas if it’s compound then it’s a 2x multiplier roughly every four years.

        • AeroLemming@lemm.ee · 8 months ago

          Compound, which more closely models the actual rate at which computing power has grown over the years.

            • AeroLemming@lemm.ee · 8 months ago

              Or you could wait 70 years and leave $34 million to people in your will… The point is that there is no mathematically correct choice.

              • Bizarroland@kbin.social · 8 months ago

                I think I got about 77 years left in me, unless somebody comes along and kills me that is.

                That would be at least $125 million, which isn’t too shabby. I find it hard to believe that anybody would say that $125 million 77 years from now is not a considerable amount of money.
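
                For what it’s worth, those figures are consistent with 20% compound growth; a minimal check (Python, using the $100 start from the earlier comment):

                ```python
                # Compound growth of the $100 button prize at 20% per year.
                for years in (70, 77):
                    print(f"{years} years: ${100 * 1.2 ** years:,.0f}")
                # ~$34.9 million after 70 years, ~$125 million after 77.
                ```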

    • Nik282000@lemmy.ca · 8 months ago

      I bought a 1080 for my last PC build, downloaded the driver installer, and ran the setup. There were ads in the installer for the RTX 20 series, which had launched the day before. FML

      • Norgur@kbin.social · 8 months ago

        Yep. I bought a 4080 just a few weeks ago. Now there are ads for the refresh all over… Thing is: your card didn’t get any worse. You thought it was a good value proposition when you bought it, and it hasn’t lost any of that.

    • alessandro@lemmy.ca (OP) · 8 months ago

      “choose the best available option”

      That’s the point: which one is the best available option?

      The simplest answer would be “price per fps”.
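
      As a rough illustration of that metric (the card names and numbers below are made-up placeholders, not benchmark data), “price per fps” is just price divided by average frame rate, and the cheapest frames win:

      ```python
      # Hypothetical price-per-fps comparison; all figures are invented for illustration.
      cards = {
          "Card A": {"price_usd": 500, "avg_fps": 100},
          "Card B": {"price_usd": 800, "avg_fps": 140},
      }
      for name, c in sorted(cards.items(), key=lambda kv: kv[1]["price_usd"] / kv[1]["avg_fps"]):
          print(f"{name}: ${c['price_usd'] / c['avg_fps']:.2f} per fps")
      ```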

      • Norgur@kbin.social · 8 months ago

        Not always. I do a lot of rendering and such, so FPS isn’t my primary concern.

    • kureta@lemmy.ml · 8 months ago

      The only thing keeping me is CUDA, and there’s no replacement for it. I know AMD has I-forgot-what-it’s-called, but it is not a realistic option for many machine learning tasks.
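
      (For context, AMD’s counterpart is ROCm/HIP. Frameworks that ship ROCm builds, such as PyTorch, generally reuse the CUDA-style API, so device selection can look identical; whether a given workload actually runs well on it is another matter. A minimal sketch, assuming a PyTorch install with either backend:)

      ```python
      # Generic PyTorch device selection. ROCm builds of PyTorch expose the GPU
      # through the same torch.cuda API, so this code covers both NVIDIA and AMD.
      import torch

      device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
      backend = "ROCm/HIP" if getattr(torch.version, "hip", None) else "CUDA"
      print(f"device: {device}" + (f" ({backend})" if device.type == "cuda" else ""))

      x = torch.randn(1024, 1024, device=device)
      y = x @ x  # runs on whichever accelerator was found, or the CPU as a fallback
      ```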

    • Cagi@lemmy.ca · edited · 8 months ago

      The only thing giving me pause about ATI cards is that their ray tracing is allegedly visibly worse. They say next gen will be much better, but we shall see. I love my current non-ray-tracing card, an RX 590, but she’s getting a bit long in the tooth for some games.

        • Cagi@lemmy.ca · 8 months ago

          Not since, oh before most of Lemmy was born. I’m old enough to remember when Nvidia were the anti-monopoly good guys fighting the evil Voodoo stranglehold on the industry. You either die a hero or you live long enough to see yourself become the villain.

          • PenguinTD@lemmy.ca · 8 months ago

            Yeah, that’s pretty much why I stopped buying Nvidia after the GTX 1080. CUDA was bad in terms of business practice, but not that impactful, since OpenCL and the like could still be tuned to similar performance; it’s just that software developers and researchers love free support/R&D/money to advance their goals. They’re willing to be the minions, and I can’t ask them not to take the free money.

            But RTX and then the tensor cores are where I draw the line, since the patents and implementations do real harm in the computer graphics and AI research space. I guess it’s a bit too late, though: we’re already seeing the results, and Nvidia is making bank off that advantage. They’re essentially running the Intel playbook, just slightly differently: instead of buying the OEM vendors, they “invest” in software developers and researchers to use their closed tech. Now everyone pays the premium when buying RTX/AI chips from Nvidia, and the capital boom from AI will make the gap hard for AMD to close. After all, R&D requires lots of money.

    • BCsven@lemmy.ca · 8 months ago

      AMD is a better decision, but my Nvidia works great with Linux. I’m on openSUSE, and Nvidia hosts its own openSUSE drivers, so it works from the get-go once you add the Nvidia repo.

      • gnuplusmatt@reddthat.com · 8 months ago

        I had an Nvidia 660 GT back in 2013, and it was a pain in the arse on a leading-edge distro; it used to break Xorg for a couple of months every time there was an Xorg release (which, admittedly, are really rare these days since it’s in sunset mode). Buying an AMD card was the best hardware decision: no hassles, and I’ve been on Wayland since Fedora 35.

          • gnuplusmatt@reddthat.com · edited · 8 months ago

            Yeah, no. I don’t want to be fucking with my machine just because I want to run a modern display server. I want my driver to be part of my system. Until NV can get out of their own way and match the AMD experience (or even Intel), I’m not interested.

        • lowmane@lemmy.world · edited · 8 months ago

          It’s not like that at all. You have a dated notion of what the experience with an Nvidia GPU has been like over the past few years or more.

          • gnuplusmatt@reddthat.com · 8 months ago

            “dated notion of the experience”

            Do I still have to load a module that taints my kernel and can break due to ABI incompatibility? Does Wayland work in a manner equivalent to the in-kernel drivers that properly support GBM?
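
            (For reference: the kernel exposes its taint state in /proc/sys/kernel/tainted, and bit 0 of that bitmask is the “proprietary module loaded” flag that the nvidia module sets. A small sketch, assuming a Linux system:)

            ```python
            # Check whether the running kernel is tainted by a proprietary module (e.g. nvidia.ko).
            # /proc/sys/kernel/tainted holds a bitmask; bit 0 = proprietary module loaded.
            from pathlib import Path

            taint = int(Path("/proc/sys/kernel/tainted").read_text().strip())
            if taint & 1:
                print(f"kernel is tainted (mask {taint}): a proprietary module is loaded")
            else:
                print(f"kernel taint mask is {taint}: no proprietary-module flag set")
            ```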

  • dellish@lemmy.world · 8 months ago

    Perhaps this is a good place to ask now that the topic has been raised. I have an ASUS TUF A15 laptop with an Nvidia GTX 1650 Ti graphics card, and I am SO sick of 500 MB driver “updates” that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?

    • vivadanang@lemm.ee · 8 months ago

      have an ASUS TUF A15 laptop with an Nvidia GTX 1650 Ti graphics card and I am SO sick of 500 MB driver “updates” that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?

      In a laptop? Practically none. There are some very rare ‘laptops’ out there - really chonk-tops - that have full-size desktop GPUs inside them. The vast majority, on the other hand, have ‘mobile’ versions of these GPUs that are basically permanently attached to the laptop’s motherboard (if not part of the mobo itself).

      One example of a laptop with a full-size GPU (legacy; these aren’t sold anymore): https://www.titancomputers.com/Titan-M151-GPU-Computing-Laptop-workstation-p/m151.htm - note the THICK chassis; that’s what you need to hold a desktop GPU.

    • chemsed@lemmy.ca · 8 months ago

      In my experience, AMD is not more reliable with updates. I had to do a clean install three times to get my RX 6600 working properly, and months later I have a freezing issue that may be caused by my GPU.

    • gazab@lemmy.world · 8 months ago

      You could use a separate external GPU if you have Thunderbolt ports. It’s not cheap and you sacrifice some performance, but it’s worth it for the flexibility in my opinion. Check out https://egpu.io/

  • joneskind@beehaw.org · 8 months ago

    It really is a risky bet to make.

    I doubt a full-price RTX 4080 SUPER upgrade will be worth it over a discounted regular RTX 4080.

    SUPER refreshes have never crossed the +10% mark.

    I’d rather wait for the Ti version.

    • wrath_of_grunge@kbin.social · 8 months ago

      Really, the RTX 4080 is going to be a sweet spot in terms of performance envelope. That’s a card you’ll see have some decent longevity, even if it’s not recognized as such currently.

      • joneskind@beehaw.org · 8 months ago

        It will depend on the performance jump offered by the 50XX series and on game development studios’ appetite for more power.

        But TBH I don’t see Nvidia being able to mass-produce a chip twice as fast without increasing its price again.

        Meaning hardly anybody will get the most powerful next-gen chip, game devs will have to take that into account, and the RTX 4080 will stay relevant for a longer time.

        Besides, according to SteamDB, most gamers still have an RTX 2080 or a less powerful GPU. Studios won’t sell their games if they can’t be played decently on those cards.

        The power gap between high-end GPUs is growing exponentially. That won’t stay sustainable for very long.

    • UnspecificGravity@discuss.tchncs.de · 8 months ago

      For the vast majority of customers who aren’t looking to spend close to a grand on a card that is infinitesimally better than one at half the price, AMD has plenty to offer.

    • Fridgeratr@lemmy.world · 8 months ago

      AMD is absolutely cutting it!! They may not have DLSS or ray trace as well, but their cards still kick ass.