Is there any GPU that stands out to you guys as a good value, or do you believe that everybody should skip them?

I’m liking the 5070 Ti: 16GB on a 256-bit bus with 896 GB/s of memory bandwidth for $750 USD. The 5080 at $1,000 USD also has 16GB on a 256-bit bus, at 960 GB/s. I don’t see the value in the extra $250.
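Those headline bandwidth figures follow directly from bus width times per-pin data rate. A minimal sketch, assuming the commonly quoted GDDR7 speeds of 28 Gbps for the 5070 Ti and 30 Gbps for the 5080 (my assumption, not from the spec sheets above):

```python
# Bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
# Assumed data rates: 28 Gbps (5070 Ti) and 30 Gbps (5080); both cards use a 256-bit bus.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return (bus_width_bits / 8) * data_rate_gbps

print(bandwidth_gb_s(256, 28))  # 896.0 GB/s -> 5070 Ti
print(bandwidth_gb_s(256, 30))  # 960.0 GB/s -> 5080
```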

They both have 2x 9th Gen NVENC encoders. The 5080 has 2x 6th Gen decoders, while the 5070 Ti has 1x 6th Gen decoder. I can use that for OBS recording while watching other videos.

  • terraborra@lemmy.nz
    link
    fedilink
    English
    arrow-up
    21
    ·
    2 months ago

    Assuming you’re primarily interested in gaming performance, wait for reliable 3rd-party non-DLSS benchmarks.

    From the Nvidia presentation, the 5070 Ti looks great, but the performance uplift over the previous gen in their slides pretty much only applies to games with frame generation. Not every game will implement DLSS, let alone DLSS 4. You may still need the better rasterisation of the 5080 depending on your monitor resolution and desired fps.

    • FeelzGoodMan420@eviltoast.org
      link
      fedilink
      English
      arrow-up
      7
      ·
      2 months ago

      Non-DLSS performance isn’t looking much better. Like it or not, games are going to rely more and more on this type of upscaling tech (and frame generation). It kinda sucks because it gives developers an excuse not to spend the time and money to properly optimize games. On the other hand, these technologies are quite interesting, and the new DLSS model looks good (I’ll be waiting for a proper review to form an opinion).

      All that being said, I’ll upgrade next generation. I need to see a lot more good new games come out to justify upgrading my PC parts. I don’t think dropping $1,000+ just to replay Cyberpunk for the 3rd time at a higher frame rate is worth the expense.

      • Mettled@reddthat.comOP
        link
        fedilink
        English
        arrow-up
        7
        arrow-down
        4
        ·
        2 months ago

        I am of the opinion that DLSS and FSR are an admission of failure by GPU engineers that they are not capable, so far, of designing a GPU that does 4K 160 fps with psycho raytracing on, zero upscaling, zero frame generation.

        I do believe that they are working on it, but Nvidia/AMD demand gimmicks in the meantime to continue selling GPUs.

        I suspect that the 5090 will be the first card to do 1440p with psycho raytracing at 144 fps without DLSS enabled.

        There’s something about Reflex 2 that is bothering me or concerning me, but I have no clue what it is.

        • Poopfeast420@discuss.tchncs.de
          link
          fedilink
          English
          arrow-up
          6
          ·
          2 months ago

          I am of the opinion that DLSS and FSR are an admission of failure by GPU engineers that they are not capable, so far, of designing a GPU that does 4K 160 fps with psycho raytracing on, zero upscaling, zero frame generation.

          How is it an admission of failure? They probably can design a GPU for that, but do you want to pay hundreds of thousands because the chip uses a full silicon wafer?

          Do you think NVIDIA or AMD should have sat on that technology for decades, until it’s good enough for 4K 144 fps? Then you would probably say it’s not good enough because it can’t do 8K 144 fps. Also, why is 4K your arbitrary limit? Most people are still on 1080p, so why not just say it’s good enough when the hardware can do 1080p 60 fps?

          I suspect that the 5090 will be the first card to do 1440p with psycho raytracing at 144 fps without DLSS enabled.

          Definitely not, since it can’t even do 30 fps at 4K with all the bells and whistles and no DLSS. 1440p is probably not even gonna be 60 fps.

          There’s something about Reflex 2 that is bothering me or concerning me, but I have no clue what it is.

          What? I’m pretty sure the technology they’re using, Frame Warping, has been around for years, and it’s used in VR, so you can just look that up and see what it does.

        • vrighter@discuss.tchncs.de
          link
          fedilink
          English
          arrow-up
          2
          ·
          edit-2
          2 months ago

          Anyone could have told them that. Real-time path tracing is a pipe dream, even now. The actual raytracing output is a very noisy, incomplete image, and halving the noise requires 4x the compute. We won’t get real-time raytracing anytime this decade for sure, if ever.
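          That 4x figure is the usual Monte Carlo scaling: the error of a path-traced estimate falls roughly as 1/sqrt(N) with the sample count, so halving the noise takes about four times as many rays. A minimal sketch of that behaviour on a toy integral (illustrative only, not a renderer):

          ```python
          # Toy Monte Carlo estimate of the integral of x^2 over [0, 1] (true value 1/3).
          # RMS error scales ~ 1/sqrt(N): quadrupling the samples roughly halves the noise.
          import math
          import random

          def mc_estimate(n: int) -> float:
              return sum(random.random() ** 2 for _ in range(n)) / n

          def rms_error(n: int, trials: int = 200) -> float:
              true_value = 1.0 / 3.0
              return math.sqrt(sum((mc_estimate(n) - true_value) ** 2 for _ in range(trials)) / trials)

          for n in (1_000, 4_000, 16_000):
              print(n, round(rms_error(n), 5))  # error roughly halves each time n quadruples
          ```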

    • Mettled@reddthat.comOP
      link
      fedilink
      English
      arrow-up
      3
      arrow-down
      1
      ·
      2 months ago

      I don’t use DLSS. I have never tried a game with DLSS enabled. I like to max out path tracing/raytracing but keep DLSS disabled.

      I would guess that the 5070 Ti is at the very least 15% better than the 4070 Ti. Maybe 12% better in some games, 20% in others.

      The new NVENC in the 50 series is also a very strong point of interest for me due to frequently using OBS.

  • Ashtear@lemm.ee
    link
    fedilink
    English
    arrow-up
    11
    ·
    2 months ago

    Way too early to speculate. Until the cards are independently benchmarked, there’s no way to assess value.

    • Mettled@reddthat.comOP
      link
      fedilink
      English
      arrow-up
      4
      arrow-down
      1
      ·
      2 months ago

      I wish everybody paid closer attention to input latency with frame generation.

    • Murvel@lemm.ee
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      1
      ·
      2 months ago

      Not at all, since they’re developing Reflex to be AI-boosted.

            • Murvel@lemm.ee
              link
              fedilink
              English
              arrow-up
              2
              ·
              2 months ago

              So rasterised vector graphics are ‘real’ to you and DLSS is ‘fake’. How very technical of you…

              • vrighter@discuss.tchncs.de
                link
                fedilink
                English
                arrow-up
                6
                ·
                edit-2
                2 months ago

                One calculates what the pixel’s color should be. That is the actual color that pixel should be, the real one.

                The other one doesn’t try to calculate it; it makes an educated guess. That, by definition, is “not rendering the pixel”.

                It has nothing to do with how realistic it looks. And yes, having written both software raytracers and rasterizers, I am a bit more technical about this stuff than just calling Vulkan and having it do its magic. I actually dived in all the way.

                • Murvel@lemm.ee
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  arrow-down
                  5
                  ·
                  2 months ago

                  Both are data computations of a fictional scene… what difference does it make?!

  • Eiri@lemmy.ca
    link
    fedilink
    English
    arrow-up
    3
    ·
    2 months ago

    Eh, their benchmarks were so garbled with DLSS-corrupted data that I can’t really say.

    One thing I know for sure is that the 5070 is quite a bit of money for a 12 GB card.

    • Mettled@reddthat.comOP
      link
      fedilink
      English
      arrow-up
      1
      arrow-down
      1
      ·
      2 months ago

      5070, yeah. If it had 16 gigs, that’s what I would buy, but I’m thinking of trying to get the money together for a 5070 Ti. The 5080 is not a good value.

  • Honestly, the presentation was very underwhelming. Improvements in raster seem fairly small and don’t warrant an upgrade. DLSS still falls short in visual quality and has annoying artifacts, and I worry that the industry will use it as an excuse to release poorly optimised games. Counting DLSS frames as part of the frame count is just misleading.

    NVENC is cool, but I don’t use that often enough for it to be a selling point to me.

    I’ve been enjoying the memes about the presentation though, because what the fuck was that mess.

    • tehbilly@lemmy.dbzer0.com
      link
      fedilink
      English
      arrow-up
      2
      ·
      2 months ago

      I absolutely hate DLSS for the artifacts, I’d honestly rather lower resolution and quality to achieve a good framerate at this point.

  • sp3ctr4l@lemmy.zip
    link
    fedilink
    English
    arrow-up
    3
    arrow-down
    1
    ·
    edit-2
    2 months ago

    I hate it I hate it I hate it.

    This AI hallucinated frame crap is bullshit.

    Their own demos show things like the game running at 30ish fps, but we are hallucinating that up to 240!

    Ok…great.

    I will give you that that is wonderful for games that do not really depend on split-second timing / hit detection, and/or just have a pause function as part of normal gameplay.

    Strategy games, 4x, city/colony builders, old school turn based RPGs… slow paced third person or first person games…

    Sure, it’s a genuine benefit in these kinds of games.

    But anything that does involve split second timing?

    Shooters? ARPGs? Fighting games?

    Are these just… all going to be designed around the idea that actually your input just has a delay?

    That you’ll now be unable to figure out whether you missed a shot, or got shot by a guy behind a wall, because of network lag or because your own client’s rendering just lied to you?

    I am all onboard with intelligent upscaling of frames.

    If you can render natively at 5 or 10 or 15% of the resolution of the actual frame you see, then upscale those frames and end up with an actually higher true FPS?

    Awesome.

    But not predictive frame gen.

    • Mettled@reddthat.comOP
      link
      fedilink
      English
      arrow-up
      2
      arrow-down
      1
      ·
      2 months ago

      It’s comforting to find other people who have a strong hate on for frame generation. I personally have no interest in upscaling, but frame generation is a con job. It does nothing for latency, so players see more frames but get the same input lag. That sounds discombobulating, or disjointed.
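      Back-of-the-envelope illustration of why that is (assumed numbers, not measurements): with 4x frame generation on top of a 30 fps render rate, the screen updates every ~8 ms, but new input is still only reflected every ~33 ms.

      ```python
      # Rough sketch with assumed numbers: frame generation raises the displayed frame rate,
      # but input is only sampled each time a frame is actually rendered from game state.
      rendered_fps = 30            # frames rendered from real game state
      frames_shown_per_render = 4  # one rendered frame plus three generated ones (4x frame gen)

      displayed_fps = rendered_fps * frames_shown_per_render
      display_interval_ms = 1000 / displayed_fps  # ~8.3 ms between frames on screen
      input_interval_ms = 1000 / rendered_fps     # ~33.3 ms between frames that reflect new input

      print(f"Displayed: {displayed_fps} fps ({display_interval_ms:.1f} ms per frame)")
      print(f"New input only shows up every {input_interval_ms:.1f} ms")
      ```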

  • alessandro@lemmy.ca
    link
    fedilink
    English
    arrow-up
    1
    ·
    edit-2
    2 months ago

    Unreasonably expensive stuff will be bought out by rich people, scalpers, and the extra-rich people who buy from them.

    Screwing the “frames per $/€” economics for everyone.

  • Yozul@beehaw.org
    link
    fedilink
    English
    arrow-up
    1
    ·
    2 months ago

    I think at this point if you put a gun to my head and told me to either buy an Nvidia card or never play a video game again I’d get a lot more reading done.

  • Romkslrqusz@lemm.ee
    link
    fedilink
    English
    arrow-up
    1
    ·
    2 months ago

    For starters, there’s more to GPU performance than memory speed and quantity.

    believe that everybody should skip them

    This strikes me as a bit weird. Everyone uses graphics cards for different things, everyone has different priorities, and most people who have a PC have different hardware.

    I’ve got clients who edit video for work, and others who do it as a hobby. In the professional sphere, render times can have a pretty direct relationship with cashflow, so having the ‘best’ can mean the hardware pays for itself several times over.

    I’ve got clients who only play one game and find it runs great on their current setup, others who are always playing the latest games and want them to perform well, and still others who play a game professionally/competitively and need every frame they can get. Some are happy at 1080p, others prefer 4k, and some may want to drive a high-end VR headset.

    For some people, taking advantage of a new GPU might also require a new PSU or even a total platform upgrade.

    To one person, a few hundred dollars is disposable income whereas to another it might represent their ability to eat that month.

    These are all variables that will influence what is appropriate for one person or another.

    If someone were to have ~$600 to spend, be in need of an upgrade to meet the requirements of an upcoming game they want to play at launch, and have a platform that will support it, I’m likely to recommend an RTX 5070 to them.

    If someone were to be happy enough with their current performance, I’m likely to recommend they wait and see what AMD puts out - or potentially even longer.

    Personally, I’ve always waited until a game I’m excited for performs poorly before upgrading.