Upscaling and Frame Generation are disasters meant to conceal GPU makers' unfulfilled promises of 4K gaming, and to cover up the otherwise horrible performance of some modern games, even at 1080p/1440p.

Upscaling will never, no matter how much AI and overhead you throw at it, create an image that is as good as the same scene rendered at native res.

Frame Generation is a joke, and I am absolutely gobsmacked that people even take it seriously. It is nothing but extra AI frames shoved into your gameplay, worsening latency, response times, and image quality, all so you can artificially inflate a number. 30 FPS gaming is, and will always be, an infinitely better experience than AI frame-doubling that same 30 FPS to 60 FPS.

And because both of these technologies exist, game devs are pushing out poorly optimized to completely unoptimized games that run like absolute dogshit, requiring you to use upscaling and the like even at 1080p just to get reasonable frame rates on GPUs that should run them just fine if they were better optimized (and we know it's optimization, because some of these games do end up getting that optimization pass long after launch, and wouldn't you know it… 9 FPS suddenly becomes 60 FPS).

  • Lucy :3@feddit.org · 2 months ago

    To make this concrete:

    • Upscaling with ML may make the image acceptable if you don't look at anything the devs don't want you to look at. You're supposed to look at yourself, your objective/enemy and maybe a partner. Everything else, especially foliage, hair etc., looks like shit: flickery, changing with distance and perspective. Lighting is weird. Thing is, if we aren't supposed to pay attention, we could just go back to HL-level graphics. Even HL 1. However, this would break the aspect of stuff looking good, of being able to enjoy looking around and not only feeling immersed, but feeling like you are seeing something you will never actually see in real life - not because it looks unrealistically artificial, but because it's too beautiful, too crazy and too dreamy to be real. However, when I literally get sick from actually looking around, because stuff changes abstractly, not the way my brain expects, that takes away the only actual visual advantage over GoldSrc.
    • Frame Gen does not make sense for literally any group of people:
      • You have 20 FPS? Enjoy even worse input lag (because in order to generate frame B between A and C, the generation algorithm actually needs to know frame C, leading to roughly another 1/20 of a second of delay, plus the time to actually generate it - see the rough latency sketch after this list) and therefore a nice 80 FPS gameplay for your eyes, while your brain throws up because the input feels like ≤10 FPS.
      • You have 40 FPS and want 80-160 FPS? Well, that might be enjoyable to anyone watching, because they only see smoother gameplay; meanwhile you, again, have a worse experience than at 40 FPS. I can play story games at 40 FPS, no problem, but having the input lag literally more than double? Fuck no. I see so many YouTubers being like: “Next we’re gonna play the latest and greatest test, Monkey Cock Dong Wu II. And of course it has 50% upscaling and MFG! Look at that smooth gameplay!” - THE ONLY THING THAT’S SMOOTH IS YOUR BRAIN YOU MONKEY, I CAN LITERALLY SEE THE DESYNC BETWEEN YOUR MOUSE MOVEMENTS AND IN-GAME. WHICH WAS NOT THERE IN OTHER BENCHMARKS, GAMES OR THE DESKTOP. I’M NOT FUCKING BLIND. And, of course, worse foliage than RDR2. Not because of the foliage itself, but the flickering in between.
      • You have 60 FPS? And you know that you can actually see a difference between 60 and 144 or 240 Hz? Well, with FG the only difference you'll notice is the input lag, so it's going to feel worse, not smoother. Because, again, the input lag gets even worse with FG, and much worse with MFG.
      • You have 80 FPS but are a high-FPS gamer who needs quick reaction times and such? Well yeah, it only gets worse with FG. Obviously.
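
    To put rough numbers on the 20 FPS case above (just a toy model with illustrative figures: I'm assuming the interpolator has to hold back one real frame and then spend a few milliseconds generating the in-between frame; real pipelines differ):

```python
# Toy latency model for frame interpolation (illustrative numbers only).
# Assumption: to show an interpolated frame between real frames A and C,
# the interpolator must first wait for C (one real-frame interval) and
# then spend a few milliseconds generating the in-between frame.

def added_latency_ms(real_fps: float, gen_time_ms: float = 3.0) -> float:
    """Extra display latency from holding back one real frame + generation time."""
    return 1000.0 / real_fps + gen_time_ms

for fps in (20, 40, 60, 80):
    print(f"{fps:>2} real FPS: ~{added_latency_ms(fps):.0f} ms extra latency "
          f"on top of a {1000.0 / fps:.0f} ms base frame time")
```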

    And as for 4K gaming… play old games on new GPUs. RDR2 runs quite well on my 7800 XT - more than 60 FPS, which is plenty if you aren't a speedrunner or similar. No FSR, of course.

  • real_squids@sopuli.xyz · 2 months ago

    One important thing - upscaling does help with low-spec/low-power gaming (especially on smaller screens). Obviously it's a double-edged sword (it encourages pushing games out quicker), but it has some really cool uses. Now, forced TAA on the other hand…

    Both are tolerable, but only if they're not forced, and for some reason companies have a hard-on for forcing them. Kinda like how a 103° FOV limit somehow became standard even in fast-paced competitive games.

  • bizarroland@fedia.io · 2 months ago

    Trust me bro, a few hundred billion more dollars of R&D spread out between 37 companies that keep taking turns buying each other out in hopes of making a trillion dollars on some slop somebody vomited out of the dark recesses of their souls over the next 59 years, and it will get better, I promise, trust me bro.

  • sunzu2@thebrainbin.org · 2 months ago

    Ray tracing is still prolly a decade away for the mainstream too.

    Tech has been fake promises at least since covid

    Nothing has really changed practically for gaming since the peak around 2015.

    Just more MTX and scamming

    • A_Random_Idiot@lemmy.world (OP) · 2 months ago

      Oh, believe me, I agree… I agree so hard that it's worthy of a whole different post, lol.

      It's capable of making pretty screenshots. But ultimately it's a pointless tax that serves no real purpose besides artificially increasing the price of GPUs… because what better way to increase the price of a GPU than to start tacking extra features onto it, right, Nvidia?

  • esteemedtogami@lemmy.zip · 2 months ago

    Absolutely true. I never bother to turn these options on if a game offers them because in the best case it doesn’t do a whole lot and in the worst case it makes the game look awful. I’d rather just play with real frames even if it means playing at a lower frame rate.

  • ShadowRam@fedia.io · 2 months ago

    Upscaling will never, no matter how much AI and overhead you throw at it, create an image that is as good as the same scene rendered at native res.

    That’s already been proven false back when DLSS 2.0 released.

    • A_Random_Idiot@lemmy.world (OP) · 2 months ago

      No it hasn't, you are just regurgitating Nvidia's marketing.

      You can’t stretch a picture and have it look just as good as natively rendering it at that higher resolution.

      You cannot create something from nothing. No matter how much AI guesswork you put into filling in the gaps, it will never be as good as just rendering at the larger res. It won't even look as good as the original resolution did before the AI stretching, either.

      • moonlight@fedia.io · 2 months ago

        It’s using information from multiple frames, as well as motion vectors, so it’s not just blind guesses.
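
        Roughly, the idea looks something like this (a heavily simplified sketch of temporal reprojection + accumulation, not any vendor's actual algorithm; the function name, blend factor and nearest-neighbour upsample are all just for illustration):

```python
import numpy as np

def temporal_upscale(low_res: np.ndarray, history: np.ndarray,
                     motion: np.ndarray, blend: float = 0.1) -> np.ndarray:
    """Toy temporal upscaler: reproject last frame's full-res history along
    per-pixel motion vectors, then blend in the new low-res samples.
    Real upscalers add jittered sampling, learned blend weights,
    disocclusion detection, etc. Assumes an integer scale factor."""
    h, w, _ = history.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Reproject: for each output pixel, fetch where it was last frame.
    src_y = np.clip(np.round(ys - motion[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs - motion[..., 0]).astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]
    # Nearest-neighbour upsample of the current low-res render to output size.
    sy, sx = h // low_res.shape[0], w // low_res.shape[1]
    current = np.repeat(np.repeat(low_res, sy, axis=0), sx, axis=1)
    # Exponential blend: mostly accumulated history, refreshed by new samples.
    return (1.0 - blend) * reprojected + blend * current
```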

        And no, it’s not as good as a ‘ground truth’ image, but that’s not what it’s competing against. FXAA and SMAA don’t look great, and MSAA has a big performance penalty while still not eliminating aliasing. And I think DLSS quality looks pretty damn good. If you want something closer to perfect, there’s DLAA, which is comparable to SSAA, without nuking your framerate. DLSS can match or exceed visual fidelity at every level, while offering much better performance.

        Frame gen seems like much more of a mixed bag, but I think it’s still good to have the option. I haven’t tried it personally, but I could see it being nice in single player games to go from 60 -> 240 fps, even if there’s some artifacting. I think latency would become an issue at lower framerates, but I don’t really consider 30 fps to be playable anyway, at least for first person games.

        And yes, it has been used to excuse poor optimization, but so have general hardware improvements. That’s an entirely separate issue, and doesn’t mean that upscaling is bad.

        Also I think Nvidia is a pretty anti-consumer company, but that mostly has to do with business stuff like pricing. Their tech is quite good.

        • sp3ctr4l@lemmy.dbzer0.com · 2 months ago

          Eh… The latest versions of DLSS and FSR are getting much better image quality in stills…

          But they still don't match the image quality of actually rendering the same thing natively at full resolution, which is what the quote you're disputing said.

          Further, the cards that can run these latest upscaling techs to reach 4K 60 FPS or 4K 90 FPS in very demanding games, without (fake) frame gen?

          It's not as bad with AMD, but they also don't yet offer as high-calibre a GPU as Nvidia's top-end stuff (though apparently 9080 XT rumors are starting to float around)…

          But like, the pure wattage draw of a 5080 or 5090 is fucking insane. A 5090 draws up to 575 watts, on its own.

          You can build a pretty high-powered 1440p system using the stupendously power-efficient, high-performance 9745hx or 9745hx3d CPU + mobo combos that Minisforum makes… and the PSU for the entire system shouldn't need to exceed 650 watts.

          … A 5090 alone draws nearly as much power (575 W vs. 650 W) as that entire system one resolution step down.

          This, to me, is completely absurd.

          Whether or not you find the power draw difference between an 'ultra 1440p' build and an 'ultra 4K' build ridiculous… the price difference between those PCs and monitors is somewhere between 2x and 3x, and hopefully we can agree that that in fact is ridiculous, and that 4K, high-fidelity gaming remains far out of reach for the vast majority of PC gamers.

          EDIT:

          Also, the vast majority of your comment is comparing native + some AA algo to… rendering at 75% to 95% resolution and then upscaling.

          For starters, again the original comment was not talking about native + some AA, but just native.

          Upscaling introduces artefacts and inaccuracies: smudged textures, weird ghosting that resembles older, crappy motion-blur techniques, loss of LOD-style detail on distant objects, and it sometimes gets confused between HUD elements and the 3D rendered scene and warps them together…

          Just because intelligent temporal upscaling also produces something that sort of looks like, but isn't actually, AA… doesn't mean it avoids the other costs of achieving this 'AA' in a relatively sloppy manner that also degrades other elements of the finished render.

          It's a tradeoff: an end result at the same res that is worse, to some degree, but rendered faster, to some degree.

          Again, the latest versions of intelligent upscalers are getting better at bringing the quality closer to a native render while maintaining higher FPS…

          But functionally what this is, is an overall 'quality' slider that sits outside of, or on top of, all of a game's other, actual quality settings.

          It is a smudge factor bandaid that covers up poor optimization within games.

          And that poor optimization is, in almost all cases… real time ray tracing/path tracing of some kind.

          A huge chunk of what has driven and enabled the development of higher-fidelity, high-frame-rate rendering in the last 10 or 15 years has been figuring out clever tricks and hacks in game design, engine design, and the rendering pipeline that ensure real-time lighting is only used where it absolutely needs to be used, in a very optimized way.

          Then, about 5 years ago, most AAA game devs/studios just stopped doing those optimizations and tricks, as a cost cutting measure in development… because ‘now the hardware can optimize automagically!’

          No, it cannot, not unless you think every PC gamer has a $5,000 rig.

          A lot of this is tied to UE 5 being an increasingly popular, but also increasingly shit-optimized, engine.

      • ShadowRam@fedia.io · 2 months ago

        You are just regurgitating nvidia’s marketing.

        No, this is not only the general consensus, it's also measurably better when comparing SNR.

        You can personally hate it for any reason you want.

        But it doesn’t change the fact that AI up-scaling produces a more accurate result than native rendering.
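
        For what it's worth, "measurably better" claims like this usually mean scoring both the native frame and the upscaled frame against a heavily supersampled ground-truth render of the same scene, e.g. with PSNR. A minimal sketch of that kind of measurement (illustrative only; it assumes you already have the frames loaded as arrays):

```python
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio of `test` against a ground-truth `reference`
    (e.g. a heavily supersampled render). Higher means closer to the reference."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# Typical comparison, averaged over many frames of the same scene:
#   psnr(ground_truth, native_frame)  vs  psnr(ground_truth, upscaled_frame)
```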

        • gazter@aussie.zone · 2 months ago

          I don’t understand. This isn’t really a subject I care much about, so forgive my ignorance.

          Are you saying that an AI generated frame would be closer to the actual rendered image than if the image rendered natively? Isn’t that an oxymoron? How can a guess at what the frame will be be more ‘accurate’ than what the frame would actually be?

          • sp3ctr4l@lemmy.dbzer0.com · 2 months ago

            They did in fact say that, and that is in fact nonsense, verifiable in many ways.

            Perhaps they misspoke, perhaps they are misinformed, but uh…

            Yeah, it is fundamentally impossible to do what he actually described.

            Intelligent temporal frame upscaling is getting better and better at producing a frame that is almost as high quality as a natively rendered frame, in less rendering time, i.e. at higher FPS… but it's never going to be 'better' quality than an actual native render.