• Yaarmehearty@lemmy.ml
    link
    fedilink
    arrow-up
    1
    ·
    7 days ago

    The preference against DOF is fine. However, I’m looking at my f/0.95 and f/1.4 lenses and wondering why it’s kind of prized in photography for some genres and hated in games?

  • Aux@feddit.uk
    link
    fedilink
    English
    arrow-up
    0
    arrow-down
    1
    ·
    7 days ago

    The title should be “anon can’t afford rtx5090”.

  • yonder@sh.itjust.works
    link
    fedilink
    arrow-up
    3
    ·
    8 days ago

Out of all of these, motion blur is the worst, but second to that is Temporal Anti-Aliasing. No, I don’t need my game to look blurry with every trailing edge leaving a smear.

    • sp3ctr4l@lemmy.zip
      link
      fedilink
      English
      arrow-up
      0
      ·
      edit-2
      7 days ago

TAA is kind of the foundation that almost all real time frame upscaling and frame generation (EDIT: not raytracing, as I first wrote) are built on, and built off of.

This is why it is increasingly difficult to find a newer, high-fidelity game that even allows you to actually turn it off.

If you could, all the subsequent magic bullshit would stop working, and all the hardware in your GPU designed to do that stuff would be basically useless.

      EDIT: I goofed, but the conversation thus far seems to have proceeded assuming I meant what I actually meant.

Realtime raytracing is not per se foundationally reliant on TAA. DLSS and FSR frame upscaling and the later framegen tech, however, basically are; they evolved out of TAA.

However, without the frame rate gains enabled by modern frame upscaling and frame generation… realtime raytracing would be too ‘expensive’ to implement on all but fairly high-end cards / your average console without serious frame rate drops.
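To put rough numbers on why upscaling buys those frames back, here’s a back-of-the-envelope sketch (the resolutions and the 30 fps baseline are illustrative assumptions, not benchmarks):

```python
# Illustrative assumptions only: 4K output, 1440p internal render resolution,
# and a made-up 30 fps native raytraced baseline.
native = 3840 * 2160          # pixels shaded per frame at native 4K
internal = 2560 * 1440        # a common internal resolution for upscaling

ratio = internal / native
print(f"internal/native pixel ratio: {ratio:.2f}")    # ~0.44

# If a hypothetical GPU manages 30 fps fully raytraced at native 4K, shading
# only ~44% of the pixels and upscaling the rest scales roughly like this
# (ignoring the upscaler's own overhead):
print(f"rough upscaled fps: {30 / ratio:.0f}")        # ~68
```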

Before realtime raytracing, the paradigm was that scenes had static light maps and lighting environments baked into the map, with a fairly small number of dynamic light sources and shadows.

With realtime raytracing… basically everything is now a dynamic light.

That tanks your frame rate, so Nvidia barrelled ahead with frame upscaling and later frame generation to compensate for the framerate loss they introduced with realtime raytracing. And because they’re an effective monopoly, AMD followed along, as did basically all major game developers and many major game engines (UE5, to name a really big one).

      • Vlyn@lemmy.zip
        link
        fedilink
        arrow-up
        0
        ·
        8 days ago

        What? All Ray Tracing games already offer DLSS or FSR, which override TAA and handle motion much better. Yes, they are based on similar principles, but they aren’t the mess TAA is.

        • sp3ctr4l@lemmy.zip
          link
          fedilink
          English
          arrow-up
          0
          ·
          8 days ago

          Almost all implementations of DLSS and FSR literally are evolutions of TAA.

          TAA 2.0, 3.0, 4.0, whatever.

          If you are running DLSS or FSR, see if your game will let you turn TAA off.

They often won’t, because they often require TAA to be enabled before DLSS or FSR can hook into it and extrapolate from there.

Think of TAA as a base game and DLSS/FSR as a DLC. You very often cannot play the DLC without the original game, and if you actually dig into game engines, you’ll often find you can’t run FSR/DLSS without running TAA.

          There are a few exceptions to this, but they are rare.
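As a toy illustration of that dependency (this is not any real engine’s API, just the base-game/DLC idea in code):

```python
# Toy pipeline: the upscaler stage consumes buffers the TAA stage produces,
# so removing the TAA stage starves the upscaler of its inputs.

def taa_stage(raw_color, motion):
    # Accumulates history and, crucially, owns the per-pixel motion vectors
    return {"color": raw_color, "motion_vectors": motion}

def upscaler_stage(taa_output):
    if not taa_output or "motion_vectors" not in taa_output:
        raise RuntimeError("no motion vectors to reproject from")
    return f"upscaled using {len(taa_output['motion_vectors'])} motion vectors"

frame = taa_stage(raw_color=[0.5, 0.7], motion=[(0.1, 0.0), (0.0, 0.2)])
print(upscaler_stage(frame))       # works: the 'DLC' found its 'base game'

try:
    upscaler_stage(None)           # 'TAA off': the DLC has nothing to run on
except RuntimeError as e:
    print("with TAA disabled:", e)
```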

          • Vlyn@lemmy.zip
            link
            fedilink
            English
            arrow-up
            0
            ·
            7 days ago

TAA just means temporal anti-aliasing. Temporal as in relying on data from the previous frames.

The implementations of DLSS and FSR are wholly separate from the old TAA. Yes, they work on the same principles, but do their own thing.

TAA as a setting gets disabled because the newer methods fully overwrite it. Some games hide the old setting, others gray it out; it depends.
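For what it’s worth, the shared “temporal” core being described here reduces to blending the current frame into an accumulated history, something like this sketch (illustrative only; real TAA/DLSS/FSR pipelines are far more involved):

```python
import numpy as np

def temporal_blend(history: np.ndarray, current: np.ndarray,
                   alpha: float = 0.1) -> np.ndarray:
    # alpha weights the new frame; a smaller alpha smooths more but smears more
    return (1.0 - alpha) * history + alpha * current

history = np.zeros((4, 4, 3))            # accumulated result of past frames
for _ in range(10):
    current = np.random.rand(4, 4, 3)    # stand-in for a freshly rendered frame
    history = temporal_blend(history, current)
```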

            • sp3ctr4l@lemmy.zip
              link
              fedilink
              English
              arrow-up
              0
              ·
              edit-2
              7 days ago

The implementations of DLSS and FSR are wholly separate from the old TAA. Yes, they work on the same principles, but do their own thing.

TAA as a setting gets disabled because the newer methods fully overwrite it.

              This is very often false.

DLSS/FSR need per-pixel motion vectors, or at least frame-to-frame comparisons, to work.

TAA very often is the thing that they get those motion vectors from… i.e., they are dependent on it, not separate from it.

Indeed, in many games, other significant portions/features of the graphics engine bug out massively when TAA is manually disabled, which means those features/portions are also dependent on TAA.

              Sorry to link to the bad site, but:

              https://www.reddit.com/r/FuckTAA/comments/motdjd/list_of_known_workarounds_for_games_with_forced/

              And here’s all the games that force TAA which no one has yet figured out how to disable:

              https://www.reddit.com/r/FuckTAA/comments/rgxy44/list_of_games_with_forced_taa/

              Please go through all of these and notice how many modern games:

              1. Do not allow the user to turn off TAA easily, forcing them to basically mod the game by manually editing config files or more extensive workarounds.

              2. Don’t even tell the user that TAA is being used, requiring them to dig through the game to discover that it is.

3. Break DLSS/FSR, or throw up other massive graphical issues, when TAA is manually disabled.

TAA is the foundational layer that many modern games are built on… because DLSS/FSR/XeSS and/or other significant parts of the game’s graphics engine hook into the per-frame pixel-motion comparisons that TAA performs.

              The newer methods very often do not overwrite TAA, they are instead dependent on it.

It’s like trying to run or compile code that depends on a library you don’t actually have present… it will either fail entirely, or kind of work, but in a broken way.

Sure, there are some instances where DLSS/FSR is implemented as its own, self-contained pipeline… but very often this is not the case: TAA is a dependency for DLSS/FSR or for other graphical features of the engine.

TAA is massively different from older kinds of AA like MSAA, FXAA, or SMAA… because those don’t compare frames to previous frames; they just apply an effect to a single frame.

TAA provides ways of comparing differences across sequences of frames, and many, many games feed those comparisons into other graphical features that are built on top of them and require them.
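To make the motion-vector dependency concrete, here’s a minimal sketch of temporal reprojection (nearest-neighbor only; my own simplification, not any shipping implementation). Before blending, the history buffer is warped so each pixel samples where its content was last frame; without valid motion vectors, this step and everything built on it falls apart.

```python
import numpy as np

def reproject(history: np.ndarray, motion: np.ndarray) -> np.ndarray:
    # For each output pixel, look up where that content was last frame
    h, w = history.shape[:2]
    out = np.zeros_like(history)
    for y in range(h):
        for x in range(w):
            dx, dy = motion[y, x]          # pixels moved since the last frame
            sx = min(max(int(round(x - dx)), 0), w - 1)
            sy = min(max(int(round(y - dy)), 0), h - 1)
            out[y, x] = history[sy, sx]
    return out

history = np.random.rand(8, 8, 3)          # last frame's accumulated result
motion = np.ones((8, 8, 2))                # everything moved 1px right and down
current = np.random.rand(8, 8, 3)
blended = 0.9 * reproject(history, motion) + 0.1 * current
```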

To use your own words: TAA is indeed a mess, and you apparently have no idea how foundational this mess is to basically the whole progression of heavily marketed, ‘revolutionary’ graphical rendering techniques of the past 5-ish years.

              • Vlyn@lemmy.zip
                link
                fedilink
                arrow-up
                1
                ·
                5 days ago

I get all that, but it’s still not what I’m trying to explain. If TAA is forced in a game that supports DLSS/FSR, it’s still not used in the image itself; rather, just the motion vector data gets piped into the new algorithm.

                Otherwise even with DLSS/FSR active you’d have all the smearing and bad quality of the original TAA implementation, which you simply don’t.

So it’s just pedantry whether a toggle in a game appears on, off, or at all, if the engine just takes the motion vector data and then uses DLSS/FSR/XeSS or what have you to actually do the anti-aliasing.

    • CanadaGeese@lemmy.zip
      link
      fedilink
      arrow-up
      0
      arrow-down
      1
      ·
      8 days ago

Honestly, motion blur done well works great. Cyberpunk, for example, does it really well on the low setting.

Most games just don’t do it well tho 💀

  • Artyom@lemm.ee
    link
    fedilink
    arrow-up
    2
    ·
    8 days ago

    Step 1. Turn on ray tracing

Step 2. Check some forum or ProtonDB and discover that the ray tracing/DX12 implementation is garbage and gets like 10 frames

    Step 3. Switch back to DX11, disable ray tracing

    Step 4. Play the game

  • Onionguy@lemm.ee
    link
    fedilink
    arrow-up
    1
    ·
    8 days ago

Taps temple Auto-disable ray tracing if your GPU is too old to support it ( ͡° ͜ʖ ͡°)

  • Baguette@lemm.ee
    link
    fedilink
    arrow-up
    1
    ·
    7 days ago

    Depth of field and chromatic aberration are pretty cool if done right.

Depth of field is a really important framing tool for photography and film, and the same applies to games. If your game has cinematics/cutscenes, they prob utilize depth of field in some sense. Action and dialogue scenes usually emphasize the characters, so a narrow depth of field can be used to put the focus on just the characters. Meanwhile, something like discovering a new region puts the emphasis on the landscape, so a large depth of field can be used (essentially no background blur).
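On the photography side, how narrow that depth of field gets falls out of standard thin-lens optics; a quick sketch (textbook geometry, the example numbers are mine):

```python
# Diameter of the blur disc ("circle of confusion") on the sensor for an
# object at d_bg_mm when a lens of focal length f_mm at f-number n is
# focused at d_focus_mm. Standard thin-lens geometry; numbers are made up.
def blur_circle_mm(f_mm: float, n: float, d_focus_mm: float, d_bg_mm: float) -> float:
    aperture = f_mm / n    # entrance pupil diameter
    return aperture * f_mm * abs(d_bg_mm - d_focus_mm) / (d_bg_mm * (d_focus_mm - f_mm))

# 50mm lens, subject at 2m, background at 10m: wide open vs. stopped down
for n in (1.4, 8.0):
    print(f"f/{n}: blur disc ≈ {blur_circle_mm(50, n, 2000, 10000):.2f} mm")
# f/1.4 -> ~0.73 mm (heavy background blur), f/8 -> ~0.13 mm (much sharper)
```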

Chromatic aberration is cool if done right. It gives things a slightly out-of-place feel, which makes sense in certain games and not so much in others. Signalis and Dredge are a few games where chromatic aberration adds to the art style, imo. Though obviously, if it hurts your eyes, the games play just as fine without it on.

    • justastranger@sh.itjust.works
      link
      fedilink
      arrow-up
      1
      arrow-down
      1
      ·
      7 days ago

Chromatic aberration is also one of the few effects that actually happen in our eyes, instead of being an effect designed to replicate a camera lens.

    • ysjet@lemmy.world
      link
      fedilink
      English
      arrow-up
      1
      ·
      7 days ago

I feel like depth of field and motion blur have their place, yeah. I worked on a horror game one time, and we used a dynamic depth of field: anything you were looking at was in focus, but things nearer/farther than that were slightly blurred out, and when you moved where you were looking, it would take a moment (less than half a second) to ‘refocus’ if it was a different distance from the previous thing. Combined with light motion blur, it created a very subtle effect that ratcheted up anxiety when poking around. When combined with objects in the game being capable of casting non-Euclidean shadows for things you aren’t looking at, it created a very pervasive unsettling feeling.
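A minimal sketch of that refocus behavior (my own reconstruction of the idea, not the actual game’s code): the focal distance eases toward whatever the player is looking at instead of snapping.

```python
import math

def update_focus(current_focus: float, target_dist: float, dt: float,
                 speed: float = 8.0) -> float:
    # Exponential ease toward the target: framerate-independent, and with
    # speed=8 it settles most of the way within roughly half a second
    t = 1.0 - math.exp(-speed * dt)
    return current_focus + (target_dist - current_focus) * t

focus = 2.0                          # meters; currently focused on a near object
for _ in range(30):                  # ~0.5 s of frames at 60 fps
    focus = update_focus(focus, 10.0, dt=1 / 60)
print(f"focus after 0.5 s: {focus:.2f} m")   # ~9.85 m, nearly settled on 10 m
```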

    • ShortFuse@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      7 days ago

Most “film grain” is just additive noise, akin to digital camera noise. I’ve modded a bunch of games for HDR (I’m the RenoDX creator) and I strip it from almost every game because it’s unbearable. I have a custom film grain that mimics real film; at low levels it’s imperceptible and acts as a dithering tool to improve gradients (removing banding). For some games that emulate a film look, the (proper) film grain sometimes lends itself to the look.

      • kautau@lemmy.world
        link
        fedilink
        arrow-up
        1
        ·
        7 days ago

        Agreed. It fits very well in very specific places, but when not there, it’s just noise

  • Psythik@lemm.ee
    link
    fedilink
    arrow-up
    1
    ·
    7 days ago

Hating on hair quality is a new one for me. I can understand turning off ray tracing if you have a low-end GPU, but hair quality? It’s been at least a decade since I last heard people complaining that their GPU couldn’t handle HairWorks. Does any game even still use it?

  • ShortFuse@lemmy.world
    link
    fedilink
    arrow-up
    1
    ·
    edit-2
    7 days ago

    Bad effects are bad.

I used to hate film grain, and then did the research to implement it myself, digging up old research papers on how it works at a scientific level. I ended up implementing a custom film grain in Starfield Luma and RenoDX. I actually like it now; it has a level of “je ne sais quoi” that clicks in my brain as feeling like film.

The gist is that everyone just does additive random noise, which raises the black floor and dirties the image. Real film grain is perceptual and acts like cracks in the “dots” that compose an image. It’s not something to be “scanned” or overlaid (which gives a dirty-screen effect).
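A toy version of that contrast (my own simplification of the idea, not the RenoDX implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
black = np.zeros(10_000)             # a patch of pure black pixels

# Additive grain: noise is layered on top, so even pure black gets lifted --
# the raised black floor / dirty-image look.
additive = np.clip(black + rng.normal(0, 0.02, black.shape), 0, 1)
print("additive, mean black level:", additive.mean())      # > 0

# Perceptual grain, simplified here as signal-proportional noise: it perturbs
# the image's own values like cracks in the dots that compose it, so black
# stays black, and at low levels it dithers gradients instead of dirtying them.
perceptual = np.clip(black * (1 + rng.normal(0, 0.05, black.shape)), 0, 1)
print("perceptual, mean black level:", perceptual.mean())  # exactly 0
```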

Related: motion blur is how we see things in real life. Our eyes have a certain level of blur/“shutter speed”, and games without it can have a soap opera effect. I’ve only seen per-object motion blur look decent; fullscreen blur is just weird, IMO.

  • lime!@feddit.nu
    link
    fedilink
    English
    arrow-up
    1
    arrow-down
    1
    ·
    8 days ago

    motion blur is essential for a proper feeling of speed.

    most games don’t need a proper feeling of speed.

    • Waffle@infosec.pub
      link
      fedilink
      arrow-up
      1
      ·
      8 days ago

Motion blur is guaranteed to give me motion sickness every time. Sometimes I forget to turn it off in a new game… about 30 minutes in, I’ll break into cold sweats and feel like I’m going to puke. I fucking hate that it’s on by default in so many games.

    • sp3ctr4l@lemmy.zip
      link
      fedilink
      English
      arrow-up
      0
      arrow-down
      1
      ·
      8 days ago

      … What?

I mean… the alternative is to get hardware (including a monitor) capable of running the game at an fps/Hz above roughly 120 (ymmv), such that your actual eyes and brain do the motion blur for real.

      Motion blur is a crutch to be able to simulate that from back when hardware was much less powerful and max resolutions and frame rates were much lower.

At higher resolutions, most motion blur algorithms are quite inefficient and eat into your overall fps… so it would make more sense to just remove them, get a higher fps, and experience actual motion blur from your eyes+brain at that higher frame rate.

      • AdrianTheFrog@lemmy.world
        link
        fedilink
        English
        arrow-up
        0
        ·
        8 days ago

I think you still see doubled images instead of a smooth blur in your peripheral vision when you’re focused on, for example, the car in a racing game.

          • AdrianTheFrog@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            1 day ago

I mean, just from persistence of vision you’ll see multiple copies of a moving object if your eyes aren’t moving. I have realized tho that in the main racing game I use motion blur in (BeamNG), I’m not actually reaching above 80fps very often.

            here, I copied someone’s shader to make a quick comparison:

            with blur: https://www.shadertoy.com/view/wcjSzV

            without blur: https://www.shadertoy.com/view/wf2XRV

            Even at 144hz, one looks smooth while the other has sharp edges along the path.

Keep in mind that this technically only works if your eye doesn’t follow any of the circles, as that would require a different motion blur computation. That’s obviously not something that can be accounted for on a flatscreen; maybe in VR at some point, if we ever get to that level of sophistication. VR motion blur without taking eye movement into account is obviously terrible and makes everyone sick.

            Someone else made a comparison for that, where you’re supposed to follow the red dot with your eye. (keep in mind that this demo uses motion blur lengths longer than a frame, which you would not have if aiming for a human eye-like realistic look)

            https://www.shadertoy.com/view/7tdyW8
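Circling back to the non-tracking case, some rough numbers for the stepping effect (the screen width and speed are my own example): an object crossing a 1920px screen in one second lands on discrete positions spaced speed/refresh-rate apart, which a non-tracking eye fuses into distinct copies rather than a smear.

```python
# How far apart the discrete "copies" land at a given refresh rate, for an
# object crossing a 1920px-wide screen in one second (example numbers).
speed_px_per_s = 1920
for hz in (60, 144):
    print(f"{hz} Hz: copies spaced {speed_px_per_s / hz:.1f} px apart")
# 60 Hz -> 32.0 px, 144 Hz -> ~13.3 px: visibly discrete either way
```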

            • Fonzie!@ttrpg.network
              link
              fedilink
              arrow-up
              1
              ·
              1 day ago

              Okay now I got it from your explanation, thanks!

              By the way, the first two Shadertoys aren’t working for me, I just get “:-( We either didn’t find the page you were looking for, or there was an internal error.”. The third one works, though…

      • lime!@feddit.nu
        link
        fedilink
        English
        arrow-up
        0
        arrow-down
        1
        ·
        8 days ago

        my basis for the statement is beam.ng. at 100hz, the feeling of speed is markedly different depending on whether motion blur is on. 120 may make a difference.

  • sp3ctr4l@lemmy.zip
    link
    fedilink
    English
    arrow-up
    0
    ·
    8 days ago

    Now… in fairness…

Chromatic aberration and lens flares, whether you do or don’t appreciate how they look (imo they arguably make sense in, say, CP77, since you have robot eyes)…

    … they at least usually don’t nuke your performance.

    Motion blur, DoF and ray tracing almost always do.

    Hairworks? Seems to be a complete roll of the dice between the specific game and your hardware.

    • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 🇮 @pawb.social
      link
      fedilink
      English
      arrow-up
      0
      arrow-down
      1
      ·
      edit-2
      8 days ago

Motion blur and depth of field have almost no impact on performance. Same with anisotropic filtering, and I cannot understand why AF isn’t always just defaulted to max, since even back in the golden age of gaming it had no real performance impact on any system.

      • sp3ctr4l@lemmy.zip
        link
        fedilink
        English
        arrow-up
        1
        ·
        edit-2
        7 days ago

        You either haven’t been playing PC games very long, or aren’t that old, or have only ever played on fairly high end hardware.

        Anisotropic filtering?

Yes, that… hasn’t been challenging for an affordable, average person’s PC to run at 8x or 16x for… about a decade. It doesn’t cause much framerate drop-off at all now, and didn’t until you go all the way back to the mid ’90s or early 2000s, when ‘GPUs’ were fairly uncommon.

        But that just isn’t true for motion blur and DoF, especially going back further than 10 years.

Even right now, running CP77 on my Steam Deck, the AF level has basically no impact on my framerate, whereas motion blur and DoF do have a noticeable impact.

Go back even further, and a whole lot of motion blur/DoF algorithms were very poorly implemented in a lot of games. Nowadays we pretty much only get the versions of those that were not ruinously inefficient.

Try running something like Arma 2 on a mid- or low-range PC with motion blur on vs. off. You could get maybe 5 to 10 more fps with it off… and that’s a big deal when you’re maxing out at 30 to 40ish fps.

        (Of course now we also get ghosting and smearing from framegen algos that ironically somewhat resemble some forms of motion blur.)