Currently trying to refund the new Indiana Jones game because it’s unplayable without raytracing cri. My card isn’t even old; it’s just that 8GB of VRAM is apparently the absolute minimum now, so my mobile 3060 is useless. I miss 2014, when I could still play new games on my shitty AMD card at 20fps. Yeah, it didn’t look great, but developers still included a very low graphics option for people like me. Now you need to upgrade every two years to keep up.

  • Gucci_Minh [he/him]@hexbear.net · 21 points · 7 days ago

    TAA, DoF, chromatic aberration, motion blur, vignetting, film grain, and lens flare. Every modern dev just dumps that shit on your screen and calls it cinematic. It’s awful and everything is blurry. And sometimes you have to go into an .ini file to turn it off because it’s not in the settings.

    • genderbitch [she/her, it/its]@hexbear.net · 14 points · 7 days ago

      Chromatic aberration! When I played R&C: Rift Apart on PS5, I was taking screenshots and genuinely thought there was some kind of foveated rendering in play because of how blurry the corners of the screen looked. Turns out it was just chromatic aberration, my behated.

      I hate film grain too, because I have visual snow and I don’t need to stack more of that shit in my games.

      • Gucci_Minh [he/him]@hexbear.net · 14 points · 7 days ago

        Dev: should we make our game with a distinctive style that’s aesthetically appealing? Nah, slap some noise on the screen and make it look like your character is wearing dirty Oakleys and has severe astigmatism and myopia. That’ll do it.