Why Nvidia DLSS 5 is the most hated thing in gaming right now

Nvidia just dropped a bomb at GTC 2026, and it's not the kind of "breakthrough" they were hoping for. They're calling DLSS 5 the "GPT moment for graphics," but if you've spent five minutes on X or Reddit lately, you know the internet has a different name for it: the AI slop filter.

For years, we've accepted AI upscaling and frame generation because they gave us more performance for "free." But DLSS 5 is different. It doesn't just make the image sharper; it reinterprets it. It's essentially taking the game's lighting and character models and running them through a generative AI beauty filter in real-time. The result? Graphics that look technically "photorealistic" but feel soulless, uncanny, and—in the case of several high-profile demos—borderline insulting to the original artists.

The Yassification of Grace Ashcroft

The moment this controversy went nuclear was during the Resident Evil Requiem demo. Nvidia showed off a side-by-side comparison of the character Grace Ashcroft. With DLSS 5 off, she looks like a real person in a survival horror game—tired, rugged, and appropriately "haggard" for the setting.

With DLSS 5 on, she looks like she just stepped out of a Sephora. Her lips are plumped, her skin is airbrushed, and the dark circles under her eyes are gone. Gamers immediately dubbed this "yassification." It’s a bizarre choice for a horror game where the protagonist is supposed to look like they’re struggling to survive.

This isn't just a technical glitch. It's a fundamental shift in what DLSS is. Until now, DLSS (Deep Learning Super Sampling) was a reconstruction tool. It took a low-resolution image and tried to guess what the high-resolution version looked like based on "ground truth" data. DLSS 5 uses a real-time neural rendering model to "infuse pixels with photoreal lighting and materials." That’s fancy talk for "the AI is making stuff up on the fly."
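The distinction matters enough to spell out. Here is a toy sketch (assumed, not Nvidia's algorithm) of the difference: a reconstruction pass derives every new value from real samples in the frame, while a generative pass pulls values toward what a learned prior says they *should* be. The function names and the scalar `learned_prior` are illustrative inventions.

```python
# Toy contrast between reconstruction and generation, on a 1D "scanline".
# Nothing here is real DLSS code; it only illustrates the conceptual shift.

def reconstruct_upscale(low_res):
    """Classic DLSS-style reconstruction: every output value is
    interpolated from actual input samples (midpoints here)."""
    out = []
    for a, b in zip(low_res, low_res[1:]):
        out.append(a)
        out.append((a + b) / 2)  # constrained by the frame's own data
    out.append(low_res[-1])
    return out

def generative_pass(pixels, learned_prior):
    """Hypothetical DLSS 5-style pass: output is blended toward what
    a model *believes* the pixel should look like."""
    return [0.5 * p + 0.5 * learned_prior for p in pixels]

low_res = [0.2, 0.8, 0.4]
print(reconstruct_upscale(low_res))   # detail derived from the frame itself
print(generative_pass(low_res, 0.9))  # detail biased toward the model's prior
```

In the first function, the output can never contain a value the input didn't imply; in the second, the `learned_prior` can drag every pixel toward the model's idea of "photoreal," which is exactly the behavior critics are calling slop.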

Where the tech goes off the rails

  • Age Erasure: In Hogwarts Legacy, a 15-year-old student ended up looking like a 25-year-old Instagram influencer.
  • Lost Atmosphere: Environmental shots in Starfield lost their moody, curated lighting in favor of a bright, high-contrast HDR look that feels like an over-processed smartphone photo.
  • Uncanny Valley: Because the AI is "guessing" materials like skin and hair, characters often end up with a plastic, waxy sheen that screams "AI-generated."

Jensen Huang says you're wrong

Nvidia CEO Jensen Huang isn't backing down. When asked about the "AI slop" backlash during a Q&A, he was blunt: "Well, first of all, they’re completely wrong."

Huang’s argument is that DLSS 5 isn't a mindless filter. He claims it "fuses controllability of the geometry and textures with generative AI." According to Nvidia, developers have full "artistic control" via an SDK that lets them mask off certain areas or tune the intensity. They’re basically saying, "Don't blame the tool, blame how it's being used."
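Nvidia hasn't published this SDK, so the shape of that "artistic control" is anyone's guess. As a purely hypothetical sketch of the concept Huang describes (every name below is invented; `apply_neural_pass` stands in for the model itself), per-material opt-outs and an intensity dial might look like this:

```python
# HYPOTHETICAL sketch of developer-side controls Nvidia describes.
# No names here come from a real SDK; this only illustrates masking
# certain materials and tuning the strength of the generative pass.

from dataclasses import dataclass

@dataclass
class NeuralPassConfig:
    intensity: float = 1.0           # 0.0 = off, 1.0 = full AI pass
    excluded_materials: tuple = ()   # materials the AI must leave alone

def apply_neural_pass(pixel_value, material, config):
    """Blend the engine's pixel with the AI's 'idealized' pixel,
    respecting the artist's mask and intensity settings."""
    if material in config.excluded_materials:
        return pixel_value  # artist opted this material out entirely
    ai_value = min(1.0, pixel_value * 1.2)  # stand-in for the model's output
    return (1 - config.intensity) * pixel_value + config.intensity * ai_value

# Example: protect character skin from "yassification", soften the pass elsewhere.
config = NeuralPassConfig(intensity=0.4, excluded_materials=("skin",))
print(apply_neural_pass(0.5, "skin", config))   # untouched
print(apply_neural_pass(0.5, "metal", config))  # gently nudged
```

If the real SDK works anything like this, the Grace Ashcroft demo implies Capcom either didn't have these knobs yet or shipped the demo with everything at full intensity.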

But here’s the problem: if the official Nvidia reveal—the one they spent months preparing—looks this bad, what hope do we have for a random Ubisoft open-world game?

How DLSS 5 actually works under the hood

If we ignore the "yassified" faces for a second, the technology is actually terrifyingly impressive from a computer science perspective. It’s no longer just about pixels; it’s about semantic understanding.

DLSS 5 takes the raw color buffer and motion vectors from the game engine. The neural network then identifies what it’s looking at—this is hair, this is skin, this is rusted metal. It then applies a "neural rendering" pass that simulates how light should interact with those specific materials in a perfect world.
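The pipeline above can be sketched in miniature. This is an assumption-laden toy, not Nvidia's implementation: the real system infers materials from the color buffer and motion vectors with a neural network, whereas here the "segmentation" is handed to us directly, and each material's light response is a one-line placeholder.

```python
# Toy sketch of a material-aware shading pass: classify each pixel,
# then apply a per-material response instead of a uniform filter.
# The response curves are illustrative placeholders, not real shading math.

MATERIAL_RESPONSE = {
    "skin":  lambda c: c ** 0.8,   # softened, subsurface-scatter-ish lift
    "metal": lambda c: c ** 2.0,   # harder specular falloff
    "hair":  lambda c: c * 0.9,    # slight absorption
}

def classify(pixel_tag):
    """Stand-in for the semantic pass: the real system would infer this
    from the color buffer and motion vectors, not read it from a tag."""
    return pixel_tag

def neural_material_pass(frame):
    """frame: list of (brightness, material_tag) pairs -> shaded values."""
    return [MATERIAL_RESPONSE[classify(tag)](value) for value, tag in frame]

frame = [(0.5, "skin"), (0.5, "metal"), (0.5, "hair")]
print(neural_material_pass(frame))
```

Note what this structure implies: identical input brightness produces different outputs depending on what the model *thinks* the surface is, which is both why the tech looks great on water and clutter and why a misclassified face gets a waxy, airbrushed rewrite.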

In theory, this allows a single GPU to produce visuals that would normally require a massive render farm. In practice, the current GTC demos were running on two RTX 5090s—one to render the game and one dedicated entirely to the DLSS 5 model. Nvidia says they'll optimize this for a single card by the fall 2026 launch, but it shows just how heavy this "AI pass" really is.

The death of art direction

The loudest critics aren't just complaining about "fake frames" anymore. They’re worried about the homogenization of art.

If every game uses the same Nvidia-trained model to "beautify" its lighting and characters, won't every game start to look the same? An art director spends years picking a specific color palette and lighting rig to evoke a certain emotion. If a player toggles on DLSS 5 and the AI decides to "fix" that lighting to make it more "photorealistic," the original artistic intent is dead.

It’s the same struggle we’ve seen in digital photography. Modern smartphones use so much computational processing that a photo of a sunset often looks nothing like what your eyes actually saw. It looks "better" to an algorithm, but it’s a lie.

Current games confirmed for DLSS 5 support

  1. Assassin’s Creed Shadows
  2. Starfield
  3. Resident Evil Requiem
  4. The Elder Scrolls IV: Oblivion Remastered
  5. Hogwarts Legacy
  6. Delta Force

Is there a middle ground?

It’s not all bad. Some early hands-on reports from the GTC floor suggest that for environmental details—like the way light catches a glass of water or how shadows wrap around clutter on a desk—DLSS 5 is actually transformative. It fills in the "gaps" that even path tracing misses because of limited ray budgets.

The issue is that Nvidia led with faces. Humans are hardwired to spot tiny inconsistencies in faces. When the AI changes a character's jawline or smooths out their wrinkles, our brains scream "fake."

If Nvidia wants to win back the "slop" accusers, they need to show that this tech can respect a game's aesthetic. We don't need a "make everything look like a Hollywood VFX shot" button. We need tools that help developers realize their specific vision faster.

If you’re worried about your favorite characters being "looks-maxed" into oblivion, the best thing you can do is stay vocal. Bethesda and Capcom are already doing damage control, promising that the final implementation will be "under artist control."

Check your Nvidia App beta settings on March 31st: DLSS 5 itself is a fall release, but the new DLSS 4.5 Dynamic Multi Frame Generation drops that day. It'll give you a taste of how aggressive Nvidia is getting with AI-driven smoothness before the "neural rendering" era truly begins.

Ava Campbell

A dedicated content strategist and editor, Ava Campbell brings clarity and depth to complex topics. Committed to informing readers with accuracy and insight.