Highly anticipated: As the unveiling of consumer Blackwell draws near, a clearer picture of Nvidia’s next-generation graphics cards is beginning to materialize. The new lineup’s flagship will undoubtedly set new performance benchmarks, but the latest information suggests it will also use one of the biggest chips in Nvidia’s history.

Trusted leaker “MEGAsizeGPU” recently claimed that Nvidia’s upcoming GB202 graphics processor, which will power the GeForce RTX 5090, uses a 24 mm × 31 mm die. If the report is accurate, it might support earlier rumors claiming the graphics card will retail for nearly $2,000.

A 744 mm² die would make the GB202 roughly 20 percent larger than the RTX 4090’s 619 mm² AD102 GPU. It would also be the company’s largest die since the TU102, which measured 754 mm² and served as the core of the RTX 2080 Ti, released in 2018.
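For anyone checking the math, those comparisons reduce to simple arithmetic. Here is a minimal Python sketch using only the dimensions and areas quoted above (the GB202 figure is rumored, not confirmed):

    # Die-size arithmetic based on the figures quoted in this article.
    gb202_area = 24 * 31   # rumored GB202 die: 24 mm x 31 mm = 744 mm^2
    ad102_area = 619       # RTX 4090's AD102, in mm^2
    tu102_area = 754       # RTX 2080 Ti's TU102, in mm^2

    growth = 100 * (gb202_area - ad102_area) / ad102_area
    print(f"GB202 area: {gb202_area} mm^2")        # 744 mm^2
    print(f"Larger than AD102 by {growth:.0f}%")   # ~20%
    print(f"Smaller than TU102: {gb202_area < tu102_area}")  # True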

  • .Donuts@lemmy.world · 23 points · 27 days ago

    it might support earlier rumors claiming the graphics card will retail for nearly $2,000

    This is crazy. My 980 Ti seemed expensive at the time, and it sold for $649. The 3090 was $1,499 and the 4090 $1,599 (MSRP).

    • dinckel@lemmy.world · 8 points · 27 days ago

      People continue to buy them, so Nvidia will continue selling them. On top of that, gaming isn’t even their biggest market, so even if we stopped buying, it wouldn’t put a dent in their business.

      It’s a big shame though

    • tleb@lemmy.ca · 7 points · 27 days ago

      I think Nvidia is poisoning the well with the $2000 rumours so that they’re the heroes when it releases at ~$1750

    • zod000@lemmy.ml · 3 points · 27 days ago

      It is no coincidence that the 980 Ti was the last top-end GPU I purchased from Nvidia. Their greed is out of control, and I can’t believe a meaningful number of people have gone along with it.

    • Bakkoda@sh.itjust.works · 2 points · 27 days ago

      I’m still rocking my 2080 Ti, and I think I paid just over $1k for it. I won’t be buying Nvidia again anyway, but the prices just put me off so much.

          • SturgiesYrFase@lemmy.ml · 1 point · 26 days ago

            They’re still perfectly functional. As long as you’re happy not having ray tracing, and are willing to settle for not running everything on max gfx, there’s no reason to buy a 50, 40 or 30 series. I’m able to play most new games that interest me on medium gfx with no issues.

  • Cowbee [he/they]@lemmy.ml · 11 points · edited 27 days ago

    My next PC will be small, Linux-based (probably NixOS), and focused on power efficiency. AAA gaming more often than not loses to indies, and there is already a bigger library of PC games that run well on even modest “modern” specs than any one person could ever finish.

    I’ve been playing CrossCode, and it’s been such a stellar game. Meanwhile, titles like S.T.A.L.K.E.R. 2 are less feature-complete and far buggier than S.T.A.L.K.E.R. Anomaly and GAMMA (which don’t require supercomputers to run, and are free).

  • B0rax@feddit.org · 7 points · 27 days ago

    The 90 series is always overpriced and inefficient. Tell me about the 80 series.

    • CausticFlames · 5 points · 27 days ago

      I wish they hadn’t done away with the Titan naming. The second the 90 SKU was released, the Titan died, and I feel painting it as just another tier of consumer-grade card is disingenuous.