DingoBilly,

What’s going on? It’s overpriced and completely unnecessary for most people. There’s also a cost of living crisis.

I play every game I want to on high graphics with my old 1070. Unless you’re working on very graphically intensive apps or you’re a PC master race moron, there’s no need for new cards.

n3m37h,

It was a night and day difference going from a 1060 6GB to a 6700 XT. The prices are still kinda shit, but that goes for everything.

chiliedogg,

I still game 1080p and it looks fine. I’m not dropping 2500 bucks to get a 4k monitor and video card to run it when I won’t even register the difference during actual gameplay.

randon31415,

Is this the one they nerfed so they could sell them in China and get around the US AI export rules?

potustheplant,

Nope, that’s the 4090.

hark,
@hark@lemmy.world avatar

So many options, with small differences between them, all overpriced to the high heavens. I’m sticking with my GTX 1070 since it serves my needs and I’ll likely keep using it a few years beyond that out of spite. It cost $340 at the time I bought it (2016) and I thought that was somewhat overpriced. According to an inflation calculator, that’s $430 in today’s dollars.
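For anyone curious about the arithmetic, here’s a minimal sketch of that inflation adjustment (the CPI figures below are approximate and only illustrative):

```python
# Rough inflation adjustment: scale the old price by the ratio of CPI values.
# CPI figures are approximate US CPI-U annual averages, for illustration only.
cpi_2016 = 240.0
cpi_2024 = 313.0

price_2016 = 340.00  # what the GTX 1070 cost at purchase
price_today = price_2016 * (cpi_2024 / cpi_2016)

print(f"${price_2016:.0f} in 2016 is roughly ${price_today:.0f} today")
# -> roughly $443, the same ballpark as the ~$430 the inflation calculator gave
```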

dangblingus,

1060 6gb here. Loving life atm.

Blackmist,

It’ll do for the few PC games I play. FFXIV doesn’t need much to run. It even handles HL Alyx.

systemglitch,

Ditto. It’s a great card and I don’t feel I’m missing out over what newer cards offer.

Dra, (edited )

I haven’t paid attention to GPUs since I got my 3080 on release day back in Covid.

Why has the acceptable level of VRAM suddenly doubled vs 4 years ago? I don’t struggle to run a single game on max settings at high frame rates at 1440p, so what’s the benefit that justifies the cost of 20 GB of VRAM outside of AI workloads?

rndll,

GPU rendering and AI.

Asafum,

Lmao

We have your comment: what am I doing with 20 GB of VRAM?

And one comment down: it’s actually criminal there’s only 20 GB of VRAM.

Dra,

Lol

AlijahTheMediocre,

If only game developers optimized their games…

The newest hardware is getting powerful enough that devs are banking on people just buying better cards to play their games.

Obi,
@Obi@sopuli.xyz avatar

Personally I need it for video editing & 3D work but I get that’s a niche case compared to the gaming market.

Eccitaze,
@Eccitaze@yiffit.net avatar

An actual technical answer: apparently, it’s because while the PS5 and Xbox Series X are technically regular x86-64 architecture, they have a design that lets the GPU and CPU share a single pool of memory with no loss in performance. That makes it easy to allocate a shitload of RAM to the GPU for storing textures very quickly. But it also means that as the games industry shifts from developing for the PS4/Xbox One X first (both of which have separate pools of memory for CPU and GPU) to the PS5/XSX first, VRAM requirements are spiking, because it’s a lot easier to port to PC if you just keep the assumption that the GPU can handle storing 10-15 GB of texture data at once instead of refactoring your code to reduce VRAM usage.
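A tiny sketch of the budget math that explanation implies (purely illustrative numbers and function names, not any real engine API):

```python
# Toy illustration of the console-first memory assumption described above.
# All numbers and names are made up for the example.
CONSOLE_TEXTURE_BUDGET_GB = 12  # what a PS5/XSX title can park in its shared 16 GB pool

def fits_in_vram(gpu_vram_gb: int, texture_budget_gb: int = CONSOLE_TEXTURE_BUDGET_GB) -> bool:
    # On PC the GPU only has its own VRAM; anything over spills to system RAM
    # over PCIe, which is where the stutter and texture pop-in come from.
    return texture_budget_gb <= gpu_vram_gb

for vram in (8, 12, 16):
    verdict = "fits" if fits_in_vram(vram) else "spills over PCIe (expect stutter)"
    print(f"{vram} GB card: a {CONSOLE_TEXTURE_BUDGET_GB} GB console-style texture budget {verdict}")
```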

Dra,

Perfect answer, thank you!

Blackmist,

Current gen consoles becoming the baseline is probably it.

As games running on last gen hardware drop away, and expectations for games rise above 1080p, those Recommended specs quickly become an Absolute Minimum. Plus I think RAM prices have tumbled as well, meaning it’s almost Scrooge-like not to offer 16GB on a £579 GPU.

That said, I think the pricing is still much more of an issue than the RAM. People just don’t want to pay these ludicrous prices for a GPU.

Space_Racer,
@Space_Racer@lemm.ee avatar

I’m maxed on VRAM in VR for the most part with a 3080. It’s my main bottleneck.

Hadriscus,

Perhaps not the biggest market, but consumer cards (especially Nvidia’s) have been the preferred hardware in the offline rendering space, i.e. animation and VFX, for a good few years now. They’re the most logical investment for freelancers and small-to-mid studios thanks to hardware ray tracing. CUDA and later OptiX may be anecdotal on the gaming front, but they completely changed the game over here.

mlg,
@mlg@lemmy.world avatar

insert linus torvalds nvidia clip here

trackcharlie,

Less than 20 GB of VRAM in 2024?

The entire 40-series lineup should be used as evidence against Nvidia in a lawsuit over the intentional creation of e-waste.

BorgDrone,

The real tragedy is that PCs still have to make do with discrete graphics cards that have separate VRAM.

Kazumara,

$600 for a card without 16 GB of VRAM is a big ask. I think getting an RX 7800 XT for $500 will serve you well for longer.

NIB, (edited )

12 GB of VRAM is not a bottleneck in any current game on reasonable settings. There is no playable game/settings combination where a 7800 XT’s 16 GB offers any advantage. Or do you think a 15 fps average is more playable than a 5 fps average (because the 4070 Super is VRAM-bottlenecked)? Is this indicative of future potential bottlenecks? Maybe, but I wouldn’t be so sure.

The 4070 Super offers significantly superior ray tracing performance, much lower power consumption, superior upscaling (and frame generation) technology, better streaming/encoding stuff, and even slightly superior rasterization performance compared to the 7800 XT. Are these things worth sacrificing for €100 less and 4 GB more VRAM? For most people they aren’t.

AMD’s offerings are competitive, not better. And the internet should stop sucking their dick, especially when most of the internet, including tech-savvy people, don’t even use AMD GPUs. Hell, LTT even made a series of videos about how they had to “suffer” using AMD GPUs, yet they usually join the Nvidia-shitting circlejerk.

I have an AMD RX 580 and have bought and recommended AMD GPUs to people since the 9500/9700 Pro series. But my next GPU will almost certainly be an Nvidia one. The only reason people are complaining is that Nvidia can make a better GPU (as shown by the 4090) but chooses not to, while AMD literally can’t make better GPUs and chooses to only “competitively” price theirs instead of offering something better. Both companies suck.

YeetPics,
@YeetPics@mander.xyz avatar

$100 less IS the advantage.

NIB,

It’s not enough though, and the sales are showing it. The 7800 XT is a decent card, but it isn’t an amazing offer, just a good one. For some people it’s a slightly better value-for-money option. But those Nvidia things have value too, so the value proposition isn’t as clear-cut, even though it should be considering that AMD is behind.

The Steam stats should tell you what consumers think. And while consumers are not infallible, they are a pretty good indicator. The most popular AMD card is the RX 580, which is arguably one of the best cards of all time. Except it came out 6 years ago. Did AMD have better marketing back then? No. Did they have the performance crown? Nope. But that didn’t stop the 580 from being an amazing card.

The 7800 XT could have been the new 580: a mid/high-end card with decent VRAM. Except you could get the 580 for €200, while the 7800 XT costs literally three times as much. When your “good” card is that expensive, customers have higher expectations. It isn’t just about running games well (cheaper cards can do that too), it’s about luxury features like ray tracing and upscaling tech.

Imagine if the 7800 XT were €400. We wouldn’t even be having this conversation. But it isn’t. In fact, in Europe it launched at basically the same price as a 4070. Even today, it is only €50-80 cheaper. If Nvidia is scamming us with inferior offers, why aren’t AMD’s offers infinitely better in value? Because AMD is also scamming us, just very slightly less so.

YeetPics,
@YeetPics@mander.xyz avatar

I disagree.

daq,

$100 sure feels much more solid than RTX, which a ton of games don’t even support. There are a bunch of people who just want to play in 4K and couldn’t care less about features you call luxury.

That requires more VRAM, and the 7800 XT and XTX deliver it perfectly.

potustheplant,

A ton? Try “most”.

Caitlyynn,
@Caitlyynn@lemmy.blahaj.zone avatar

Found the Nvidia fanboy

daq,

D4 on Linux. Literally the only bottleneck is that it eats 11 GB of my 1080 Ti’s VRAM for breakfast and then still wants lunch and dinner. It plays at 4K on high with perfect fps otherwise, then starts glitching like crazy once VRAM is exhausted after 10-15 minutes.

Zero issues on a 20 GB card. I understand that shitty code in a single game is not exactly a universal example, but it is a valid reason to want more VRAM.

Kazumara,

Is this indicative of future potential bottlenecks? Maybe, but I wouldn’t be so sure.

This is exactly what I expect. I saw what happened to my friends with their GTX 970s when 3.5 GB of VRAM wasn’t enough anymore. Even though the cards were still rasterizing quickly enough, they weren’t useful for certain games anymore. So these days I make sure I get enough VRAM to extend the useful service life of my cards.

And I’m not just talking about buying AMD, I actually do buy them. I first had the HD 5850 with 1 GB, then got my friend’s HD 5870, also with 1 GB (I don’t remember if I used it in CrossFire or just replaced the 5850). Then two of my friends each sold me their HD 7850 with 2 GB for cheap and I ran CrossFire, then I bought a new R9 380 with 4 GB when a game that was important to me at the time couldn’t deal with CrossFire well, then a used RX 580 with 8 GB, and finally the RX 6800 with 16 GB two years ago.

At some point I also bought a used GTX 960 because we were doing some CUDA stuff at university, but that was pretty late, when they weren’t current anymore, and it was only used in my Linux server.

Anti_Face_Weapon,

But my rtx :(

altima_neo,
@altima_neo@lemmy.zip avatar

The RAM is so lame. It really needed more.

Performance exceeding the 3090, but limited by 12 gigs of VRAM.

Binthinkin,

You all should check prices comparing dual-fan 3070s to 4070s; they’re a $40 difference on Amazon. Crazy to see. They completely borked their pricing scheme trying to get whales and crypto miners to suck their 40 series dry, and wound up getting blue-balled hard.

Aren’t they taking the 4080 completely off the market too?

TheGrandNagus,

Aren’t they taking the 4080 completely off the market too?

Apparently they stopped production of it months ago. Whatever still exists on shelves is only there because nobody has been buying them.

Honestly this has been the worst 80-class Nvidia card ever. The GTX 480 was a complete joke but even that managed to sell ok.

AlexisFR,
@AlexisFR@jlai.lu avatar

Wait, they didn’t put the 4070 super at 16 GB?

Crashumbc,

If I understand correctly, it’s a limitation of the bus width, which they kept the same.
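For context, a rough sketch of the arithmetic behind that (assuming the usual one memory chip per 32-bit channel and 2 GB GDDR6X chips):

```python
# Why a 192-bit bus usually means 12 GB: each GDDR6/6X chip hangs off a 32-bit
# channel, and the common chip density right now is 2 GB. Illustrative only.
def vram_for_bus(bus_width_bits: int, gb_per_chip: int = 2, bits_per_chip: int = 32) -> int:
    chips = bus_width_bits // bits_per_chip
    return chips * gb_per_chip

print(vram_for_bus(192))  # 6 chips x 2 GB = 12 GB (4070 / 4070 Super class)
print(vram_for_bus(256))  # 8 chips x 2 GB = 16 GB (what a wider bus would allow)
```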

AProfessional,

They clearly believe customers will always buy Nvidia over AMD, so why bother competing? Just make an annoyingly segmented lineup.

Kbobabob,

Nope. Even my 3080 Ti has 12 GB. I was waiting for the 4000-series refresh, but I think I’ll just wait and see what the 5000 series looks like.

UnfortunateShort,

Why people no buy our GPU anymore?

Because I can get a whole fucking console for the price of a lower midrange GPU. My only hope is Intel’s Battlemage at this point.

GhostlyPixel,
@GhostlyPixel@lemmy.world avatar

I will hold onto my $700 3080 until it spits fire, cannot believe I was lucky enough to get it at launch.

Kbobabob,

Because you are completely against AMD?

UnfortunateShort,

I have an all-AMD system, but they have become too expensive as well. Just Nvidia with a 20% discount, save for the 7900 XTX, which is completely out of the question for me to begin with.

Kbobabob,

Cheaper Nvidia ain’t bad. This is coming from someone who uses a 3080 Ti and refuses to use AMD GPUs because of shit from way back. I use their processors though, those are amazing; I just wish they had support for Thunderbolt.

jaxxed,

Does Intel even allow AMD to license Thunderbolt? USB might be the better thing to support in the long term.

flatlined,

You can get AMD with Thunderbolt. The motherboards with Thunderbolt headers are bloody expensive, and you’ll need a 200-buck add-in card (which needs to match the motherboard manufacturer, I think), so it’s not exactly cheap, but it is possible.

Kbobabob,

I understand you can shoehorn just about anything you want into a system but that’s not the same as supporting it IMO.

flatlined,

Agreed, and in my experience (Asus board) it’s functional but a bit buggy, so not an easy recommendation. Still, if you want or need team red it’s an option. Price premium sucked, but wasn’t actually noticeably more than if I’d gone team blue. Not sure I’d do it again in hindsight though. Fully functional but only 90% reliable (which is worse than it seems, in the same way a delay of “only” a second every time you do something adds up to a big annoyance) is perhaps not worth it for my use case.

sake,

Yeah, I’m still running a GTX 970, since GPU prices went bonkers right after I bought it. Last generation with a decent price-to-performance balance.

Fuck the market. I’ll just stick with this one until it dies on me.

darkkite,

Yeah, but then you have to play on a console without mods or cheap games.

Try buying a used GPU and gaming on a 1080p monitor and you’ll be able to have great graphics without a lot of money.

sandwichfiend,

@Hypx @technology GPUs are still too expensive for me

CosmoNova,

I mean, yeah, when I’m searching for GPUs I specifically filter out anything with less than 16 GB of VRAM. I wouldn’t even consider buying this card, for that reason alone.

Thorny_Insight,

And here I was thinking upgrading from two 512 MB cards to a GTX 1660 SUPER with 6 GB of VRAM would be good for another 10 years. The heck does someone need 16 gigs for?

barsoap,

AI. But you’re right, my 4 GB 5500 XT is putting up a valiant fight so far, though I kinda dread trying out CP77 again after the big patch; it’s under spec now. It was a mistake to buy that thing in the first place, I should’ve gone with 8 GB, but I just had to be pigheaded about my old “workstation rule”: don’t spend more on the GPU than on the CPU.

pycorax,

Future proofing. GPUs are expensive and I expect to be able to use it for at least the next 7 years, even better if it lasts longer than that.

redditReallySucks,
@redditReallySucks@lemmy.dbzer0.com avatar

Gaming in 4K, or AI (e.g. Stable Diffusion or language models).
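As a rough back-of-the-envelope for the language-model side (ignoring activations and KV cache, so real requirements are higher):

```python
# Very rough VRAM estimate for just the weights of a local LLM.
# Real usage is higher (KV cache, activations, framework overhead); illustrative only.
def weight_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / (1024 ** 3)

for params, precision, bpp in [(7, "fp16", 2), (13, "fp16", 2), (13, "4-bit", 0.5)]:
    print(f"{params}B @ {precision}: ~{weight_vram_gb(params, bpp):.1f} GB for the weights alone")
```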

SPRUNT,

VR uses a lot of RAM.

Treczoks,

And I thought I had the lamest card on the block with my 2 GB…

Crashumbc,

Unless you’re gaming, that’s fine.

But if you want to play newer AAA games (even ones less than 5-8 years old) or use more than 1080p, you’ll need better.

Rakonat,

Nvidia is overpricing their cards and limiting stock, acting like there’s still a GPU shortage from all the crypto bros sucking everything up.

Right now, their competitors are beating them at hundreds of dollars below Nvidia’s MSRP, like for like, with the only true advantage Nvidia has being ray tracing and arguably VR.

It’s possible we’re approaching another shortage with the AI bubble, though for the moment that seems pretty far off.

TL;DR Nvidia is trying to sell cards at twice their value because of greed.

Evilcoleslaw,

They’re beating AMD at ray tracing, upscaling (DLSS vs FSR), VR, and especially streaming (NVENC). For the latter, look at the newly announced beta partnership with Twitch and OBS, which will bring higher-quality transcoding and easier setup only to Nvidia for now, and soon AV1 encoding only to Nvidia (at first, anyway).

The raw performance is mostly there for AMD, with the exception of RT, and FSR has gotten better. But Nvidia is doing Nvidia shit and using the software ecosystem to entrench themselves despite the insane pricing.

altima_neo,
@altima_neo@lemmy.zip avatar

And AI. They’re beating the pants off AMD at AI.

Evilcoleslaw,

True enough. I was thinking more of the gaming use case. But even beyond AI and just a general compute workload they’re beating the pants off AMD with CUDA as well.

mihies,

And they beat AMD in efficiency! I'm (not) surprised that people ignore this important aspect, which matters for noise, heat, and power usage.

MonkderZweite,

Tom’s Hardware did a test; the RX 6800 is the leader there, and the next card, the RTX 3070, is 4.3% worse. Are their newer cards more efficient than AMD’s newer cards?

pycorax,

They seem to be, but honestly this generation hasn’t been very impressive for either team green or team red. I got a 6950 XT last year, and seeing all these new releases has only proven that I made a good investment.

Daveyborn,
@Daveyborn@lemmy.world avatar

Nothing compelling enough for me to hop off of a Titan Xp yet. (Bought the Titan because it was cheaper than a 1070 at the time, thanks to scalpers.)

Crashumbc,

For the 30 series, maybe.

On 40-series power usage, Nvidia destroys AMD.

The 4070 uses WAY less than a 3070… it’s 200 W (220 W for the Super), which is barely more than my 1070’s 170 W.

umbrella,
@umbrella@lemmy.ml avatar

Streaming performance is really good on AMD cards, IME. Upscaling is honestly close and getting closer.

I don’t think better RT performance is worth the big premium or the annoyances Nvidia cards bring. Doubly so on Linux.

genie,

Couldn’t agree more! Abstracting to the general economic case: those hundreds of dollars are a double-digit percentage of the overall cost! A double-digit % cost increase for a single-digit % performance gain doesn’t quite add up, @nvidia :)

Especially with Google going with TPUs for their AI monstrosities, it makes less and less sense at large scale for consumers to pay the Nvidia tax just for CUDA compatibility, especially with the arrival of things like SYCL that help programmers avoid vendor lock-in.

caseyweederman,

Remember when EVGA decided they would rather leave the market entirely than spend one more day working with Nvidia?

dependencyinjection,

Really?

frezik,

Yup. It was something like 90% of their revenue, but 25% of their profit.

AlexisFR,
@AlexisFR@jlai.lu avatar

And now they have 0 revenue and 0 profit.

Daveyborn,
@Daveyborn@lemmy.world avatar

Yeah sadly they weren’t gonna be able to stay the same with their remaining products being expensive niche case motherboards and good power supplies. Hopefully the employees got good gigs elsewhere at least.

FlyingSquid,
@FlyingSquid@lemmy.world avatar

They still exist. However their website also says they’re “America’s #1 NVIDIA partner,” so…

frezik,

They do seem to be winding down operations as a whole, though. It’s a deliberate choice on the owner’s part.

Filthmontane,

Well according to the previous math, they retained 10% of their revenue and 75% of their profits. I know, math is hard.

altima_neo,
@altima_neo@lemmy.zip avatar

They aren’t gonna get far just making keyboards and power supplies, though. They wound down their motherboard line too, I believe. And they let Kingpin go.

Filthmontane,

Companies get by making keyboards and power supplies all the time.

AlexisFR,
@AlexisFR@jlai.lu avatar

They are in the process of closing down this year.

downhomechunk,
@downhomechunk@midwest.social avatar

I wish they had started putting out AMD products. PowerColor just doesn’t feel like a flagship partner the way EVGA was to Nvidia.

shasta,

I would’ve actually switched to AMD if EVGA had.
