dellish,

Perhaps this is a good place to ask now that the topic has been raised. I have an ASUS TUF A15 laptop with an nVidia GTX 1650 Ti graphics card and I am SO sick of 500MB driver “updates” that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?

vivadanang,

have an ASUS TUF A15 laptop with an nVidia GTX 1650 Ti graphics card and I am SO sick of 500MB driver “updates” that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?

in a laptop? practically none. there are some very rare ‘laptops’ out there - really chonk tops - that have full-size desktop gpus inside them. the vast majority, on the other hand, have ‘mobile’ versions of these gpus that are basically permanently attached to the laptop’s motherboard (if not soldered directly onto the mobo itself).

one example of a laptop with a full-size gpu (legacy, these aren’t sold anymore): www.titancomputers.com/…/m151.htm note the THICK chassis - that’s what you need to hold a desktop gpu.

dellish,

Well that sucks, but unfortunately I’m not too surprised.

gazab,

You could use a separate external GPU if you have Thunderbolt ports. It’s not cheap and you sacrifice some performance, but it’s worth it for the flexibility in my opinion. Check out egpu.io

chemsed,

In my experience, AMD is not more reliable with updates. I had to do three clean installs to get my RX 6600 working properly, and months later I have a freezing issue that may be caused by my GPU.

excel,

It would help if they had any competitors. AMD and Intel aren’t cutting it.

Litany,

My AMD card is great.

Bratwurstboy,

Pretty damn happy with my 7900XTX too.

skizzles,

Swapped over to a 7800XT about 3 months ago. Better Linux performance, and I tested a bit on Windows too and it worked fine. I’m more than satisfied with my decision to hop over from my 3060.

Bratwurstboy,

Yeah, same here. I switched from a 3080 to a 7900 XTX and couldn’t be happier. FPS doubled in some games, it only needs two 8-pin connectors and I really like the Adrenalin software. Haven’t tested Linux yet but I am going to soon.

skizzles,

The Linux driver isn’t the most straightforward to install, but it’s not difficult. You install the installer package first, then use that installer to install the driver itself.

Only caveat is you don’t get the Adrenalin software, but it’s kind of a moot point as it wouldn’t work anyway due to how Linux works.

I’m running Ubuntu 22.04. I was using GNOME but switched over to KDE, stopped getting crashes in certain games, and got a small performance increase. I suppose that’s due to dropping Wayland for X when I moved to KDE.
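If you want to double-check which session type and which kernel GPU driver you actually ended up on after a desktop switch, something like this works. It's a rough sketch reading standard Linux interfaces only (XDG_SESSION_TYPE and /proc/modules), not part of AMD's tooling:

```python
# Quick check of the display session type and loaded kernel GPU driver.
# Only standard Linux interfaces are read; this is a sketch, not AMD tooling.
import os

def session_type() -> str:
    # Typically "wayland" or "x11" on desktop systems
    return os.environ.get("XDG_SESSION_TYPE", "unknown")

def loaded_gpu_modules() -> list[str]:
    candidates = {"amdgpu", "radeon", "nouveau", "nvidia", "i915"}
    with open("/proc/modules") as f:
        loaded = {line.split()[0] for line in f}
    return sorted(candidates & loaded)

print("session:", session_type())
print("gpu kernel modules:", ", ".join(loaded_gpu_modules()) or "none detected")
```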

Cqrd,

AMD is definitely pulling their weight, but more competitors are always better.

CaptPretentious,

Intel is definitely catching up.

UnspecificGravity,

For the vast majority of customers that aren’t looking to spend close to a grand for a card that is infinitesimally better than a card for half the price, AMD has plenty to offer.

Fridgeratr,

AMD is absolutely cutting it!! They may not do DLSS or ray tracing as well, but their cards still kick ass

the_q,

Or ever.

zoe,

Just 10-15 years at least, for smartphones/electronics overall too. Process nodes are harder to shrink now, more than ever. Holding on to my 12nm ccp phone like there is no tomorrow …

RizzRustbolt,

freezes

stands there with my credit card in my hand while the cashier stares at me awkwardly

gnuplusmatt,

As a Linux gamer, this really wasn’t on the cards anyway

BCsven,

AMD is a better decision, but my nVidia works great with Linux. I’m on openSUSE, and nVidia hosts their own openSUSE drivers, so it works from the get-go once you add the nVidia repo

gnuplusmatt,

I had an nvidia 660 GT back in 2013. It was a pain in the arse being on a leading-edge distro; it used to break xorg for a couple of months every time there was an xorg release (which admittedly is really rare these days since it’s in sunset mode). Buying an AMD was the best hardware decision: no hassles, and I’ve been on Wayland since Fedora 35.

CeeBee,

A lot has changed in a decade.

gnuplusmatt,

yeah no, I don’t want to be fucking with my machine just because I want to run a modern display server. I want my driver as part of my system. Until NV can get out of their own way and match the AMD experience (or even Intel), not interested

lowmane,

Laughs in dual 3090s on Linux coming from 5x 1070tis

gnuplusmatt,

Laughs at dual 3090s on Linux

That sounds like a hassle

lowmane,

It’s not at all. You have a dated notion of the experience of the past few years with an nvidia gpu

gnuplusmatt,

dated notion of the experience

Do I still have to load a module that taints my kernel and could break due to ABI incompatibility? Does Wayland work in an equivalent manner to the in-kernel drivers that properly support GBM?
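For what it’s worth, the taint question is easy to answer on any given box. A minimal sketch reading the kernel’s standard /proc interface; the bit meanings are taken from the kernel’s tainted-kernels documentation:

```python
# Check whether the running kernel is tainted by a proprietary or
# out-of-tree module. /proc/sys/kernel/tainted is a standard interface;
# bit 0 = proprietary module loaded, bit 12 = out-of-tree module loaded.
TAINT_PROPRIETARY_MODULE = 1 << 0
TAINT_OOT_MODULE = 1 << 12

with open("/proc/sys/kernel/tainted") as f:
    taint = int(f.read().strip())

print("taint mask:", taint)
print("proprietary module loaded:", bool(taint & TAINT_PROPRIETARY_MODULE))
print("out-of-tree module loaded:", bool(taint & TAINT_OOT_MODULE))
```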

GarytheSnail,

All three cards are rumored to come with the same memory configuration as their base models…

Sigh.

Fungah,

Give us more fucking vram you dicks.

CanadianCarl,

I have 12GB of VRAM, do I need more?

state_electrician,

Only slightly related question: is there such a thing as an external nVidia GPU for AI models? I know I can rent cloud GPUs but I am wondering if long-term something like an external GPU might be worth it.

AnotherDirtyAnglo,

Generally speaking, buying outright is cheaper than renting in the long run, because you can keep running the device for years, or sell it to reclaim some capital.
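A back-of-the-envelope break-even makes that concrete. Every number below is a hypothetical placeholder, not a quote; plug in an actual card price and an actual cloud rate:

```python
# Rough break-even point for buying a GPU outright vs renting cloud GPU time.
# All numbers are hypothetical placeholders for illustration only.
def breakeven_hours(purchase_price: float, resale_value: float,
                    power_cost_per_hour: float, cloud_rate_per_hour: float) -> float:
    """Hours of use after which owning is cheaper than renting."""
    net_ownership_cost = purchase_price - resale_value
    savings_per_hour = cloud_rate_per_hour - power_cost_per_hour
    return net_ownership_cost / savings_per_hour

# e.g. $800 used card, $300 eventual resale, ~$0.05/h electricity,
# ~$1.00/h for a comparable cloud GPU instance
print(f"break-even after ~{breakeven_hours(800, 300, 0.05, 1.00):.0f} hours")  # ~526
```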

baconisaveg,

A 3090 (used) is the best bang for your buck for any LLM / StableDiffusion work right now. I’ve seen external GPU enclosures, though they probably cost as much as slapping a used 3090 into a barebones rig and running it headless in a closet.
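If you do go the headless-rig-in-a-closet route, it’s worth confirming what the box actually exposes before queueing jobs. A minimal sketch, assuming a PyTorch build with CUDA support, using only standard torch.cuda calls:

```python
# List visible CUDA devices and their VRAM before launching LLM /
# Stable Diffusion jobs on a headless machine. Assumes PyTorch with CUDA.
import torch

if not torch.cuda.is_available():
    print("no CUDA device visible")
else:
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"cuda:{i} {props.name}: {props.total_memory / 1024**3:.1f} GiB VRAM")
```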

Kit,

Meh I’m still gonna buy a 4070 Ti on Black Friday. Wish I could wait but my other half wants a PC for Christmas.

Norgur,

Thing is: there is always the "next better thing" around the corner. That's what progress is about. The only thing you can do is choose the best available option for you when you need new hardware and be done with it until you need another upgrade.

Sigmatics,

Exactly. The best time to buy a graphics card is never

massive_bereavement,

Graphics card. Not even once.

RizzRustbolt,

Real gamers use ayahuasca.

jmcs,

It depends on what you need. I think you can usually get the best bang for your buck by buying the previous generation when the new one is released.

miketunes,

Yup, just picked up a whole PC with an RTX 3090 for $800.

kerrigan778,

New or used?

miketunes,

Used

khaliso,

On what platform? eBay?

Datto,

Best Buy has refurbished PCs with 4070s for under a grand right now.

miketunes,

Yeah eBay Canada, was a great deal

wrath_of_grunge,

really my rule of thumb has always been to upgrade when it's a significant upgrade.

for a long time i didn't really upgrade until it was a 4x increase over my old card. certain exceptions were occasionally made. nowadays i'm a bit more opportunistic in my upgrades, but i still seek out 'meaningful' upgrades - ones that are a decent jump over the old, typically a 50% improvement in performance, or upgrades i can get for really cheap.

schmidtster,

4x…? Even with older cards that’s more than a decade between cards.

A 4080 is only 2.5x as powerful as a 1080 Ti, and those are 5 years apart.
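Taking those two figures at face value, the implied pace works out roughly like this (pure extrapolation from the numbers above, not a benchmark claim):

```python
# If a 4080 is ~2.5x a 1080 Ti after 5 years, how long would a 4x jump take
# at that same pace? Just extrapolating the figures quoted above.
import math

speedup = 2.5      # 1080 Ti -> 4080, per the comment above
years = 5
annual_rate = speedup ** (1 / years)                 # ~1.20x per year
years_to_4x = math.log(4) / math.log(annual_rate)    # ~7.6 years
print(f"~{annual_rate:.2f}x per year, so a 4x jump takes ~{years_to_4x:.1f} years")
```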

Sigmatics,

What’s wrong with upgrading once every 5-10 years? Not everyone plays the latest games on 4k Ultra

Admittedly 4x is a bit steep, more like 3-4x

schmidtster,

Starfield requires a minimum of a 1070 Ti to play. It’s not just about fidelity; you just wouldn’t be able to play any newer games.

joelfromaus,

I had a 1080ti and the only game that really gave me grief playing on high settings was Starfield. I’m not saying older cards won’t have problems playing newer games but I am saying all cards have problems playing Starfield.

wooki,

Cries in 970

SheDiceToday,

Dude, there’s dozens of us!

hydroel,

Yeah, it’s always like that: “I want to buy the new shiny thing! But it’s expensive, so I’ll wait a while for its price to come down.” You wait a while, the price comes down, you buy the new shiny thing, and then out comes the newest shiny thing.

Norgur,

Yep. There will always be "just wait N months and there will be the bestest thing that beats the old bestest thing". You are guaranteed to get buyer's remorse when shopping for hardware. Just buy what best suits your needs and budget at the time you decide is the best time for you (or at the time your old component bites the dust), and then stop looking at any development on those components for at least a year. Just ignore any deals, new releases, whatever, and be happy with the component you bought.

alessandro,

choose the best available option

That’s “the” point. Which is the best available option?

The simplest answer would be “price per fps”.
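As a toy example of that metric (made-up prices and made-up average fps, purely to show the arithmetic):

```python
# "Price per fps" as a quick comparison metric. Both cards and all numbers
# here are hypothetical.
cards = {
    "Card A": {"price": 500, "avg_fps": 100},
    "Card B": {"price": 800, "avg_fps": 130},
}
for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['avg_fps']:.2f} per fps")
# Card A: $5.00 per fps, Card B: ~$6.15 per fps -> A wins on this metric alone
```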

Norgur,

Not always. I'm doing a lot of rendering and such. So FPS aren't my primary concern.

AeroLemming,

You have a magical button. If you press it now, you will get $100 and it will disappear. Every year you don’t press it, the amount of money you will get if you do press it goes up by 20%. When should you press the button? At any given point in time, waiting just one more year adds an entire 20% to your eventual prize, so it never makes sense to press it, but you have to eventually or you get nothing.

Same thing with graphics cards.

SkyeStarfall,

Once you need it, or, alternatively, once you have enough to live comfortably for the rest of your life. It’s exponential growth, you only get one chance, just gotta decide what your goal with the money actually is.

AeroLemming,

Yep. My point is that there’s no easily calculable, mathematically “correct” moment to push the button. Same goes for buying a graphics card.

Bizarroland,

Is it compound or straight percentage?

Cuz if it's just straight percentage then it's $20 a year, whereas if it is compound then it's a 2X multiplier roughly every four years.

AeroLemming,

Compound, which more closely models the actual rate at which computing power has grown over the years.

Bizarroland,

So if I waited roughly 50 years then I would get $1 million...

AeroLemming,

Or you could wait 70 years and leave 34 million to people in your will… The point is that there is no mathematically correct choice.

Bizarroland,

I think I've got about 77 years left in me, unless somebody comes along and kills me, that is.

That at least would be $125 million which isn't too shabby. I find it hard to believe that anybody would say that $125 million 77 years from now would not be a considerable amount of money.
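Running the thread's premise ($100 compounding at 20% a year) through a quick script lines up with those figures:

```python
# The button thought experiment from above: $100 growing 20% per year,
# evaluated at the horizons mentioned in the thread.
def button_value(years: int, start: float = 100.0, rate: float = 0.20) -> float:
    return start * (1 + rate) ** years

for y in (50, 70, 77):
    print(f"after {y} years: ${button_value(y):,.0f}")
# after 50 years: ~$0.9 million
# after 70 years: ~$35 million
# after 77 years: ~$125 million
```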

Sigmatics,

Press it before you retire

Same with graphics cards

nik282000,

I bought a 1080 for my last PC build, downloaded the driver installer and ran the setup. There were ads in the setup for the 20 series that had launched the day before. FML

Norgur,

Yep. I bought a 4080 just a few weeks ago. Now there are ads for the refresh all over... Thing is: your card didn't get any worse. You thought the card was a good value proposition for you when you bought it, and it hasn't lost any of that.

Schmuppes,

Major refresh means what nowadays? 7 instead of 4 percent gains compared to the previous generation?

massive_bereavement,

For anything ML related, having the additional memory is worth the investment, as it allows for larger models.

That said, at these prices it raises the question of whether it’s more sensible to just throw money at GCP or AWS for their GPU node time.
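As a rough illustration of why the extra memory matters: the sizing rule of thumb below (bytes per parameter by precision, plus ~20% overhead) is an assumption, and 24 GB is simply a 3090/4090-class card's VRAM, not a claim about the refresh cards:

```python
# Will a given model's weights fit in a card's VRAM? Very rough estimate only.
GiB = 1024 ** 3

def model_vram_gib(n_params: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Estimated VRAM to hold the weights, with ~20% overhead."""
    return n_params * bytes_per_param * overhead / GiB

card_gib = 24  # e.g. a 3090/4090-class card
for name, params, bpp in [("7B fp16", 7e9, 2), ("13B fp16", 13e9, 2),
                          ("13B 4-bit", 13e9, 0.5), ("70B 4-bit", 70e9, 0.5)]:
    need = model_vram_gib(params, bpp)
    verdict = "fits" if need <= card_gib else "does not fit"
    print(f"{name}: ~{need:.0f} GiB -> {verdict} in {card_gib} GiB")
```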

NOT_RICK,

The article speculates a 5% gain for the 4080 Super but a 22% gain for the 4070 Super, which makes sense because the base 4070 was really disappointing compared to the 3070.

vxx,

Will the price be the same or up to 22% more expensive?

NOT_RICK,

You’ll pay 30% more for the honor of owning a 4 series

LemmyIsFantastic,

If the Super is even remotely reasonably priced I’ll be jumping on the 4080. I’ll finally get close to a consistent 4K60.

NOT_RICK,

Don’t hold your breath, buddy

MudMan,

I miss that small time window where maxing out games and not having to tweak and tune was a thing.

Is 4K60 the goal? Because I have a bunch of 120Hz displays, so... 4K60? Or what about 1440p120? Or maybe you can split the difference and try to get 90-ish at upscaled 4K and the VRR will eat the difference. And of course I have handhelds so those are a separate performance target altogether.

These days you are tuning everything no matter what unless you're running... well, a game from that era when 1080p60 was the only option.

BaroqueInMind,

You'll have your dream come true when consumers are told to upgrade their televisions after the next generation of game consoles mandates it as their next new shiny feature.

LemmyIsFantastic,

I have a C1. I actually target 4K120, but that’s not going to happen consistently even on a 4090. Your seething is a lol.

sederx,

i saw a 4080 on amazon for $1200, shit's crazy

joneskind,

It really is a risky bet to make.

I doubt a full-price RTX 4080 SUPER will be worth it over a discounted regular RTX 4080.

SUPER upgrades have never crossed +10%.

I’d rather wait for the Ti version

anonymoose,

I’m looking to get a 4090 this Black Friday, and even with these refreshes, it doesn’t seem like my purchasing decision would really be affected, unless they’re also refreshing the 4090.

wrath_of_grunge,

really the RTX 4080 is going to be a sweet spot in terms of performance envelope. that's a card you'll see with some decent longevity, even if it's not being recognized as such currently.

joneskind,

It will depend on the power upgrade offered by the 50XX cards and the game development studios’ appetite for more power.

But TBH I don’t see Nvidia being able to mass-produce a chip twice as fast without increasing its price again.

Meaning nobody will get the next gen’s most powerful chip, game devs will have to take that into account, and the RTX 4080 will stay relevant for a longer time.

Besides, according to SteamDB, most gamers still have an RTX 2080 or a less powerful GPU. Devs won’t sell their games if you can’t play them decently on those cards.

The power gap between high-end GPUs is growing exponentially. It won’t stay sustainable for very long.

Outtatime,

I’m so sick of Nvidia’s bullshit. My next system will be AMD just out of spite. That goes for processors as well

dojan,

I went with an AM5 and an Intel Arc GPU. Quite satisfied, the GPU is doing great and didn’t cost an arm and a leg.

Nanomerce,

How is the stability in modern games? I know the drivers are way better now, but more samples are always welcome.

dojan,

Like, new releases? I don’t really play many new games.

Had Baldur’s Gate III crash once, and that’s the newest title I’ve played.

Other than that I play Final Fantasy XIV, Guild Wars 2, The Sims and Elden Ring, never had any issues.

Vinny_93,

Considering the price of a 4070 vs the 7800XT, the 4070 makes a lot more sense where I live.

But yes, the way AMD keeps their software open (FSR, FreeSync) and puts DisplayPort 2.1 on their cards creates a lot of goodwill with me.

Cagi, (edited)

The only thing giving me pause about ATI cards is that their ray tracing is allegedly visibly worse. They say next gen will be much better, but we shall see. I love my current non-ray-tracing card, an RX 590, but she’s getting a bit long in the tooth for some games.

limitedduck,

ATI

“Now that’s a name I’ve not heard in a long time”

be_excellent_to_each_other,

I have to admit I still tend to call them that, too. Old-timers, I guess.

The first GPU I remember being excited to pop into my computer and run was a Matrox G400 Max. Damn I'm old.

Cagi,

I would have been so jealous. Being able to click “3D acceleration” felt so good when I finally upgraded. But I was 12, so my dad was in charge of PC parts. Luckily he was kind of techy, so we got there. The day I could finally run Jedi Knight: Dark Forces II with max settings is a day I’ll never forget, for some reason, lol.

Cagi,

Not since, oh before most of Lemmy was born. I’m old enough to remember when Nvidia were the anti-monopoly good guys fighting the evil Voodoo stranglehold on the industry. You either die a hero or you live long enough to see yourself become the villain.

PenguinTD,

yeah, that’s pretty much why I stopped buying Nvidia after the GTX 1080. CUDA was bad in terms of their practices, but not that impactful, since OpenCL etc. could still be tuned to similar performance; it’s just that software developers and researchers love free support/R&D/money to progress their goals. They’re willing to be the minions, and I can’t ask them not to take the free money. But RTX and then tensor cores are where I draw the line, since their patents and implementations do actual harm in the computer graphics and AI research space - though I guess it was a bit too late. We’re already seeing the results, and Nvidia is making bank with that advantage. They’re essentially applying the Intel playbook but doing it slightly differently: they don’t buy the OEM vendors, they “invest” in software developers and researchers so they use Nvidia’s closed tech. Now everyone pays the premium when buying RTX/AI chips from Nvidia, and the capital boom from AI will make the gap hard for AMD to close. After all, R&D requires lots of money.

CaptainEffort,

That’s exactly why I’ve been using AMD for the past 2 years. Fuck Nvidia

kureta,

The only thing keeping me is CUDA, and there’s no replacement for it. I know AMD has I-forgot-what-it’s-called, but it’s not a realistic option for many machine learning tasks.
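For reference, the AMD stack in question is ROCm, with HIP as the CUDA-like layer. PyTorch's ROCm builds reuse the torch.cuda namespace, so a basic availability check looks the same on either vendor; whether a specific ML workload actually runs well there is the real question. A minimal sketch, assuming a PyTorch install with CUDA or ROCm support:

```python
# Basic accelerator check that works on both CUDA and ROCm builds of PyTorch,
# since ROCm devices are exposed through the same torch.cuda API.
import torch

print("accelerator available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
    # torch.version.hip is set on ROCm builds; torch.version.cuda on CUDA builds
    backend = "ROCm/HIP" if getattr(torch.version, "hip", None) else "CUDA"
    print("backend:", backend)
```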
