FluorideMind,

Ran horribly until I put it on my SSD. Runs flawlessly now.

echo,

if you’re running new games off a hard drive in 2023, todd is absolutely right and you should just spend the 50 bucks on an ssd

AlphaOmega,

SSDs have been the standard for like a decade

echo,

yeah, and the only reason hard drives have been viable for games for so long is that consoles had slow storage, so cross-platform games needed to be able to work with it. pc gamers will literally complain about consoles holding games back, and then complain about games running poorly on their 10-year-old hard drive now that consoles aren’t holding games back anymore.

aksdb,

A 4TB HDD is still significantly cheaper than a 4TB SSD.

TheCrispyDud,

Same, it's pretty much unplayable on an HDD, buttery smooth on my SSD though. I feel like there needs to be a rather loud PSA/notice about it on store pages.

qwesx,

On Steam the system requirements are very clear about this: "SSD Required".

hypelightfly,

It literally lists SSD as a system requirement.

kindenough,

I run Starfield on a 7700X with a 6750XT, 32 GB of RAM, a Samsung SSD and a FreeSync OLED, and it is not flawless at all. It's alright, but framerates are all over the place. In gunfights with many enemies, fps drops and input lag occur, making it hard to aim, for instance. In denser environments it starts to drop frames a la Fallout 4, and in 2023 that is still a big problem, even on recent systems.

My friend runs it on a 7900XTX, and although he has an awesome rig, Starfield is all over the place with fps. No steady frame rates. He can run it 'flawless' though, but that requires more than just an SSD: a very expensive rig as well.

DoucheBagMcSwag,

On Xbox, Starfield won’t even install on a standard HDD. Honestly, this should have been a requirement on PC as well

variants,

it is

Additional Notes: SSD Required

sagrotan,

16 times the detail?

lazyvar, (edited)

Instead of cracking jokes he should improve the piss-poor optimization.

Can’t even render 50fps consistently on a Strix 3090 OC at 1620p (accounting for resolution scale), what a joke.

Edit: Scratch that, it’s even worse: averaging around 40 fps with HUB Quality settings, so not even on Ultra, and my 12900K is nowhere near bottlenecking.

What a joke.

KTVX94,

I did, just buy a new one bro.

wwaxwork,

I can get it to run just fine; it just looks like a game from the 1990s that used only the colors brown, more brown and poop brown. I’m sure it’s a great game, but since the graphics make me throw up 20 minutes in and there are no accessibility options, I will never get to play it.

Toribor,

Lack of native HDR is pretty disappointing. I need to figure out how to get that working, and maybe try a Reshade mod too. I can’t stand wandering around New Atlantis in the day; the contrast is so bad it’s very unpleasant to look at. It should be a beautiful, awe-inspiring future city, but the area right outside the Lodge with the trees is just not great.

NBJack,

Saw that myself for the first time last night. Surprisingly atrocious. And WTH happened to the shadows around it? Did I just visit it on an overcast day?

The trees themselves would be right at home in a game from the early 2000s. Frickin’ Planetside 2, the game infamous for its indestructible trees and graphics from 10 years ago, has better-looking flora assets.

I will also go so far as to say it looks as if the game was designed for HDR, but due to lack of time they just compressed the range, capped it about 10% below the normal maximum to leave some breathing room, and called it a day. Even the flashlight looks washed out at times.
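
For what it's worth, the "compress and cap" being speculated about is easy to picture in code. This is a toy illustration only; the function name and the numbers come from the comment's own guesswork, not from Bethesda:

    // Toy version of a "compressed and capped" tone map: not Bethesda's
    // actual code, just the behavior described above.
    #include <algorithm>
    #include <cstdio>

    // Squash a linear HDR value (0..maxNits) into 0..1 display range,
    // then cap it ~10% below maximum white to leave "breathing room".
    float CompressAndCap(float hdrNits, float maxNits) {
        float compressed = hdrNits / maxNits;
        return std::min(compressed, 0.9f);
    }

    int main() {
        // Even a full-brightness highlight (e.g. the flashlight) never
        // reaches pure white, which is why it can read as washed out:
        std::printf("%.2f\n", CompressAndCap(1000.0f, 1000.0f)); // 0.90
        std::printf("%.2f\n", CompressAndCap(200.0f, 1000.0f));  // 0.20
    }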

frezik,

When there are benchmarks showing 0.1% lows at <40fps at 1080p on a goddamn 4090, no, Todd, the problem is your engine.

Kilamaos,

After some tweaking of settings, the game runs fine. I shouldn’t HAVE to do those things, but it’s doing fine now.

For reference: 3700X, RTX 2070S. The game is heavily GPU-bound. 50-60fps, no lag, no stutter, no whatever; honestly I never noticed losing a couple of fps, and that’s just in crowded cities/areas, elsewhere it’ll go higher.

What I did :

  • update drivers
  • installed free dlss mod from nexus
  • most settings on medium, render scale 50%, shadows on low

After doing this, everything has been 100% fine and I’ve had no other issues. Before, I sometimes had a slideshow in New Atlantis, heavy stutter, and occasional massive fps drops. These 3 things resolved my issues entirely.

havokdj,

dlss render scale 50%

That image guaranteed looks like shit. I am sorry, but that is not fine.

Mdotaut801,

Yeah. No way.

Uncaged_Jay,

Kiss my ass, Todd; my 6700XT and Ryzen 5 5600X shouldn’t have to run your game at 1440p native with low settings to get 60fps

Almamu,

I’m on a 5700XT and the game runs around 60 fps at 1080p with everything cranked and no FSR or resolution scale applied, so I’d say either your drivers are out of date or something else is wrong there, imo

Uncaged_Jay,

I’ll have to double-check; nothing immediately stands out as wrong. The game is on an NVMe drive, I’m running 3600MHz CL14 memory, and I just redid the thermal paste on my CPU. With all that being said, most other games I play get 100+fps, including Forza Horizon 5 with its known memory leak issue, and Battlefield V, so I don’t think anything is wrong with the system.

Redditiscancer789,

Look man, I’m not trying to defend Howard here, or imply you’re tech illiterate, or that all your issues will clear up 100%, but have you by chance updated your drivers? Mine were out of date enough that Starfield threw up a warning, which I ignored, and I was not having a good experience, same as you (I dunno your RAM or storage config, but I was running an average NVMe drive and 32 gigs of RAM with a 5600X and 6700XT). But after I updated, a lot of the issues smoothed out. Not all, but most. At 60 fps average with a mix of medium/high at 1080p, though. Maybe worth a try?

Uncaged_Jay,

I’ve checked for new drivers daily since the game came out; the ones I have are from mid-August though, so maybe I’ll double-check again

Colorcodedresistor,

such a todd move, always with a shit-eating grin. “Oh, you can’t run our game? jeez, have you tried not being poor?”

yamanii,

Games are somehow too CPU-heavy these days, even though they aren’t simulating the entire world like Kenshi does, just the stuff around you, so even though I upgraded my GPU I can barely get to 30fps. I also had this problem with Wo Long, Hogwarts and Wild Hearts.

Call_Me_Maple,

I agree. I have an i7-8700K and a 2080 Super, which I’d say are mid- to high-level specs, and I have a terrible time running Wild Hearts and Starfield. Such a damn shame too; as a big MHW and MHR fan I was really looking forward to Wild Hearts and just couldn’t run the game well at all. At this point I’m just not surprised when a triple-A game runs like dog water on my system. Usually these games are free on Game Pass, so I try them out and 5 minutes later I uninstall.

Indies are where it’s at nowadays.

Acid,

I wouldn’t consider an 8700K or a 2080 Super high-level specs, or even mid-level, right now.

Consider that an 8700K is slower than a 13400F, which today is considered the absolute lower end of the mid-range; realistically a 13600K or 13700 is the mid-range on the Intel side.

To be blunt, the 8700K is 5 years old.

As for the 2080S, well, look at this chart and make up your own mind: https://startrek.website/pictrs/image/2d276779-ffde-42ed-be75-4e2cff5d2572.jpeg

I think a lot of people just don’t appreciate how out of date their hardware is relative to consoles atm.

Blackmist,

This is what happens when consoles improve their CPUs.

Suddenly they’ve got more cycles than they know what to do with, so they waste them on frivolous, unnoticeable shit. Now you don’t have the extra headroom to get you from console 30fps to PC 60fps+. Consoles are on a much more even footing with PCs than the underpowered (even at release) PS4 and Xbox One ever were.

You’ll struggle to find a CPU that does double what a PS5 can, and if the game is being held back by single-threaded performance (likely), there’s nothing you’ll be able to do to double that.

hal_5700X,

www.youtube.com/watch?v=uCGD9dT12C0

Get a new game engine, Todd. Bethesda owns id Software. id Tech is right there.

520,

IdTech isn't built for open-world gameplay

DrWorm,

Rage??? Wasn’t Rage open world?

520,

Kind of. It was more a set of smaller locations linked together by loading screens, a la Borderlands 2, rather than the typically seamless worlds Bethesda is usually known for. Although you could definitely argue that this was the approach Bethesda took for Starfield.

Blackmist,

Wasn’t the drivable overworld one big map? I honestly can’t remember now; it’s been so long since I played it.

I do remember them harping on about “megatextures”, and what this seemed to mean was that just turning on the spot caused all the textures to load back in as they came into view. I dunno if they abandoned that idea or improved it massively, but I don’t remember any other game ever doing that.

520,

My memory could also be fuzzy. It might have been more like Oblivion and Skyrim.

As for the megatexture thing, it's not done anymore because it's not needed. The reason they had to load textures back in was that the 360/PS3 only had 512MB of total RAM, and while the 360's RAM was shared, the PS3 split it into 256MB each for the CPU and GPU. Nowadays even the Xbox One is rocking 8GB.

Blackmist,

I thought megatextures were more about avoiding the tiled look of many textured landscapes at the time: the idea that artists could zoom into any point and paint what they needed without worrying that it would then appear somewhere else on the map.

Looking around, some people seem to think they were replaced by virtual texturing, but I’ve been out of the loop for a long time so haven’t really kept up with what that is. I assume it allows much the same, but far more efficiently than one giant texture map. Death Stranding must use something similar, because as you move about you wear down permanent paths in the landscape.

520,

Right, I think I got confused. The megatexture is a huuuuge single texture covering the entire map geometry. It has a ridiculous size (at the time of Rage, it was 32,000 by 32,000). It also holds data about which bits should be treated as which type of terrain, for footprints etc.

The problem with this approach is that it eats a shit ton of RAM, which the 7th-gen consoles didn't have much of. Thus the only high-quality textures that were loaded in were the ones the player could see; they were unloaded when the player couldn't see them.

Megatextures have been used in IdTech games since, but because those games weren't open world and/or targeted 8th-gen consoles and later, with much more RAM, unloading the textures isn't necessary.
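
To make that concrete, here's a minimal sketch in C++ of the load-what-the-player-can-see loop described above. Everything in it (names, page size, the square visibility window) is hypothetical and heavily simplified; it is not id Tech's actual code.

    // Minimal sketch of megatexture-style page streaming. All names and
    // numbers are hypothetical: the huge texture is treated as a grid of
    // fixed-size pages, and each frame only the pages near the camera
    // stay resident in RAM.
    #include <cstdint>
    #include <cstdio>
    #include <unordered_set>

    constexpr int kVirtualTexels = 32000; // e.g. the 32,000 x 32,000 texture above
    constexpr int kPageTexels    = 128;   // pages are streamed in/out as units
    constexpr int kPagesPerSide  = kVirtualTexels / kPageTexels;

    using PageKey = std::int64_t;
    static PageKey Key(int px, int py) { return PageKey(py) * kPagesPerSide + px; }

    struct Streamer {
        std::unordered_set<PageKey> resident; // pages currently in RAM

        // Call once per frame with the camera position on the map (in
        // texels) and a radius, in pages, to keep loaded around it.
        void Update(int camX, int camY, int radiusPages) {
            const int cx = camX / kPageTexels, cy = camY / kPageTexels;

            std::unordered_set<PageKey> wanted;
            for (int py = cy - radiusPages; py <= cy + radiusPages; ++py)
                for (int px = cx - radiusPages; px <= cx + radiusPages; ++px)
                    if (px >= 0 && py >= 0 && px < kPagesPerSide && py < kPagesPerSide)
                        wanted.insert(Key(px, py));

            // Evict pages the player can no longer see (the "loading out"
            // described above), then stream in the newly visible ones.
            for (auto it = resident.begin(); it != resident.end();)
                if (wanted.count(*it)) ++it; else it = resident.erase(it);
            for (PageKey k : wanted)
                if (resident.insert(k).second)
                    std::printf("streaming in page %lld\n", (long long)k);
        }
    };

    int main() {
        Streamer s;
        s.Update(16000, 16000, 2); // loads a 5x5 window of pages
        s.Update(16128, 16000, 2); // one page right: a column is evicted,
                                   // a new column streams in from disk
    }

On a 512MB console that resident set had to stay tiny, which is why just turning on the spot could visibly re-stream textures.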

Blackmist,

Whatever code they used to “unload” the textures was also kept for PC, because the same thing happened there. And this was on an ATI 5870, with at least double the VRAM of any console at the time.

Man, remember when flagship GPUs were £300? Wtf happened to those days…

520,

Heh. Good times.

Such a shame poor PC optimisation was what we kept instead of affordable cards.

Tau,

IdTech 7 does not use megatextures; the last engine to use them was IdTech 6.

jcit878,

honestly it runs fine on my 5700xt / r5 3600 combo. not max settings, I set it to “high” from memory, as the game defaulted to the minimum for me, but I could bump it up no worries. no real frame rate or stuttering issues. I’d love to run it higher but I’m a realist, and a new PC is on the cards anyway over the next year

B3_CHAD,

A 3070 can’t get a consistent 60 fps at 1080p high. No stuttering though, just low fps.

jcit878,

i actually turn the frame counter off. i know im not getting 60, but what i am getting is sufficient that it doesnt ruin my fun, and on my older hardware im ok with it. if i looked at the counter i would probably be more disappointed

B3_CHAD,

Me too; I only turn it on to find the settings that give a good balance between visuals and performance, and then turn the fps counter off.

okiloki,

I have a 5700 xt as well, paired with a 5600X. The game runs perfectly well. I honestly had more issues with stuttering in BG3.

rennademilan,

Sad but true

colonial,

I installed an optimized textures mod and instantly improved my performance by like… 20 frames, maybe more.

I have an RX 6600 XT that can run Cyberpunk on high no problem. C’mon Bethesda, the game is really fun, but this is embarrassingly bad optimization.

Clown_Tempura,

My PC runs better games just fine.
