trafficnab,

It should, yeah. It used to be that if you wanted to run a model on your GPU, it had to fit entirely within its VRAM (which really limited what models people could use on consumer GPUs), but I think they've recently added the ability to run part of the model on your GPU+VRAM and the rest on your CPU+RAM. I don't know the specifics, though, as I've only briefly played around with it.
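The basic idea behind that partial offload (llama.cpp exposes it as the `-ngl`/`--gpu-layers` option) is just: put as many whole layers as fit into VRAM, keep the rest in system RAM. A minimal sketch of that split, with completely made-up model sizes:

```python
# Sketch of the GPU/CPU layer split used by partial offloading
# (e.g. llama.cpp's -ngl/--gpu-layers). All numbers are hypothetical.

def split_layers(n_layers, layer_bytes, vram_budget_bytes):
    """Offload as many whole layers as fit in the VRAM budget;
    the remainder stays on the CPU/RAM side."""
    gpu_layers = min(n_layers, vram_budget_bytes // layer_bytes)
    return gpu_layers, n_layers - gpu_layers

# Hypothetical example: 32 layers of ~400 MiB each, with ~6 GiB of VRAM
# left over after reserving room for the KV cache and overhead.
gpu, cpu = split_layers(32, 400 * 1024**2, 6 * 1024**3)
print(gpu, cpu)  # → 15 17
```

In practice the runtime also has to budget VRAM for the context/KV cache, so the usable number of offloaded layers is a bit lower than raw VRAM divided by layer size.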
