chikim,
@chikim@mastodon.social

Thanks to all the recent large LLMs, "Apple is considering support for up to half a terabyte of RAM" for the highest-end M4 Mac configurations. I'm sure the price won't be cheap, but I bet it will be cheaper than getting 500GB of VRAM from Nvidia. lol https://9to5mac.com/2024/04/11/apple-first-m4-mac-release-ai/

bryansmart,

@chikim It won't be nearly as fast as even Nvidia's slowest GPUs, though. From my testing, large unified memory and power efficiency are all Apple has going for it. It isn't worth that kind of money for local inference on huge models, except in a very few special cases.
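The speed gap bryansmart describes can be roughed out with a back-of-envelope model: single-stream LLM decoding is largely memory-bandwidth bound, since every weight is read once per generated token, so tokens/sec tops out near bandwidth divided by model size. A minimal sketch, using illustrative (not measured) bandwidth and model-size figures:

```python
# Back-of-envelope decode-speed ceiling for a memory-bandwidth-bound LLM.
# All numbers below are ballpark assumptions for illustration, not benchmarks.

def tokens_per_sec_ceiling(bandwidth_gb_s: float, model_size_gb: float) -> float:
    """Upper bound on decode tokens/sec when every weight is read once per token."""
    return bandwidth_gb_s / model_size_gb

# Assume a 70B-parameter model at 4-bit quantization: roughly 40 GB of weights.
model_gb = 40.0

# Hypothetical peak memory bandwidths (rough spec-sheet ballparks):
# a high-end Apple SoC (~800 GB/s) vs. an Nvidia datacenter GPU (~3350 GB/s).
for name, bw in [("Apple SoC", 800.0), ("Nvidia datacenter GPU", 3350.0)]:
    print(f"{name}: ~{tokens_per_sec_ceiling(bw, model_gb):.0f} tok/s ceiling")
```

Real throughput lands below these ceilings (compute, KV-cache reads, and software overhead all subtract), but the ratio shows why big unified memory lets a model *fit* on a Mac while the GPU still *runs* it several times faster.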
