lvxferre (@lvxferre@mander.xyz)

My personal take is that the current generation of generative models has peaked, for the reasons stated in the video (diminishing returns). This generation will remain useful, but progress-wise it'll be a dead end.

In the future, however, I believe that models with a different architecture will cause a breakthrough, performing better with less training, and probably with lower energy requirements, too.
