BetaDoggo_,

The paper suggests it was because of cost. It mainly focused on open models with public datasets, then attempted it on GPT-3.5. The authors note that they didn't generate the full 1B tokens with 3.5 because it would have been too expensive, and I assume they skipped other proprietary models for the same reason. For Claude's cheapest model it would be over $5000, and Bard API access isn't widely available yet.
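For reference, the cost estimate is just tokens × per-token price. A minimal sketch of that arithmetic; the per-million rate here is my assumption (roughly Claude Instant's output pricing at the time), not a figure from the paper:

```python
# Back-of-the-envelope cost of generating 1B tokens through a paid API.
# The rate below is an assumed ~$5.51 per million output tokens
# (approximately Claude Instant's 2023 pricing); check current pricing.

TOKENS_NEEDED = 1_000_000_000       # 1B tokens, as in the paper
PRICE_PER_MILLION_OUTPUT = 5.51     # USD per 1M output tokens (assumption)

cost = TOKENS_NEEDED / 1_000_000 * PRICE_PER_MILLION_OUTPUT
print(f"Estimated generation cost: ${cost:,.2f}")  # -> $5,510.00
```

At that assumed rate the bill lands just over $5500, which lines up with the "over $5000" figure for Claude's cheapest model.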
