OC Long Context Lengths (And Low Resource Friendly)

Thought I'd ask and see if y'all are familiar with upcoming models or techniques that I'm not. I'm aware of the MPT-7B StoryWriter model and the RWKV models that support up to 8192 tokens, but that's about it as far as "long" context lengths go. I also want to run this in a VM with limited resources. The most I will be...
