jni,
@jni@fosstodon.org

@simon ok I'm finally trying out your llm library and it's neat. I am intentionally limiting myself to open source and local models. Could you write a post about these? Specifically:

  • strengths/weaknesses of each model?
  • strategies for getting around context length limits? I'm currently interested in text summarisation and immediately hit this with mistral-7b-instruct-v0* (a rough chunking sketch follows this list)
  • yes I did ask mistral itself to suggest strategies but they were distressingly manual and brittle/non-generalisable. 😂
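
For anyone else who hits the same wall: the usual workaround is chunked ("map-reduce") summarisation, i.e. summarise overlapping chunks that fit in the context window, then summarise the summaries. Here's a rough sketch using llm's Python API; the model id, chunk sizes and prompts are placeholders (check `llm models` for what your installed plugins actually expose), not anything from Simon's docs:

```python
import llm

# Assumptions: a local-model plugin (e.g. llm-gpt4all) is installed and exposes
# this model id; the id below is illustrative, not a guaranteed name.
MODEL_ID = "mistral-7b-instruct-v0"
CHUNK_CHARS = 6000   # rough character budget per chunk, well under the context limit
OVERLAP = 200        # overlap so sentences cut at a chunk boundary still get seen


def chunks(text: str):
    """Yield overlapping character-based chunks of the input text."""
    start = 0
    while start < len(text):
        end = min(start + CHUNK_CHARS, len(text))
        yield text[start:end]
        if end == len(text):
            break
        start = end - OVERLAP


def summarise(text: str) -> str:
    model = llm.get_model(MODEL_ID)
    # Map step: summarise each chunk on its own.
    partials = [
        model.prompt(f"Summarise this text in 3-4 sentences:\n\n{chunk}").text()
        for chunk in chunks(text)
    ]
    if len(partials) == 1:
        return partials[0]
    # Reduce step: merge the partial summaries into one final summary.
    merged = "\n\n".join(partials)
    return model.prompt(
        f"Combine these partial summaries into one coherent summary:\n\n{merged}"
    ).text()
```

Character-based chunking is crude; a token-aware splitter would track the real context budget, but this is enough to get past the hard limit.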