glennf (@glennf@twit.social)

There’s a problem in technology: many people believe everything is on the same curve as transistor density. Most things aren’t. There’s often a sharp trend upward and then a stall. https://mastodon.cloud/@jasongorman/112267865098989336

jannem (@jannem@fosstodon.org)

@glennf
Sigmoid and exponential trends look much the same early on. But every trend ends up being a sigmoid sooner or later.
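
A minimal sketch in Python of that point (all parameter values here are arbitrary, chosen only for illustration): a logistic (sigmoid) curve tracks a pure exponential with the same growth rate almost exactly until the ceiling starts to bite.

```python
import math

r = 0.5                     # shared early growth rate
K = 1000.0                  # logistic ceiling (carrying capacity)
t0 = math.log(K - 1) / r    # shift so both curves start at 1.0

def exponential(t):
    return math.exp(r * t)

def logistic(t):
    # Standard logistic: K / (1 + e^(-r * (t - t0)))
    return K / (1.0 + math.exp(-r * (t - t0)))

for t in range(0, 31, 5):
    e, s = exponential(t), logistic(t)
    print(f"t={t:2d}  exp={e:12.1f}  sigmoid={s:7.1f}  ratio={s / e:.2f}")
```

Early on the ratio stays near 1.0, so the two curves are effectively indistinguishable; once the ceiling bites, the sigmoid stalls near K while the exponential keeps climbing.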

glennf (@glennf@twit.social)

@jannem We are pattern-matchers, and some parts of technology do climb sharply upward (almost entirely in manufacturing), as does the hype.

carlmalamud (@carlmalamud@official.resource.org)

@glennf Our result when we increased the BERT training set by two orders of magnitude was that performance was not appreciably better. https://openreview.net/revisions?id=gQubg63s_Hf

glennf (@glennf@twit.social)

@carlmalamud Feels like the same thing happened with deep learning. I had this conversation in an interview nearly a decade ago with Yann LeCun for a story on computer vision. Despite co-creating deep learning and working on it intensely, he felt it was already nearing its accuracy limits and worried that not enough was being done to find fresh approaches to close the remaining gap. LLMs aren’t really much different in nature.
