The Curse of Recursion: Training on Generated Data Makes Models Forget (arxiv.org)

Stable Diffusion revolutionised image creation from descriptive text. GPT-2, GPT-3(.5) and GPT-4 demonstrated astonishing performance across a variety of language tasks. ChatGPT introduced such language models to the general public. It is now clear that large language models (LLMs) are here to stay, and will bring about drastic...
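The paper's core claim is that models trained recursively on their own outputs progressively lose the tails of the original distribution. Not from the paper's experiments, but a minimal sketch of that mechanism under illustrative assumptions: a toy "model" (a maximum-likelihood Gaussian fit, via numpy) is refit each generation to samples drawn from the previous generation's fit. The sample size, generation count, and starting distribution here are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "human" data from a standard normal (illustrative choice).
data = rng.normal(loc=0.0, scale=1.0, size=100)

for gen in range(51):
    # "Train" a toy model: maximum-likelihood Gaussian fit to current data.
    mu, sigma = data.mean(), data.std()
    if gen % 10 == 0:
        print(f"generation {gen:2d}: mu = {mu:+.3f}, sigma = {sigma:.3f}")
    # The next generation sees only samples from the previous generation's
    # model; finite sampling under-represents the tails, so the fitted
    # variance tends to drift downward over many generations.
    data = rng.normal(loc=mu, scale=sigma, size=100)
```

Running this, sigma tends to shrink across generations: each refit loses a little tail mass, which is the "forgetting" the title refers to, in miniature.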

65dBnoise, to random

Can't say "dumb as a brick" any more. Sam Altman (of #OpenAI fame) thinks "intelligence is a fundamental property of matter" 😮. So if bricks are intelligent, why can't LLMs be intelligent too?

#Altman, a Stanford CS dropout, says that a future system will "cure all disease, help address climate change, radically improve education, make us 10-100 times more productive". All of that asserted without a hint of doubt.

Sounds exactly like modern #snakeoil to me.

https://www.youtube.com/embed/AiE7FsdRzz8

#AIhype #BS

65dBnoise, to ai

Excellent piece by Emily M. Bender, @emilymbender, about large language models (LLMs) and why they can neither "know" nor "reason".

Compare that with the "awe" with which Sam Altman impressed the US Senators.

https://medium.com/@emilymenonbender/thought-experiment-in-the-national-library-of-thailand-f2bf761a8a83
