gael, French
@gael@mastodon.social avatar

At some point, I wish there would be a local LLM plugged into my email inbox and IM discussion channels so I can retrieve information super easily.

Would you find this feature useful in /e/OS ? 😎

Papeleo, (edited)
@Papeleo@mastodon.social avatar

@gael
No, I don't need an LLM as long as I can generate text with my own brain.

I'd prefer to have /e/ on more modern Samsung phones with good cameras. I couldn't find anything after the Galaxy S10 from 2019.
😉

gael,
@gael@mastodon.social avatar

@Papeleo we won't sell Samsung devices anymore because supporting VoLTE on those devices would be super costly.

Papeleo,
@Papeleo@mastodon.social avatar

@gael
I see... 🙂

So, which is the phone with the best camera that /e/ supports?
📸

djoerd,
@djoerd@idf.social avatar

@gael Please, no! 😱

oscarascal,
@oscarascal@framapiaf.org avatar

@gael I'd love to have a Speech to Text engine that runs locally as input method, are LLMs the right approach for this?

gael,
@gael@mastodon.social avatar

@oscarascal pure local STT engines are very hard to run on a smartphone because they take a lot of resources (RAM in particular). But I agree, it would probably be the same issue with a local LLM 🫤
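For illustration, a minimal sketch of what a purely local STT call looks like, using the open-source openai-whisper package and its smallest "tiny" checkpoint (the file name is a placeholder, and this is not /e/OS code). Even this smallest model needs roughly 1 GB of memory at inference time, which is the kind of resource pressure mentioned above.

```python
# Sketch: transcribe a voice note entirely on-device with a small STT model.
import whisper

model = whisper.load_model("tiny")           # smallest Whisper checkpoint (~39M params)
result = model.transcribe("voice_note.wav")  # placeholder audio file path
print(result["text"])                        # recognized text
```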

phiofx,

@gael we need an open-source, collaboratively trained LLM, built on commons text data that is explicitly opt-in, plus a local inference API on mobiles and desktops (there may be more opportunities to use LLMs integrated into various desktop apps such as LibreOffice)
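As a sketch of what a "local inference API" could look like in practice, here is a minimal example using the llama-cpp-python bindings for llama.cpp; the GGUF model file name is a placeholder for any openly licensed model, and nothing leaves the device.

```python
# Sketch: fully local text generation through a local inference library.
from llama_cpp import Llama

llm = Llama(model_path="open-model.gguf", n_ctx=2048)  # placeholder model file
out = llm("Summarize why on-device inference helps privacy.", max_tokens=64)
print(out["choices"][0]["text"])
```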

gael,
@gael@mastodon.social avatar

@phiofx Yes. The issue with collaborative training is privacy. To me that can only be solved with homomorphic encryption, but consumer hardware is not powerful enough for this yet.
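To make the homomorphic-encryption idea concrete, here is a minimal sketch of the underlying primitive using the TenSEAL library (CKKS scheme): a server can aggregate two users' encrypted update vectors without ever seeing the plaintext. This is only the building block, not a training pipeline; doing this at LLM scale is exactly what consumer hardware cannot handle yet.

```python
# Sketch: add two encrypted vectors without decrypting them (TenSEAL, CKKS).
import tenseal as ts

ctx = ts.context(ts.SCHEME_TYPE.CKKS, poly_modulus_degree=8192,
                 coeff_mod_bit_sizes=[60, 40, 40, 60])
ctx.global_scale = 2 ** 40
ctx.generate_galois_keys()

update_a = ts.ckks_vector(ctx, [0.1, 0.2, 0.3])  # e.g. one user's model update
update_b = ts.ckks_vector(ctx, [0.4, 0.5, 0.6])  # another user's update
aggregated = update_a + update_b                 # computed while still encrypted
print(aggregated.decrypt())                      # ≈ [0.5, 0.7, 0.9]
```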

lordphoenix,
@lordphoenix@social.targaryen.house avatar

@gael I don’t want a bullshit generator on my phone.

gael,
@gael@mastodon.social avatar

@lordphoenix I am not talking about any bullshit generator ^^

djoerd,
@djoerd@idf.social avatar

@gael but... LLMs do not retrieve information, they generate plausible text.
Like @lordphoenix, I don't want that on my phone.

lordphoenix,
@lordphoenix@social.targaryen.house avatar

@gael LLMs are bullshit generators. They know nothing; they don't know what knowledge and truth are… The only thing they do is generate statistically plausible text: it's THE definition of bullshit.

gael,
@gael@mastodon.social avatar

@lordphoenix no, because you can send them context information, so they can retrieve the information you are looking for and present it in a structured manner.
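A minimal sketch of what "sending context information" means in practice: retrieve a few relevant messages with a plain local search, paste them into the prompt, and have the model answer only from that text. search_local_mail() and local_llm() are hypothetical placeholders for an on-device mail index and an on-device model, not existing /e/OS components.

```python
# Sketch: retrieval-augmented answering over a local inbox.
def answer_from_inbox(question, search_local_mail, local_llm, k=3):
    snippets = search_local_mail(question, limit=k)   # plain local full-text search
    context = "\n---\n".join(snippets)
    prompt = (
        "Answer using ONLY the excerpts below. "
        "If the answer is not in them, say you don't know.\n\n"
        f"Excerpts:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
    return local_llm(prompt)                          # model answers from the excerpts
```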

lordphoenix,
@lordphoenix@social.targaryen.house avatar

@gael No, because LLMs are unable to evaluate the relevance and accuracy of the information they retrieve… In fact, they don't retrieve information at all; they only use an enormous quantity of text to generate more plausible text based on a statistical evaluation, and nothing else. They can't retrieve information because they don't know what information is.

slaeg,
@slaeg@mastodon.online avatar

@gael absolutely!
