gael: Imagine a #LLM running locally on a smartphone...
It could be used to fuel an offline assistant that would be able to easily add an appointment to your calendar, open an app, etc. without #privacy issues.
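One way such an assistant could work (a minimal sketch, not the actual interface of this proof of concept): the on-device model is prompted to emit structured output, and a thin dispatcher maps that output to system actions. The JSON schema and action names below are illustrative assumptions.

```python
import json

# Hypothetical dispatcher: routes structured output from an on-device LLM
# to assistant actions. Schema and action names are illustrative, not the
# real /e/OS PoC interface.
def dispatch(llm_output: str) -> str:
    cmd = json.loads(llm_output)
    if cmd["action"] == "add_event":
        # On Android this would fire a calendar insert intent.
        return f"calendar: {cmd['title']} at {cmd['when']}"
    if cmd["action"] == "open_app":
        # On Android this would resolve and launch the app.
        return f"launch: {cmd['app']}"
    return "unknown action"

reply = '{"action": "add_event", "title": "Dentist", "when": "2024-05-03 10:00"}'
print(dispatch(reply))  # → calendar: Dentist at 2024-05-03 10:00
```

Since everything stays on the device, no request ever leaves the phone.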
This has become reality with a proof of concept using the Phi-2 2.7B transformer model running on /e/OS.
It is slow, so not very usable until SoCs ship dedicated AI chips, but it works (and it's #opensource!)
🙏 Stypox