Simon, I pulled together notes on all of the LLM plugins that have worked for me with Llama 3 - both for hosting it locally (I've run the 8B and 70B models on my 64GB M2) and for access via APIs (Groq is SO FAST for that).
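For anyone else wanting to try the Groq route from the terminal, here's a rough sketch of the steps I followed. The exact model ID is my best guess from the llm-groq plugin at the time - run `llm models` to see what your installed version actually registers:

```shell
# Install the LLM CLI and the Groq plugin
pipx install llm          # or: pip install llm
llm install llm-groq

# Store a Groq API key (get one from console.groq.com)
llm keys set groq

# List registered models to confirm the exact Llama 3 model ID
llm models

# Prompt Llama 3 70B via Groq's API
# (model ID shown here is a guess; check `llm models` output)
llm -m groq-llama3-70b "Three one-line facts about llamas"
```

The local-hosting plugins follow the same pattern: `llm install <plugin>`, then `llm -m <model-id> "prompt"`.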
Options for accessing Llama 3 from the terminal using LLM
https://simonwillison.net/2024/Apr/22/llama-3/