Uranium3006, 27 days ago: Local AI is the way. It's just that current models aren't GPT-4 quality yet, and you'd probably need 1 TB of VRAM to run them.
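For a rough sense of where that 1 TB figure comes from, here is a back-of-envelope sketch. It assumes a hypothetical ~1T-parameter model (GPT-4's size is unpublished; 1T is just an illustrative round number) and counts only the memory to hold the weights, ignoring KV cache, activations, and framework overhead:

```python
# Back-of-envelope VRAM estimate for holding an LLM's weights in memory.
# Ignores KV cache, activations, and runtime overhead, which add more on top.

def vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate GB of VRAM needed just for the weights (1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bytes_per_param / 1e9

# Hypothetical ~1T-parameter model at common precisions:
for precision, nbytes in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"{precision}: ~{vram_gb(1000, nbytes):,.0f} GB")
# fp16: ~2,000 GB   int8: ~1,000 GB   int4: ~500 GB
```

So under these assumptions, even aggressive 8-bit quantization of a trillion-parameter model lands right around the 1 TB mark the comment cites.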