camilobotero@feddit.dk to Selfhosted@lemmy.world • Self-GPT: Open WebUI + Ollama = Self Hosted ChatGPT (English)
24 days ago

> What are your PC specifications for running llama3.1:70B smoothly?
I can confirm that it does not run (at least not smoothly) on an Nvidia 4080 with 12 GB of VRAM. However, gemma2:27B runs pretty well. Do you think that if we added another, more modest graphics card, llama3.1:70B could run?
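A rough back-of-the-envelope on why 12 GB falls short (a sketch; `estimate_vram_gb` and the 20% overhead factor for KV cache and activations are my own assumptions, not Ollama's actual accounting):

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: int, overhead_frac: float = 0.2) -> float:
    """Rough VRAM estimate: weight storage plus a fudge factor for
    KV cache, activations, and runtime overhead (assumed 20%)."""
    weights_gb = params_billion * bits_per_weight / 8  # e.g. 70B at 4-bit -> 35 GB of weights
    return weights_gb * (1 + overhead_frac)

# 70B model at a typical 4-bit quantization: ~42 GB needed, far beyond 12 GB.
print(round(estimate_vram_gb(70, 4), 1))
# 27B model at 4-bit: ~16 GB, so a 12 GB card with some CPU offload copes.
print(round(estimate_vram_gb(27, 4), 1))
```

By this estimate, even adding a second modest card would not get you near the ~40+ GB a 4-bit 70B model wants resident, though Ollama can split layers between GPU and CPU at a steep speed cost.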