Running DeepSeek locally - Not so smart
Hi,
Yesterday I installed DeepSeek locally using Ollama and used the 1.5b model, as I wasn't sure whether the larger models would run properly on my computer.
I have 16 GB of RAM and an SSD.
Now, when I asked it a few questions related to coding, it seemed that DeepSeek wasn't actually so smart. Or maybe I should have used a better model? Which model is the one they use in the DeepSeek chat app? I'd like to use that model, but I don't know which one it is or how much power my computer needs to run it.
Do I also need a graphics card for this? I am not quite sure about the required specs.
So, please shed some light on this if you can.
And what are the other options if we want to run a better model on the same computer?
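For anyone weighing up which model sizes might fit in 16 GB of RAM, a common rule of thumb (an assumption for illustration, not from the post) is that a quantized model needs roughly its parameter count times the quantization width in bytes, plus some overhead for the KV cache and runtime. A minimal sketch:

```python
def estimated_ram_gb(params_billions: float,
                     bits_per_weight: int = 4,
                     overhead: float = 1.2) -> float:
    """Rough RAM estimate for a quantized model: weight bytes
    (params * bits / 8) plus ~20% overhead for KV cache and runtime.
    A heuristic only, not an exact figure."""
    bytes_for_weights = params_billions * 1e9 * bits_per_weight / 8
    return bytes_for_weights * overhead / 1e9

# Compare a few common distilled-model sizes at 4-bit quantization.
for size in (1.5, 7, 14, 32):
    print(f"{size:>4}B @ 4-bit ~= {estimated_ram_gb(size):.1f} GB RAM")
```

By this estimate, a 7B or 14B model at 4-bit quantization would fit comfortably in 16 GB, while a 32B model would be a tight squeeze once the OS and other programs are counted.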
Faraz Ahmed
AI Automation Society
skool.com/ai-automation-society