
Memberships

Community F.I.R.E. Mojo (1.5k members • Free)
AndyNoCode (25.5k members • Free)
Royalty Ronin (515 members • $1,999/y)
AI Automation Society (208.5k members • Free)
AI Automation Agency Hub (274.8k members • Free)
AI Automation Agency Ninjas (18.9k members • Free)

1 contribution to AI Automation Society
Running deepseek locally - Not so smart
Hi, yesterday I installed DeepSeek locally using Ollama and picked the 1.5b model, since I wasn't sure whether the larger models would run properly on my computer. I have 16 GB of RAM and an SSD. When I asked it a few coding questions, DeepSeek didn't seem very smart. Or maybe I should have used a better model? Which model is the one they use in the DeepSeek chat app? I'd like to use that model, but I don't know which one it is or how much computing power my machine would need to run it. Do I also need a graphics card for this? I'm not sure about the required configuration, so please shed some light on this if you can. Also, what are my options if I want to run a better model on this same computer?
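For sizing questions like this, a rough back-of-envelope estimate can help. The sketch below is my own rule of thumb, not an official Ollama or DeepSeek figure: it assumes weights quantized to about 4 bits each plus roughly 20% overhead for the KV cache and runtime, and the helper name `approx_ram_gb` is made up for illustration.

```python
# Back-of-envelope RAM estimate for running a quantized LLM locally.
# Assumptions (rule of thumb, not official figures): ~4-bit weights,
# plus ~20% overhead for the KV cache and runtime.
def approx_ram_gb(params_billion: float,
                  bits_per_weight: int = 4,
                  overhead: float = 1.2) -> float:
    weights_gb = params_billion * bits_per_weight / 8  # 8 bits per byte
    return weights_gb * overhead

if __name__ == "__main__":
    # 1.5B ≈ 0.9 GB, 7B ≈ 4.2 GB, 14B ≈ 8.4 GB, 32B ≈ 19.2 GB
    for size in (1.5, 7, 14, 32):
        print(f"{size}B model: ~{approx_ram_gb(size):.1f} GB RAM")
```

By this estimate, a quantized model in the 7B–14B range would still fit comfortably in 16 GB of RAM, while a 32B model would not.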
Faraz Ahmed
1
5 points to level up
@faraz-ahmed-4040
Taking the leap

Active 18d ago
Joined Dec 15, 2024