Any tips for running LLMs locally?
Hey guys, do you have any tips on how to run LLM projects locally? I'm really struggling with this at the moment...
I am using a laptop with 16 GB of RAM (15.3 usable) and a 4-core Intel i5 (2.60 GHz) processor, running Ubuntu. No Nvidia graphics card, unfortunately... I am trying to use only small models such as phi, phi3, or llama2 with Ollama, but most of the time it simply gets stuck while running the code, or returns weird characters instead of the agents' actual output. There are no errors in the code, and the program runs fine at first, but it usually freezes after 4-5 minutes...
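For context, here's the back-of-envelope arithmetic I've been using to guess whether a model should even fit in my RAM (my own rough estimate, not anything from the Ollama docs; the parameter counts and 20% runtime overhead are assumptions):

```python
def model_ram_gb(params_billion: float, bits: int, overhead: float = 1.2) -> float:
    """Rough RAM needed to hold the weights at a given quantization,
    plus ~20% headroom for the KV cache and runtime (assumed overhead)."""
    return params_billion * 1e9 * bits / 8 / 1e9 * overhead

# phi3-mini is roughly 3.8B parameters; at 4-bit quantization:
print(round(model_ram_gb(3.8, 4), 1))  # about 2.3 GB

# llama2 7B at 4-bit:
print(round(model_ram_gb(7.0, 4), 1))  # about 4.2 GB
```

By this estimate both models should fit in 16 GB with room to spare, which is why the freezing confuses me.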
Alex K
AI Developer Accelerator
skool.com/ai-developer-accelerator