LLM production deployment
Hi, does anyone have an idea how to calculate the GPU size needed to serve around 50-100 users from a local server running the Llama-3.1-70B model? What parameters should I use, and how do I work through the calculation step by step?
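For reference, a rough back-of-envelope sketch in Python: total VRAM ≈ model weights + KV cache + runtime overhead. The architecture numbers used here (80 layers, 8 KV heads via GQA, head dim 128) are the published Llama-3.1-70B specs; the serving parameters (context length, precision, concurrency, overhead) are illustrative assumptions, not values from the post.

```python
# Back-of-envelope VRAM estimate for serving Llama-3.1-70B.
# Assumptions: every concurrent user holds a full context window in the
# KV cache; overhead covers CUDA context, activations, and fragmentation.

def vram_estimate_gb(
    n_params_b: float = 70,      # model size in billions of parameters
    weight_bytes: float = 2,     # 2 = FP16/BF16, 1 = INT8, 0.5 = INT4
    n_layers: int = 80,          # Llama-3.1-70B
    n_kv_heads: int = 8,         # GQA: 8 KV heads (not 64 query heads)
    head_dim: int = 128,
    kv_bytes: float = 2,         # KV cache precision (2 = FP16)
    context_len: int = 8192,     # tokens held per active request (assumed)
    concurrent_users: int = 100,
    overhead_gb: float = 10,     # rough allowance, tune per runtime
) -> float:
    weights_gb = n_params_b * 1e9 * weight_bytes / 1e9
    # Per token: one K and one V vector per layer per KV head.
    kv_per_token = 2 * n_layers * n_kv_heads * head_dim * kv_bytes  # bytes
    kv_gb = kv_per_token * context_len * concurrent_users / 1e9
    return weights_gb + kv_gb + overhead_gb

# FP16 weights + FP16 KV cache, 100 users at 8k context:
print(vram_estimate_gb())  # ~140 GB weights + ~268 GB KV + 10 GB ≈ 418 GB
# INT4 weights, 50 users at 4k context:
print(vram_estimate_gb(weight_bytes=0.5, context_len=4096,
                       concurrent_users=50))  # ≈ 112 GB
```

Note the KV cache, not the weights, usually dominates at high concurrency, so weight quantization, shorter contexts, and a serving engine with paged KV memory (e.g. vLLM) are the main levers; in practice not all users generate simultaneously, so effective concurrency is often well below the user count.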