AI Developer Accelerator
Themis Hau
Aug '24
General discussion
LLM production deployment
Hi, has anyone got an idea how to calculate the GPU memory needed to serve around 50-100 users from a local server running the Llama 3.1 70B model? Which parameters should I use, and how do I calculate it step by step?
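The step-by-step estimate being asked about boils down to three terms: model weights, KV cache for in-flight requests, and runtime overhead. A minimal sketch of that arithmetic follows — the architecture constants are from the published Llama 3.1 70B config, but the concurrency, context length, and 20% overhead factor are assumptions you should replace with your own workload numbers:

```python
# Back-of-the-envelope VRAM estimate for serving Llama 3.1 70B.
# Llama 3.1 70B config: 80 layers, 8 KV heads (GQA), head dim 128.
# Concurrency, context length, and overhead factor are assumptions.

PARAMS = 70e9      # model parameters
N_LAYERS = 80      # transformer layers
N_KV_HEADS = 8     # grouped-query attention KV heads
HEAD_DIM = 128     # dimension per attention head
BYTES = 2          # fp16/bf16; use 0.5 for 4-bit weight quantization

def weight_gb(bytes_per_param=BYTES):
    """VRAM for the model weights alone."""
    return PARAMS * bytes_per_param / 1e9

def kv_cache_gb(concurrent_users, tokens_per_user, bytes_per_elem=BYTES):
    """VRAM for the KV cache of all simultaneously active requests."""
    # K and V tensors per layer -> factor of 2
    per_token = 2 * N_LAYERS * N_KV_HEADS * HEAD_DIM * bytes_per_elem
    return concurrent_users * tokens_per_user * per_token / 1e9

weights = weight_gb()               # ~140 GB in fp16
kv = kv_cache_gb(50, 4096)         # assume 50 concurrent requests, 4k context
total = (weights + kv) * 1.2       # assume ~20% overhead for activations etc.
print(f"weights: {weights:.0f} GB, kv cache: {kv:.0f} GB, total: {total:.0f} GB")
```

Note that "50-100 users" is not the same as 50-100 *concurrent* requests — a serving engine with continuous batching (e.g. vLLM) typically handles many registered users with far fewer simultaneous in-flight sequences, so measure your real peak concurrency before buying hardware.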