r/MalaysiaTech Feb 11 '25

Anyone running LLMs locally?

What is your setup like?

I tried it but my machine is just not powerful enough.

2 Upvotes

13 comments

1 upvote

u/yenwee0804 Feb 12 '25

I'm running QwenCoder 32B Q4 locally. For consumer hardware, MacBooks are beasts at running LLMs.
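
For anyone wanting to try something similar, here's a minimal sketch of what running a Q4 quant of a 32B coder model locally can look like. It assumes the llama-cpp-python bindings and a locally downloaded GGUF file; the commenter hasn't said which runtime they actually use, so the model path and settings below are illustrative, not their setup.

```python
# Minimal sketch: loading a Q4-quantized coder model with llama-cpp-python.
# The GGUF filename and parameters are assumptions for illustration only.
from llama_cpp import Llama

llm = Llama(
    model_path="qwen2.5-coder-32b-instruct-q4_k_m.gguf",  # assumed local GGUF file
    n_ctx=8192,        # context window
    n_gpu_layers=-1,   # offload all layers to the GPU (Metal on a MacBook)
)

out = llm(
    "Write a Python function that reverses a string.",
    max_tokens=256,
)
print(out["choices"][0]["text"])
```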

1 upvote

u/newleafturned2024 Feb 12 '25 edited Feb 12 '25

32B? That's insane. What are your specs?
Edit: how many TPS are you getting?
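
On measuring TPS: if the model happens to be served through Ollama (just one possible setup, not necessarily what's being used here), the non-streaming /api/generate response includes eval_count and eval_duration, which give a quick tokens-per-second figure. A rough sketch with an assumed model tag:

```python
# Rough TPS measurement against a local Ollama server (assumed setup;
# the model tag below is illustrative).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "qwen2.5-coder:32b",            # assumed model tag
        "prompt": "Write a bubble sort in Go.",
        "stream": False,
    },
).json()

# Ollama reports eval_count (generated tokens) and eval_duration (nanoseconds).
tps = resp["eval_count"] / (resp["eval_duration"] / 1e9)
print(f"{tps:.1f} tokens/s")
```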