#ai (2023-12)
Discuss all things related to AI
2023-12-04
venkata.mutyala
I learned about this from a local meetup and it’s pretty cool: https://lmstudio.ai/
If you want to quickly run an LLM on your local machine, it’s pretty simple to get started. I am using a MacBook Pro (M1 Pro) with 16GB of RAM.
:space_invader: LM Studio - Discover and run local LLMs
Find, download, and experiment with local LLMs
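For anyone curious what "running an LLM locally" looks like in practice: LM Studio can expose the downloaded model through a local OpenAI-compatible server (port 1234 by default). A minimal sketch of talking to it from Python, assuming you started the server from the LM Studio UI and left the default port; the prompt and temperature here are just illustrative:

```python
import json
import urllib.request

# LM Studio's local server speaks the OpenAI chat-completions format;
# localhost:1234 is its default port (adjust if you changed it in the UI).
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_request(prompt: str, temperature: float = 0.7) -> urllib.request.Request:
    """Build a chat-completion request for the locally served model."""
    payload = {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }
    return urllib.request.Request(
        LMSTUDIO_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def ask(prompt: str) -> str:
    """Send the prompt to the local server and return the reply text."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Requires LM Studio's local server to be running with a model loaded.
    print(ask("In one sentence, what is a quantized model?"))
```

Because it mimics the OpenAI API, existing OpenAI client code can usually be pointed at the local server just by swapping the base URL.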

Hao Wang
I use Ollama, looks similar
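Ollama works much the same way: after `ollama pull <model>`, it serves a local REST API (port 11434 by default). A minimal sketch, assuming the default port; the "llama2" model name is just a placeholder for whichever model you pulled:

```python
import json
import urllib.request

# Ollama's local server listens on localhost:11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for the local Ollama server."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def generate(model: str, prompt: str) -> str:
    """Send the prompt and return the model's full response text."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    # Requires `ollama serve` (or the desktop app) running with the model pulled.
    print(generate("llama2", "Why run models locally?"))
```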
venkata.mutyala
Oh nice. Will check it out. Are there any local models you have been a big fan of?
venkata.mutyala
Cool, I’ll check them out. How powerful is your machine?
2023-12-06
venkata.mutyala
bricks-cloud/BricksLLM
Simplifying LLM ops in production