#ai (2023-12)

Discuss all things related to AI

2023-12-04

venkata.mutyala avatar
venkata.mutyala

I learned about this from a local meetup and it’s pretty cool: https://lmstudio.ai/

If you want to quickly run an LLM on your local machine, it's pretty simple to get started. I'm using a MacBook Pro (M1 Pro) with 16GB of RAM.

LM Studio - Discover and run local LLMs

Find, download, and experiment with local LLMs

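LM Studio also ships a local server mode that speaks an OpenAI-compatible API, by default on http://localhost:1234. A minimal sketch of calling it, assuming the server is enabled and a model is already loaded in the app; the prompt is just a placeholder:

```python
# Minimal sketch: query LM Studio's built-in local server, which exposes
# an OpenAI-compatible API (default: http://localhost:1234/v1).
# Assumes the server is running and a model is loaded in the LM Studio UI.
import requests

resp = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "messages": [{"role": "user", "content": "Say hi in five words."}],
        "temperature": 0.7,
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```
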
Hao Wang avatar
Hao Wang

I use Ollama; it looks similar.

venkata.mutyala avatar
venkata.mutyala

Oh nice. Will check it out. Are there any local models you have been a big fan of?

Hao Wang avatar
Hao Wang

Mistral or codellama

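For reference, Ollama runs a local REST API on port 11434 by default. A rough sketch of calling one of those models from Python, assuming the model has already been pulled (e.g. `ollama pull mistral`) and the Ollama server is running:

```python
# Rough sketch: hit Ollama's local REST API (default port 11434).
# Assumes `ollama pull mistral` has been run and the server is up.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "mistral",   # or "codellama", per the recommendation above
        "prompt": "Write a haiku about local LLMs.",
        "stream": False,      # return a single JSON object instead of a stream
    },
    timeout=300,
)
print(resp.json()["response"])
```
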
venkata.mutyala avatar
venkata.mutyala

Cool, I'll check them out. How powerful is your machine?

Hao Wang avatar
Hao Wang

I've got a 64GB M1; running the models feels very smooth, and it saves time.


2023-12-06

venkata.mutyala avatar
venkata.mutyala

bricks-cloud/BricksLLM

Simplifying LLM ops in production

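BricksLLM sits in front of provider APIs as a gateway (key management, rate limits, cost tracking), so clients call it instead of the provider directly. A hypothetical sketch of routing an OpenAI-style request through a locally running gateway; the port, path, and key here are placeholder assumptions from memory of the README, not verified values:

```python
# Hypothetical sketch: send an OpenAI-style chat request through a locally
# running BricksLLM gateway rather than calling OpenAI directly.
# The port (8002) and request path are assumptions; check the repo's README.
import requests

resp = requests.post(
    "http://localhost:8002/api/providers/openai/v1/chat/completions",
    # The bearer token is a key minted via BricksLLM's admin API, not an OpenAI key.
    headers={"Authorization": "Bearer YOUR_BRICKSLLM_KEY"},
    json={
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": "Hello from behind the gateway"}],
    },
    timeout=60,
)
print(resp.json())
```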