I just got Oobabooga running for the first time with Llama-2, and I have Automatic1111 and ComfyUI running for images. I'm curious about ML too, but I don't know where to start with that one yet.
For the uninitiated, all of these tools run open-source (or mostly open) models offline.
I’m running it in GPT4All (CPU-based) with 64GB of RAM, and it runs pretty well. I’m not sure what you’d need if you were running it on GPU instead.
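If anyone wants to try the same thing on CPU, here's a minimal sketch using the gpt4all Python bindings. The model filename is just an example; swap in whichever GGUF model you've actually downloaded.

```python
# Minimal sketch: CPU-only inference with the gpt4all Python bindings.
# Assumes `pip install gpt4all`. The model filename below is an example;
# if it isn't already in the default model directory, it will be fetched.
from gpt4all import GPT4All

model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # runs on CPU by default

with model.chat_session():
    reply = model.generate("Explain what quantization does to a model.", max_tokens=200)
    print(reply)
```

On a box with plenty of RAM this kind of setup is enough to play around with smaller quantized models without touching a GPU.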