Re: So I can run a local chatbot
My system has 32GB of RAM and an RTX 4070 with 8GB of VRAM. It can run all of the freely-licensed chatbots from GPT4All - but maybe that's not saying much. There's a definite difference in response quality between those models and the ones you get online from Google or OpenAI. It can run Stable Diffusion's image-generation models ... just ... as long as there's not too much else running and you use the version that's been optimised to need less memory.
As to why you'd want one: they're genuinely useful. I'm using them for language learning at the moment. They're perfectly capable of setting you exercises, correcting them, explaining what you got wrong, chatting informally, assessing formal writing, introducing new grammar and so on - all things you'd ordinarily pay a language teacher real money to do for you. Does it get the odd thing wrong? Probably. But it gives you about 98% of what a paid language tutor gives you, it's free, and it's available 24 hours a day, for a few minutes or a few hours, whenever you want it.