Want to run AI locally?
Posted: Tue Mar 17, 2026 11:52 am
by ukimalefu
Check here if you can:
Detect your hardware and find out which AI models you can run locally. GPU, CPU, and RAM analysis in your browser.
https://www.canirun.ai/
I have tried a couple of apps that run AI models locally. I have the last 16" Intel MacBook Pro model with 16 GB of RAM. It is S-L-O-W and limited. You'll want Apple silicon if you want to try this on a Mac.
Can you run AI locally?
Posted: Tue Mar 17, 2026 3:36 pm
by Maurvir
Jeff Geerling has some videos on locally hosting AI models, but as always, YMMV. Don't expect Claude with such systems.
Can you run AI locally?
Posted: Tue Mar 17, 2026 4:37 pm
by Jehannum
Did you hear about the AI model that entered the beauty pageant?
She won Miss Information.
Can you run AI locally?
Posted: Tue Mar 17, 2026 5:06 pm
by obvs
You can use Docker to run it, and it’s pretty simple.
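For anyone who wants to try this, Ollama publishes an official Docker image. A minimal sketch, assuming you have Docker installed (the model name below is just an example; pick one that fits your RAM):

```shell
# Start the Ollama server in the background, persisting
# downloaded models in a named volume and exposing its API port.
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Pull a model and chat with it interactively inside the container.
docker exec -it ollama ollama run llama3.2
```

With a supported GPU there are extra flags (e.g. `--gpus=all` on NVIDIA), but the CPU-only form above works anywhere Docker runs.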
Can you run AI locally?
Posted: Sun Mar 22, 2026 7:36 pm
by Jehannum
The real question is why you'd want to do so.
Can you run AI locally?
Posted: Sun Mar 22, 2026 8:42 pm
by obvs
Because it's easy.
Can you run AI locally?
Posted: Mon Mar 23, 2026 7:07 am
by LCGuy
I’m curious about it, but none of my machines have anywhere near the horsepower lol
Can you run AI locally?
Posted: Mon Mar 23, 2026 2:24 pm
by Jehannum
obvs wrote: Sun Mar 22, 2026 8:42 pm
Because it's easy.
I'm just wondering what the utility would be to have your own little plagiarism machine locally.
Can you run AI locally?
Posted: Mon Mar 23, 2026 5:41 pm
by juice
Jehannum wrote: Mon Mar 23, 2026 2:24 pm
obvs wrote: Sun Mar 22, 2026 8:42 pm
Because it's easy.
I'm just wondering what the utility would be to have your own little plagiarism machine locally.
I'm wondering the same. I can drink a gallon of water, fart, and confidently provide a wrong answer all on my own.
Can you run AI locally?
Posted: Mon Mar 23, 2026 11:27 pm
by obvs
Well, for one, I might have a big potential lawsuit coming up, and it spent about 7 minutes creating an extremely detailed, years-long timeline based on all of the documentation on my computer, including timestamped communications, notes, emails, and tons of other legal documentation which I hadn't thought to include. All wrapped up for the lawyers.
For two, I have a massive library of tools that I've written in one language or another for one given platform or another, and it's really useful to port them over to another platform just by asking it to do so. It's not so much plagiarism if it's literally duplicating my own work onto other platforms (which I test and verify, of course).
For three, it explains things in more specific, detailed language that works well for me, instead of in simplistic language that tends to miss details. And again, yes, I verify the information it provides.
For four, I do genealogy research, and you wouldn't believe how effective it can be at identifying people in others' family trees who match individuals in your own family tree, allowing you to copy the details from those matching individuals and then copy in all kinds of family members you didn't know about. I was doing it today, and I found my great-great-great grandma in Czechia, in a language I don't speak, and her whole extended family including both parents (and marriage records and death records, et cetera), and more and more. I still haven't seen how far back it goes for that branch, and that's just one. I've been using it to identify other branches in other countries as well. And that also led it to find photographs of many of the people whom I'd never been able to see in person. I was able to share with my family today people it found from 100–200 years ago who looked a ton like living family members. I invited my mom over, and she literally started crying at seeing how cool it was.
There are numerous other useful aspects for "my own little plagiarism machine".
Can you run AI locally?
Posted: Mon Mar 23, 2026 11:47 pm
by Jehannum
How much of that did you do on your own little plagiarism machine though?
Because once you end up using things like Claude or Grok or ChatGPT, you're beholden to the providers of tokens, which, once they're past the initial VC rush of easy money, will quickly price these technologically-dead homunculi out of usefulness, unless we've become so hopelessly dependent on them that we're willing to sacrifice all of our collective source (regardless of license), all of our collective imagery (again, regardless of license) and our water and power on the altar of turning out a few useful JPGs or (of all the stupid things) a timeline of employment for a lawsuit that will go nowhere.
Can you run AI locally?
Posted: Tue Mar 24, 2026 12:06 am
by ConnertheCat
LLMs running locally based on your own data could be quite useful, and can't really be considered plagiarism either. I know we have our own internal one at work that is trained exclusively on our own data, which is mildly useful.
Can you run AI locally?
Posted: Tue Mar 24, 2026 3:39 am
by obvs
Jehannum wrote: Mon Mar 23, 2026 11:47 pm
How much of that did you do on your own little plagiarism machine though?
Because once you end up using things like claude or grok or chatgpt, you’re beholden to the providers of tokens, which, once they’re past the initial VC rush of easy money, will quickly price these technologically-dead homunculi out of usefulness, unless we’ve become so hopelessly dependent on them that we’re willing to sacrifice all of our collective source (regardless of license), all of our collective imagery (again, regardless of license) and our water and power on the altar of turning out a few useful JPGs or (of all the stupid things) a timeline of employment for a lawsuit that will go nowhere.
Thanks. I appreciate that.
You know, now that you’ve told me that my lawsuit won’t amount to anything, I know it will turn out great!
And I’ve never thanked you for telling me so many years ago that I would never make money with my associate's degree. Over the years, every time I told my coworkers that someone in Albuquerque once told me that, we always got a good laugh.

Can you run AI locally?
Posted: Tue Mar 24, 2026 12:12 pm
by Jehannum
obvs wrote: Tue Mar 24, 2026 3:39 am
Jehannum wrote: Mon Mar 23, 2026 11:47 pm
How much of that did you do on your own little plagiarism machine though?
Because once you end up using things like claude or grok or chatgpt, you’re beholden to the providers of tokens, which, once they’re past the initial VC rush of easy money, will quickly price these technologically-dead homunculi out of usefulness, unless we’ve become so hopelessly dependent on them that we’re willing to sacrifice all of our collective source (regardless of license), all of our collective imagery (again, regardless of license) and our water and power on the altar of turning out a few useful JPGs or (of all the stupid things) a timeline of employment for a lawsuit that will go nowhere.
Thanks. I appreciate that.
You know, now that you’ve told me that my lawsuit won’t amount to anything, I know it will turn out great!
And I’ve never thanked you for telling me so many years ago that I would never make money with my associate's degree. Over the years, every time I told my coworkers that someone in Albuquerque once told me that, we always got a good laugh.

I mean, if you're going to ignore the question to attack the messenger, go on with yourself, I guess.
edit: at the time, I got the vibe from you that you were interested in a career path that was somewhat different from computer repairman, so I'm not sure I was incorrect, but was instead working without complete information.
Want to run AI locally?
Posted: Wed Apr 01, 2026 12:30 pm
by ukimalefu
Running local models on Macs gets faster with Ollama’s MLX support
Apple Silicon Macs get a performance boost thanks to better unified memory usage.
https://arstechnica.com/apple/2026/03/r ... x-support/
Want to run AI locally?
Posted: Thu Apr 02, 2026 3:04 pm
by Jehannum
WILD: https://neuromatch.social/@jonny/116324676116121930
I didn't look at the code, because I had a feeling it was just a morass of unmaintainable spaghetti on its way to a critical mass of becoming unusable, but at least now I know that's the case.
Want to run AI locally?
Posted: Thu Apr 02, 2026 4:36 pm
by Geesie
Jehannum wrote: Sun Mar 22, 2026 7:36 pm
The real question is why you'd want to do so.
I have pretty light use of LLMs and I'd rather be able to just run something locally rather than support the growth of the technofeudalists' wealth.