Reminder: Meeting June 13th 2024 at 7PM! Topic: Your own personal LLM!
Hey Gang! We've got a meeting tomorrow!!

Date/Time: June 13th 2024 @ 7pm
Location: Technocopia, 44 Portland St, 6th floor, Worcester MA
Virtual Location: (back to the old haunt) https://meet.jit.si/WlugMA

Considering how much AI has been in the news, I thought it would be cool to do a talk/demo about setting up your own personal Large Language Model! I'll either use my laptop or scrounge up a piece of good-enough hardware to show it off... hell, I could even spin up a droplet and run it somewhere... we'll figure that out.

Jansen sent me some links to the cool stuff he uses, and I'll definitely be leaning on his expertise and others'. Here's what he wrote:

  the self-hosted webui I use is here: https://github.com/open-webui/open-webui
  ollama! https://github.com/ollama/ollama
  here's the docker for ollama: https://hub.docker.com/r/ollama/ollama

I've experimented with the ollama models and they're cool. As usual we'll nerd out, wrestle with some LLM stuff, and have a great time. Afterwards we'll head off to the Boynton or somewhere else for some dinner and keep the geekery in full swing. I'll bring some snacks and refreshments as well.

Later,
Tim.

--
I am leery of the allegiances of any politician who refers to their constituents as "consumers".
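For anyone who wants to play along before (or after) the meeting, here's a rough sketch of the docker route from the links above. The commands follow the ollama Docker Hub page (CPU-only variant; the container and volume names are just arbitrary choices):

```shell
# Start the ollama server in a container, persisting models in a named volume
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Pull a model and chat with it interactively inside the running container
docker exec -it ollama ollama run llama3
```

The server listens on port 11434, which is also where open-webui expects to find it if you point it at the same host.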
Tim Keller via WLUG <wlug@lists.wlug.org> writes:
We've got a meeting tomorrow!! Date/Time: June 13th 2024 @ 7pm. Virtual Location: (back to the old haunt) https://meet.jit.si/WlugMA
Tried a few times to get in between 6:55 -- 7:10. It asks for permission to share mic/video, I say yes; it asks do I want to be moderator, I say no; it says you will get in when someone lets you in, and displays the circling circle.... I'll try again. Then give up.

-- Keith
Hi all,

I think we had a good meeting last night; it was certainly neat seeing how well llama2 and llama3 (running under ollama) are at answering questions. It was also interesting to see how they could start going off the rails a little bit when asked interesting questions, because they couldn't (in my mind!) reason about things properly. Can you tell I'm an AI skeptic in some ways? It will make big changes at some point, but I don't think it will be as much as people think right away. We're still in the hype phase, and we haven't gotten to the reality crash yet.

So hopefully we'll get some more details posted so we can all play around with llama3 on our own systems. It's a bummer I can't use my AMD GPUs, since that's all I have; no nVidia stuff at all in my house.

And for next month, what do people want to talk about? I can try to resurrect my talk on KiCad, updated for v8, which was released earlier this year. Or I can talk about 'duc', which is the software package I help maintain. It's for indexing large filesystems and displaying disk usage via CLI, GUI, or web pages. It's really useful when you have a 12TB filesystem with 30 million files, because plain 'du' just takes way too long. https://github.com/zevv/duc is where you can find it. It's been quiet for a long time, but I've recently been getting ready to update and push out a new release. Hopefully this month if I'm lucky!

John
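For anyone curious about duc ahead of a possible talk, a minimal session might look like this (subcommands are from the duc README; the `/var` path is just a placeholder for whatever tree you want to index):

```shell
# Walk a filesystem tree once and store the sizes in duc's database
duc index /var

# List the entries under a directory from the index, with a usage graph
duc ls -Fg /var

# Show which paths have been indexed, and when
duc info
```

The win over plain 'du' is that the slow filesystem walk happens once at index time; every 'duc ls' afterwards is just a database lookup.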
I haven't spent any time on this, but saw this article and thought of Brendan right off the bat. https://www.theregister.com/2024/06/19/proxmox_xcp_ng_gpu_passthrough/ This might be what he needs to run stuff at home on his GPU pass-through attempts. John
participants (3)
- John Stoffel
- Keith Wright
- Tim Keller