Reminder: WLUG Meeting tonight: Topics: Proxmox / Turing Pi 2
Hey Everybody!

We've got a meeting tonight!
Time: 7pm
Date: February 8th, 2024
Location: Technocopia, 44 Portland St., Worcester MA
Virtual Location: https://meet.jit.si/WlugMA

We've got two topics.

Topic #1: Turing Pi 2 hardware
WLUG member Zori Babroudi recently bought a Turing Pi 2 and a bunch of Pi and Rock compute modules. It's currently a work in progress; once things are totally working on it, I'd like to show it off in its full glory, but in the meantime we can marvel at the hardware. Ultimately, the plan is to run k8s on it and do some LLM-based AI stuff as well.

Topic #2: Proxmox
With Broadcom's purchase and destruction of VMware, many of us are scrambling to look for alternatives to VMware ESXi. One great contender is Proxmox. It's a Debian-based distro that's focused on providing a great interface to manage VMs running on KVM. It also has great support for video card passthrough (something I know Brandon is *very* interested in). I've got a 3-node Proxmox cluster that I will show off. John Stoffel has also been experimenting with it, so both of us can talk about the good/bad/ugly of it.

As usual, the conversation will be lively and focused around Linux, FOSS, fair use, and who knows what! Afterwards we'll head off for some dinner and keep the conversation going!

I hope to see you there!
Later,
Tim.

--
I am leery of the allegiances of any politician who refers to their constituents as "consumers".
Heya - fun meeting, cool stuff!

I meant to mention, I have a ton (literally) of decommissioned equipment at work we're looking to give/sell/donate away. Mostly Dell, but some other brands mixed in. I may have posted a partial list at some point on the list. A few older servers (R805, R300, two 32-bay Celeros NAS units, Synology RS2416+ with SAS expansion chassis), OptiPlex and Precision desktops, a few Precision laptops (that cost us between $4K and $7K when new), a 16-port KVM, about 30 monitors, keyboards, mice, etc. etc. etc.

If you know anyone who needs any more tech to play with, I have it! I also have a bunch of stuff at home that's available.

Pete Wason
Clark Labs
921 Main Street
Worcester, MA
pwason@clarku.edu
508 849 2323 (Desk)
774 922 7202 (Cell)

On 2/8/2024 2:11 PM, Tim Keller via WLUG wrote:
_______________________________________________
WLUG mailing list -- wlug@lists.wlug.org
To unsubscribe send an email to wlug-leave@lists.wlug.org
Create Account: https://wlug.mailman3.com/accounts/signup/
Change Settings: https://wlug.mailman3.com/postorius/lists/wlug.lists.wlug.org/
Web Forum/Archive: https://wlug.mailman3.com/hyperkitty/list/wlug@lists.wlug.org/message/ZL6T6X...
Bancroft School, where I work, would be very interested in the desktops and laptops. I am trying to build a Linux CAD lab, and all I have are 12-year-old Macs with 8 GB RAM. Do they have 16+ GB RAM?

On Sat, Feb 10, 2024, 9:23 PM Pete Wason via WLUG <wlug@lists.wlug.org> wrote:
That Synology is nice, enough grunt to run a fair amount of compute on it. Probably too rich for my blood, though.

soup

On Sat, Feb 10, 2024 at 9:26 PM Pete Wason via WLUG <wlug@lists.wlug.org> wrote:
What kind of KVM is it? I'd love to get one for home for my main server/desktop combo in my basement, but if someone else can really use it, I'm OK with that.

It was great meeting you, and I think we had a really fun and interesting talk Thursday. Let's try to keep up the fun! Who's up for a talk next month?

And does anyone have pointers to the image-generating stuff we talked about? It would be fun to try it out myself, and for $10 I could get a lot of fun.

John
On 2/12/24 17:10, John Stoffel via WLUG wrote:
What kind of KVM is it? I'd love to get one for home for my main server/desktop combo in my basement, but if someone else can really use it, I'm ok with it.
I was interested in the KVM to replace the... "interesting" KVM currently in my server rack. It's two nonfunctional KVMs kludged together to make one functional KVM with no docs and no monitor. One of the switches only has a functional switch, and one of the switches only has a functional keyboard, with a COTS Dell monitor essentially gaff taped on top. In general though, I'd love to see a list of equipment up for grabs! --cs
John Stoffel via WLUG <wlug@lists.wlug.org> writes:
And does anyone have pointers to the image-generating stuff we talked about? It would be fun to try it out myself, and for $10 I could get a lot of fun.
That wasn't a motion, but I second it anyway.

From the viewpoint of a jit.si watcher, that was one of the most fun demos of a program I have seen at a WLUG meeting over the decades.

But eye-candy aside, I was most interested that the demonstrator claimed to understand how it works and to have some links.

The meaning of words changes over time. More change in less time now. A cartoon-generating program is called "artificial intelligence". I can't draw cartoons, so I must not be intelligent.

The words "machine learning" were used. As I said Thursday, I did my dissertation on machine learning. At that time, it meant a computer (with appropriate programming) would input a sequence of examples, each marked as in or not in a certain class. The computer would then (we hoped) be able to correctly classify examples that had not been explicitly shown before. I didn't see anything like that in the demo.

It was said to be based on a "neural net", but neural nets have been around since Minsky started inventing words back in the fifties. What changed to make neural networks suddenly start working?

-- Keith
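[For anyone who wasn't around for that era of machine learning: the definition Keith gives, labeled examples in, a classifier out that generalizes to unseen inputs, is exactly what a perceptron does. A minimal sketch in pure Python; the toy data set and function names are invented for illustration:]

```python
# Toy perceptron: learn to classify 2-D points as in/out of a class
# from labeled examples, then classify points never seen in training.

def train_perceptron(examples, epochs=20, lr=0.1):
    """examples: list of ((x1, x2), label) pairs with label in {0, 1}."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
            err = label - pred          # -1, 0, or +1
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

def classify(weights, point):
    w1, w2, b = weights
    x1, x2 = point
    return 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0

# Labeled training examples: class 1 is "upper-right" points.
train = [((2, 3), 1), ((3, 2), 1), ((4, 4), 1),
         ((-2, -1), 0), ((-3, -2), 0), ((-1, -3), 0)]
w = train_perceptron(train)

# Points the trainer never saw:
print(classify(w, (5, 5)))    # -> 1
print(classify(w, (-4, -4)))  # -> 0
```

[The diffusion models in the demo are trained the same way in spirit (examples in, weights out), just with billions of parameters instead of three.]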
They began leveraging large language models against... any data set, as I understand it. There were a few papers that set the world on fire with their implementations. Academia was raided for questionably sourced corpora for commercial use, veiled in community gain.

I've been wrenching on the Stable Diffusion API for a few weeks; as a former professional photographer, it is shocking to me the quality you can get from very little input and common COTS GPUs. Haven't touched any of the SaaS cloud inferencing services that have come out in this gold rush, but they can do quite a lot more with the datacenter-class GPU grunt. Text-to-video (so-called "T2V") is here and getting scary.~

Bless,
soup

On Tue, Feb 13, 2024 at 4:19 PM Keith Wright via WLUG <wlug@lists.wlug.org> wrote:
"soup" == soup <soupforare@gmail.com> writes:
You have to have a recent NVIDIA GPU to get good results; I was trying with AMD GPUs and it just wasn't fast at all. So instead of spending $500 on a GPU, I think spending $20 on some cloud rental might be a good use of my money and time.

Still waiting on <blanking on his name> to post his setup instructions from New Orleans so we can all start poking at it.

Now one big issue I see is that once you generate an image, can you get a version broken into layers? Do professional artists work mostly in layers on computers so they can more easily re-compose their layouts? For some of the examples, I thought it was really neat, but I'd like to shift the result to the left, say, to emphasize the background more.

But since I'm a terrible artist without any training or much talent, I leave it to others to answer these questions.
On Thu, 2024-02-15 at 09:52 -0500, John Stoffel via WLUG wrote:
Now one big issue I see is that once you generate an image, can you get a version broken into layers? Do professional artists work mostly in layers on computers so they can more easily re-compose their layouts?
I don't know if most artists do. I do for my bad artwork. I started remaking covers of old Dragon magazines. I tend to use a lot of layers so I can move them around. Also, I think it could be neat to animate, or maybe use for a visual novel at some point. https://youtu.be/6Hm4wHMfsxc

I don't think any of the AI tools separate out the layers.

--
Dennis Payne
dulsi@identicalsoftware.com
https://mastodon.gamedev.place/@dulsi
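[A note on what the layer model Dennis describes buys you under the hood: each layer is just pixels plus an alpha channel, and flattening is the standard Porter-Duff "over" operator applied top-down. A rough pure-Python sketch on a single pixel, with invented example values:]

```python
# Porter-Duff "over": composite a foreground pixel onto a background
# pixel. Colors are (r, g, b) tuples in 0..1, alpha values in 0..1.

def over(fg, fg_a, bg, bg_a):
    """Return (color, alpha) of the fg layer composited over bg."""
    out_a = fg_a + bg_a * (1 - fg_a)
    if out_a == 0:
        return (0.0, 0.0, 0.0), 0.0
    out = tuple((f * fg_a + b * bg_a * (1 - fg_a)) / out_a
                for f, b in zip(fg, bg))
    return out, out_a

# A 50%-opaque red layer over an opaque white background:
color, alpha = over((1, 0, 0), 0.5, (1, 1, 1), 1.0)
print(color, alpha)   # -> (1.0, 0.5, 0.5) 1.0
```

[Because the operator is applied per layer, moving a layer means recompositing, which is exactly why single-image AI output can't be re-composed the same way.]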
Yep, definitely the case. Need CUDA cores; without them it's doing other and more work. It's funny, Apple silicon will probably see faster performance increases than the Radeons, just because that's where the people wrenching on this stuff are. I'm still in Intel land there for OSX, and will be for quite a while (audio software reasons).

I'm operating slowly with an RTX 2070 and agree: once you've got a process down, it is a much better value proposition to pay for cloud inferencing. Merging of large checkpoints and building intricate models aren't possible on consumer hardware until you get into silly money, and datacenter boards will always be faster in any case.

There's facility and process for separation of elements, but were I working professionally in this fashion, I would not be expecting final output from inferencing. I would use it as a stock library, at this point in the tech anyway. Even the simplest use of the thing, generating backgrounds, saves so much work and allows for so much different _kind_ of work that if that was all it did it would be a revolution.

Separation or direction of elements during inferencing has seen improvement in leaps and bounds, though. Not long ago it was just inpainting (creating masks), then autogeneration of partitions with math, then partition by token/attention, and now even automatable manipulation of masking and weights per partition. So it's definitely doable, but would require each of us to build a pipeline for our needs. Those tools are still in early stages, but ComfyUI¹ has a 'workflows' conceit for doing a lot of that with visual programming. I haven't moved on from the Gradio-based webui yet, so no first-hand experience with it. I already have pipelines for my purposes in that; the usual, no time to move to the bleeding edge~

For that particular need, the stock featureset of Automatic1111² can do growth, called "outpainting", of a given input. There are extensions to expand on steering capabilities for that, but it does a fine job in my experience, as long as the checkpoint and tokens you're using "know" something of the input.

Talking about art is like dancing about architecture, so I hope I'm being clear. Also, I should say this is all down at the consumer level; folks building huge base models or using it in enterprise are much more likely to be using PyTorch and the diffusers lib.

soup

¹ https://github.com/comfyanonymous/ComfyUI
² https://github.com/AUTOMATIC1111/stable-diffusion-webui

On Thu, Feb 15, 2024 at 9:52 AM John Stoffel <john@stoffel.org> wrote:
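[For readers wondering what the "outpainting" soup mentions actually does: mechanically it is just canvas growth plus an inpainting mask. The tool pads the image, marks the new border region as "to generate", and runs the same masked-diffusion step it uses for inpainting. A toy sketch of only that bookkeeping, with no model and an invented grid of ints standing in for pixels:]

```python
# Toy outpainting setup: grow the canvas, keep the original pixels,
# and build a mask marking the new region the model should fill in.

def grow_canvas(image, pad, fill=0):
    """Pad a 2-D grid by `pad` cells on every side. Returns (grown, mask)
    where mask is 1 wherever new content should be generated."""
    h, w = len(image), len(image[0])
    new_h, new_w = h + 2 * pad, w + 2 * pad
    grown = [[fill] * new_w for _ in range(new_h)]
    mask = [[1] * new_w for _ in range(new_h)]
    for y in range(h):
        for x in range(w):
            grown[y + pad][x + pad] = image[y][x]
            mask[y + pad][x + pad] = 0   # original pixels: leave alone
    return grown, mask

img = [[5, 5],
       [5, 5]]                    # a 2x2 "image"
grown, mask = grow_canvas(img, pad=1)
# grown is 4x4 with the original 2x2 centered; mask is 1 on the border.
```

[The real pipelines then pass the grown image and mask to a masked denoising loop; this sketch only shows why the surrounding checkpoint has to "know" the input: the generated border is conditioned on those untouched center pixels.]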
participants (8):
- Cara Salter
- Dennis Payne
- John Stoffel
- Keith Wright
- Kevin Harrington
- Pete Wason
- soup
- Tim Keller