I diagrammed out my home lab/home server setup, mostly to keep a complete overview of how everything connects. I didn’t want to get bogged down in aesthetics like colour scheme or layout – as you can no doubt tell. After a while, diagramming it started to feel like a meme where I was trying to convey some crazy conspiracy theory on a wall of pinned paperwork and connecting threads. I think I am done documenting everything. But now I am wondering how obsessive I should be about detailing every little thing, like VLANs and IP assignments. I don’t really care if it looks like a dog’s dinner; I really just care about “okay, where does this wire go?” Is that the right approach?
That’s mostly semantics, for me at least.
I have only one NAS, and one Proxmox host that is up 24/7, so they are in production.
I regularly tinker with those two as well, it’s all part of my lab.
This is how it works for me. I am using the homelab to learn new things. Part of that learning process is getting things into production and maintaining them. Because managing a production environment is one of the things I want to learn.
A bit off topic but I want to see some IRL pictures of all that stuff lol
Came here to comment this. OP can we get a house tour please?
When you have weekly change control meetings
Can you do rm -rf / on everything in your “lab”? Yes: still a lab. No: not a lab.
You diagram? I just keep throwing crap into the mix and trying to remember which VLAN and IP scheme it’s supposed to use and which device has access. Order is for work, Chaos is for personal enjoyment.
Are you running 9" displays in place of physical photos in frames? Curious how this is set up. Is there a write-up somewhere?
edit: same for “the wall” with the 6x 55" screens.
If you can turn it off and still do things, it’s a homelab. If you run services on it that are vital to your home, then it’s a home server.
The meaning of “homelab” has changed over the years. Originally it was literally just having the hardware you’d find in the lab at home. e.g. you were taking classes for a CCNA and instead of going to the school’s lab for hands-on with the hardware you’d just replicate the setup at ‘home’. Nothing in the setup would be relied on beyond the specific thing you’re testing in the moment. If you’re going to stick to the original intent of the name, anything beyond “lab” use wouldn’t be “homelab”.
Now it skews more to meaning anything you’re using to learn the technology even if you’re using it as the equivalent of production and rely on it being up as a part of your daily life.
Stops being a homelab when you have SLAs.
We need some pictures
Questions: Are you in deep shit if the cat bathroom rack goes down?
Jukebox project sounds cool. Any extra info on that?
What’s the deal with the Oh-Fuck server in the arcade cabinet?
Also, 84 cores in the arcade cabinet? Just… damn.
I am working on the build log for the jukebox project. It’ll be on GitHub eventually.
I have a $700 Tyan motherboard in my workstation. When I was moving the motherboard from one case to another, I scraped the underside of the motherboard against the metal case and broke off a number of small SMT caps and resistors. In the middle of the pandemic. In the middle of a project. So I had to jump on Amazon and have a new motherboard shipped to me next day whilst I RMA’d the damaged one. What do you say when you break your workstation motherboard in a moment of casual clumsiness? “Well… fuck!”
The jukebox is a “retro” jukebox. Wood grain, lots of tactile buttons. Two 14" 4160x1100 pixel touch screens with a vacuum fluorescent display graphic effect that shows the tracks. Click an arcade button, play that track. So it looks like those old-style jukebox devices you’d find in a diner. There are two 1920x1080 flexible touchscreens (though I have them encased, so they are just permanently curved touchscreen displays) that let you navigate the full library, show album artwork, a search box, etc. It’s all driven by a single Raspberry Pi with a 4TB USB SSD for storage, and everything syncs to the music directory on my Synology NAS.
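For anyone curious about the sync step, here is a minimal sketch, assuming the Synology share is mounted on the Pi and rsync mirrors it onto the USB SSD; the paths and sync direction are illustrative guesses, not the actual layout:

```python
#!/usr/bin/env python3
"""Hypothetical sync job for a jukebox Pi.

Assumes the Synology music share is mounted at /mnt/nas/music and the
jukebox library lives on the USB SSD at /mnt/ssd/music -- both paths
are placeholders, not the real layout.
"""
import subprocess
import sys

NAS_MUSIC = "/mnt/nas/music/"   # mounted NFS/SMB share from the Synology
SSD_MUSIC = "/mnt/ssd/music/"   # local library the jukebox UI reads from


def sync_library() -> int:
    # rsync in archive mode; --delete keeps the SSD copy an exact mirror
    # of the NAS directory. Run from cron or a systemd timer.
    result = subprocess.run(
        ["rsync", "-a", "--delete", NAS_MUSIC, SSD_MUSIC],
        check=False,
    )
    return result.returncode


if __name__ == "__main__":
    sys.exit(sync_library())
```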
Am I in deep shit? Only if I don’t clean the litter box in the “Cat Bathroom.” So the only thing that can really go wrong is the power going out. Everything else is sort of redundant, and you can route around it by moving a few cables. I guess the UDM Pro SE taking a shit would cause me some issues. Or the cable modem. Anything else going down, even though it’s used daily and to its fullest extent, simply means those services, e.g. the music server, become temporarily unavailable. No real disasters in over 20 years. The backup Synology NAS is effectively a failover for essential services, e.g. AdGuard, but even if both Synology devices are down, there’s backup DNS resolving on the UDM and also with Quad9.
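A minimal sketch of sanity-checking a layered fallback chain like that (primary AdGuard, backup AdGuard, UDM, then Quad9), assuming dnspython is installed and using placeholder LAN IPs rather than the real addressing:

```python
#!/usr/bin/env python3
"""Walk a DNS fallback chain and report the first resolver that answers.

Requires dnspython (pip install dnspython). IPs below are placeholders.
"""
import dns.resolver

# Ordered fallback chain: primary AdGuard, backup AdGuard on the second
# Synology, the UDM itself, then Quad9 as the outside fallback.
RESOLVERS = [
    ("adguard-primary", "192.168.1.10"),   # placeholder IP
    ("adguard-backup", "192.168.1.11"),    # placeholder IP
    ("udm", "192.168.1.1"),                # placeholder IP
    ("quad9", "9.9.9.9"),
]


def first_working_resolver(hostname="example.com"):
    for name, ip in RESOLVERS:
        resolver = dns.resolver.Resolver(configure=False)
        resolver.nameservers = [ip]
        resolver.lifetime = 2  # seconds before giving up on this resolver
        try:
            resolver.resolve(hostname, "A")
            return name
        except Exception:
            continue  # dead resolver, try the next one in the chain
    return None


if __name__ == "__main__":
    print(f"first resolver answering: {first_working_resolver()}")
```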
Is it 84 cores? 4 in the NUC, 28x2 in Storm, 28 in oh-fuck. Never really thought of it that way. I’d like some 8490H Xeons but I cannot justify it right now.
I don’t see a single other person mentioning it, so I’ll just say it: 52TB of flash storage alone is enough to make me jealous. 52TB of flash storage in an RV is just a few more layers on top.
Sad that the picture-wall project repository isn’t open on GitHub - I was hoping to see it in action. Seems very neat.
Holy shit… that’s quite a “home” enterprise.
When I started doing informal change control reviews with family and scheduling disruptive work outside of peak windows to avoid “user impact” - also having critical hot spares available, haha.
Lower Guk was more fun than Upper Guk.
Fight me.
Let me deal with this train to zone, then I will bring back a bunch of guards with me that you are KoS to. :-)
Did you see “The Cauldron” too?
Yes, I love the EQ references. Still my favorite MMO TO DATE, pre-Luclin.
Only game where a death would make me pretty late for work.
Having all the clients in your diagram is a bit much, but it’s your diagram. We have over 1M client endpoints and just call them “users,” or at least we used to. Things change, but I love your setup. My home lab has been in use for years. My powered-off servers and switches could make another lab for someone. Too much power draw for me.