So, why did I stop my gaming VM to play with OpenStack? Curiosity. At my current position, we have a massive OpenStack deployment spanning multiple data centers, with thousands of VMs and containers under management. After figuring out that it's pretty easy to install and play around with, I took the leap and looked at what was possible.
After it dawned on me what OpenStack was capable of, it also quickly hit me that really using it would likely not be as easy as creating the simple dev environment I now have.
To really play with it, I'll need multiple availability zones, either through multiple VMs emulating distinct hosts, or through new physical hardware. Or both... which means my wallet very quickly glared at me.
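For context, availability zones in OpenStack are exposed through host aggregates: you create an aggregate tagged with a zone name and add compute hosts to it. A rough sketch of what the multi-host setup above might look like from the CLI (the aggregate and host names here are hypothetical, and this obviously assumes a running deployment):

```shell
# Create an aggregate and expose it as its own availability zone
openstack aggregate create --zone az2 rack2

# Add an emulated (or physical) compute host into that zone
openstack aggregate add host rack2 compute-2

# Instances can then be pinned to the zone at boot time
openstack server create --availability-zone az2 \
  --flavor m1.small --image cirros --network private test-vm
```

With only one host, none of this buys you anything, which is exactly the problem.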
One option would be to split up my current server into two or three smaller ones: some number of compute nodes, and a node to act as the main storage and management server.
That idea has been around since before OpenStack became a thing in my environment. But with OpenStack, I'd finally have my central management system across multiple hosts, something UnRaid lacks. What UnRaid does have, though, is ease of use and great GPU passthrough. It also... well, just works. Its simplicity is an asset in a home environment, particularly since I have UnRaid servers at both my parents' and my cousin's homes doing various things.
For now, the OpenStack server (named Enterprise, because why not...) is just a single server. But as my hardware capacity (RAM) grows, I'll likely keep playing with it even more. Until then, though, after typing out this post, it will probably get turned off for a bit until I can think up some better use cases for it in my environment. As the image below highlights, I'm also running out of space on my SSDs in general, and new drives for this server are not #1 on my tech purchase list right now.