I first picked up an interest in computers back in middle school, and in the same way most others probably do -- through video games. There was one in particular that really ignited my interest. It allowed for custom Lua scripts in-game, and I happened to meet some people who wrote their own scripts and would freely give out programming tips. It may not have been much or very involved, but it was enough to teach me some of the foundations of logical thinking in programming. Best of all, struggling through a script and seeing actual progress, and finally the end result, was exhilarating.
But my time with the game wasn't perfect. In many games, I had performance issues.
It would run slow at times.
It would freeze.
Crash.
Sometimes even crash the computer.
That is, until I learned that driver updates were a thing.
It's a very simple problem, but your average middle school kid wouldn't know how to fix it, much less where to start looking -- especially with no one nearby who knew anything about the topic.
It took a while, but I finally figured it out.
Updating the drivers fixed all of those problems. The game ran faster. Smoother. It was stable, and so was the computer. Amazing how much can change in two years' worth of driver updates.
I then understood what a difference your drivers can make. So if the latest drivers can make that much of a difference in the hardware you run, what kind of difference would you see in changing the hardware itself?
With a middle-schooler's budget, I got some help ordering a $35 graphics card on eBay. It was quite the upgrade at the time, going from an ATI X600 to an ATI 2600 XT. The difference in games was huge. But now the processor was the bottleneck: the Pentium D 820 (2.8GHz) wasn't quite cutting it anymore. A few how-to videos and a bit of research later, I upgraded to a Pentium D 945 (3.4GHz). Again, middle-schooler's budget -- newer processors had come out, and this one was going for far less than retail on eBay.
And then I put together my own PC. And another a few years later. And another a few years after that. Occasionally one for a friend; sometimes one for an acquaintance.
Eventually, I came across the Homelab subreddit, which kickstarted my journey into completely unnecessary enterprise hardware as a playground for learning business IT.
I decided I wanted something more capable than your typical consumer router, and pfSense was highly regarded in the Homelab community. I gave it a shot, and not only was it full of features -- the stability was rock-solid. I initially set it up in a virtual machine for testing, but quickly found that a virtualized router isn't very useful when you need to take down the host machine and suddenly can't search the internet to fix the problem.
So it ended up on a mini-PC with a laptop processor. Low power consumption and separate from the rest of the network.
The router was connected directly to the switch on one of its 1Gb Ethernet ports. The switch also had a couple of 10Gb SFP+ ports that were mainly used for quick transfers between my main PC and the file server.
A couple of VLANs were set up to separate the wireless network from hardwired devices.
Connected to the switch on one of the VLAN-enabled ports. No special setup here. In my experience, the wireless routers made by Asus are very stable, and I haven't had an issue with either of the two models I've used.
The main server was beyond overkill for my requirements. Uses much more electricity than I'm happy with. I should've gone with something smaller.
It's connected to the switch in two places: one Ethernet port and one 10Gb SFP+ port. The Ethernet port sits on a separate VLAN set up so only a certain IP can access it; it's used for IPMI and for reaching the ESXi management interface. The SFP+ card handles external traffic for all of the virtual machines.
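pfSense builds its firewall rules in the GUI, but the effect on that management VLAN is roughly what these pf rules express (the interface name and addresses here are invented for illustration):

```
# Management VLAN: drop everything by default...
block in on igb0_vlan99 all
# ...except my workstation reaching the IPMI/ESXi addresses
pass in on igb0_vlan99 from 192.168.1.50 to 192.168.99.0/24
```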
VMs are hosted on the PCIe SSD and backed up to the SATA SSD. VM backups are also copied to the file server just in case.
The controller for the hard drives is passed through to the file server VM. The hard drives are in a RAIDz2 configuration to get a combination of speed (300MB/s) and reliability.
This one was used mainly as backup for the main server.
There are all sorts of enterprise-level features available on this platform. I may not have made much use of many of them but I definitely got everything I wanted out of it, specifically:
This handled my internet speeds of 400Mbps down and 40Mbps up with ease, and QoS made it even better: I could download at nearly the full speed of my connection without running into latency issues in games or other latency-sensitive applications.
With ESXi as the hypervisor, everything ran smoothly.
For a while, consumer Nvidia GPUs could be passed through to a VM and they would work fine. I know that stopped being the case for a while. I'm not sure if anything has changed.
This had Active Directory Domain Services enabled and hosted my domain. My other VMs were tied to this where possible so I could use the one login for everything.
No major setup was done here as I had no real use for much of it. I ended up learning much of what I know about Windows Server from one of the jobs I've had.
This is the file server OS I used. It has the 10Gb card and the SATA controller for the hard drives passed through to it.
The hard drives are set up in RAIDz2, which is comparable to RAID 6: it's software-based, dedicates two disks' worth of capacity to parity, and can handle ANY two disks failing. And because ZFS is software-based, I can wipe out the OS, start fresh, and import those disks in their existing configuration as if nothing had happened.
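In ZFS terms, both the initial pool creation and the later reimport come down to a command each. A sketch -- the pool name and disk device names are made up:

```shell
# Create a double-parity pool: any two of these disks can fail
zpool create tank raidz2 da0 da1 da2 da3 da4 da5

# After an OS wipe and reinstall, the pool comes back intact
zpool import tank
```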
Permissions were set so I had a public folder and a private folder. The VM was AD-joined, so I could set extra permissions as needed and log in using my AD credentials.
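If the shares were served over Samba (an assumption -- the protocol isn't named here), the public/private split would look something like this smb.conf fragment, with made-up share paths and account name:

```ini
[public]
   path = /mnt/tank/public
   guest ok = yes
   read only = no

[private]
   path = /mnt/tank/private
   guest ok = no
   # restrict to my AD account (placeholder name)
   valid users = HOMELAB\me
```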
I lost this VM a few months after setting it up due to allocating too little space to it. It quickly filled up and I didn't have the knowledge at the time to expand the disk properly.
While it was working, it received data from my VMs, the server itself, and pfSense.
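On a typical Linux syslog VM, remote intake is just a listener plus a per-host file rule. A sketch assuming rsyslog (the port is the conventional syslog default; the path is made up):

```
# /etc/rsyslog.conf excerpt: accept remote syslog over UDP 514
module(load="imudp")
input(type="imudp" port="514")

# Write each sender's messages to its own file
template(name="PerHost" type="string" string="/var/log/remote/%HOSTNAME%.log")
*.* ?PerHost
```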
I probably should have set up a separate VM for log intake.
This was connected to Zabbix to more clearly show the data that was being received. I hope to get this set up again at some point in the future.
I spun up a few Windows 10 VMs and Ubuntu/CentOS VMs to test out various things over time.
Manually setting up each one became a bit of a hassle, so I began to look into Ansible and the like. I made a bit of progress before being stopped by the free ESXi license, which restricts the API to read-only. I got a better license later on but never came back around to finish the setup.
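For reference, the kind of playbook this was heading toward -- cloning a test VM through the vSphere API with the `community.vmware.vmware_guest` module. Every hostname, credential, and VM name here is a placeholder:

```yaml
- name: Clone a test VM from a template
  hosts: localhost
  gather_facts: false
  tasks:
    - name: Deploy from template (all names/credentials are placeholders)
      community.vmware.vmware_guest:
        hostname: esxi.lab.local
        username: root
        password: "{{ esxi_password }}"
        validate_certs: false
        datacenter: ha-datacenter
        name: test-vm-01
        template: ubuntu-template
        state: poweredon
```

This is exactly the kind of write operation the free license blocks, which is why it needed the paid one.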
The SSDs in this server were placed in a RAID 1 configuration.
It ran ESXi as the hypervisor and Windows Server 2016 as the only VM.
This second copy of Server 2016 acted as an additional domain controller, providing redundancy for the Server 2016 VM on the main server.
With a paid VMware license, VMs could be live-migrated to this host using vMotion.
Various hardware and software went into building my own business-like IT environment -- a project that took place over many years and has served many purposes. For years to come, it will continue to serve as a test platform for learning the newest technologies and becoming more familiar with the capabilities of such an environment.
The experience gained here has been very valuable to me so far in my professional career and I don't see this ending any time soon.