I feel like I wasn’t looking, and then the world changed.
A brief history of Linux: In 1991, a 21-year-old Finnish computer science undergraduate at the University of Helsinki named Linus Torvalds announced that he was developing a free alternative to Unix, the operating system developed, trademarked and sold by AT&T (Bell Labs), with a widely used variant later coming out of the University of California at Berkeley. What began as a project to provide an affordable (free) computer operating system to interested hobbyists has become one of the most dominant OSes today.
As one might expect of an experimental bit of computer code, it didn’t gain immediate acceptance. I remember attending a Linux User Group (LUG) meeting sometime in the 1990s and coming home with CDs of the software, for no more than the cost of the CDs themselves. Certainly cheaper than buying Windows!
But it was complex and required something of a programmer’s mindset, whereas Apple and Microsoft continued to focus on making their computers more “user friendly,” so that the geek factor wasn’t necessary.
For a number of years, I focused on my work, which consisted mainly of working in Unix, Windows, and occasionally Mac environments. Then, sometime around 2008, I found myself with an older Mac laptop, an iBook, which could no longer run Apple’s latest and greatest operating system. I liked traveling with this portable, so I looked for an alternative. That alternative turned out to be Ubuntu Linux, created by Canonical, a company founded by South African entrepreneur Mark Shuttleworth, and first released in 2004.
(Side note: Ubuntu’s version numbers follow its release schedule and are notated as two-digit year, dot, two-digit month; version 20.04, for example, shipped in April 2020, and the very first release, in October 2004, was 4.10.) There was a version built for the PowerPC chip, the CPU Apple used at the time, so Ubuntu became usable on Mac hardware. In fact, I installed it as a dual-boot system, so I could choose either Apple or Linux at startup. I have a memory of sitting in a shopping mall in Las Vegas outside an Apple Store, using their wi-fi, but on a Mac running Linux!
Linux took a back seat in my computer pursuits for a while, as I had no real use for it. But I did keep my hand in, using the nifty Parallels Desktop for Mac virtualization software. In fact, I started when this product was at version 5, and as of this writing, version 15 is current! Virtualization allows one to set up a machine-within-a-machine. These days “containerization” is all the buzz, with terms like Docker and Kubernetes being tossed about; containers are essentially another form of virtualization, done at the operating-system level rather than by emulating a whole machine. Using Parallels, I would download an interesting-looking Linux “distribution” (the Linux term for a packaged release of the kernel plus tools and applications) and create a virtual machine (VM) running it. As my work became more and more online-based, I found it handy to install a (legal) copy of Windows into a VM, thus allowing me to use Windows-specific capabilities.
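To make the container point concrete, here is a minimal sketch (not something from my own setup) using the Docker SDK for Python; it assumes Docker is running locally and the docker Python package is installed, and the image name and command are purely illustrative.

```python
# Minimal containerization sketch using the Docker SDK for Python.
# Assumes Docker is running locally and `pip install docker` has been done.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Run one command inside a throwaway Ubuntu container and capture its output.
output = client.containers.run("ubuntu:20.04", "uname -a", remove=True)
print(output.decode().strip())
# The kernel it reports is the *host's* kernel: containers share the host OS,
# which is what distinguishes them from a full virtual machine such as a Parallels VM.
```

That last comment is the practical difference in a nutshell: a VM boots its own kernel, while a container borrows the host’s.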
Some six years ago, my home office decided that having a computer server in our local office would be a good idea, so we purchased a Dell PowerEdge T420. We specified no operating system pre-installed, because Windows would have added to the cost, and I wanted to run Linux instead. Two Intel Xeon E5-2430 v2 processors, 32 GB RAM and 2 TB of hard disk space. Although not top-of-the-line, it was definitely a server-class computer. On it, I installed CentOS 6.x. Short for Community ENterprise Operating System, this is a free “downstream” rebuild of the enterprise-level Red Hat Enterprise Linux (RHEL). It was a solid system for the years I used it.
However, things change. CentOS released version 7. Sadly, there was no in-place upgrade path, which meant a complete re-install would be necessary to stay current. I didn’t bother. Then came version 8, along with notice that version 6.x would reach end of life in November 2020. That’s just four months from today. Well, I’m only running a sandbox server, so I could probably have kept on running version 6, but I don’t like the idea of running unsupported software. Because ultimately, something breaks. Murphy claims, “at the worst possible time, too!” So, I decided to take the plunge.
This was an opportunity to upgrade the server memory as well. Thirty-two gigabytes was a lot when we bought the machine, but the software produced by my employer now requires a minimum of five servers with a combined minimum of 56 GB of memory. Why not add another 32 GB while I was upgrading? I searched out suitable memory modules and decided to go whole hog, adding 64 GB for a total of 96 GB. The cost? Less than $150. Now, using virtualization software (VMware Workstation Pro), I can run all five servers inside my one machine and still have eight gigabytes of system memory as “breathing room!”
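If you want to sanity-check that kind of budget before buying memory, a few lines of Python will do it. The per-VM figures below are hypothetical placeholders (the only real numbers are the 96 GB of host RAM and the 56 GB combined minimum); adjust to taste.

```python
# Back-of-the-envelope VM memory budget. The per-VM allocations are hypothetical
# placeholders; only the 96 GB host total and the 56 GB combined minimum are real.
HOST_RAM_GB = 96
REQUIRED_MINIMUM_GB = 56

vm_allocations_gb = {          # illustrative allocations, not my actual ones
    "server-1": 16,
    "server-2": 16,
    "server-3": 16,
    "server-4": 16,
    "server-5": 24,
}

allocated = sum(vm_allocations_gb.values())
assert allocated >= REQUIRED_MINIMUM_GB, "allocations fall short of the required minimum"
print(f"Allocated to VMs: {allocated} GB")
print(f"Left for the host: {HOST_RAM_GB - allocated} GB")
# -> Allocated to VMs: 88 GB
# -> Left for the host: 8 GB
```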
Installing the memory was a breeze. The machine has 12 slots for memory, in two banks of six (to support the two processors), allowing for a grand total of 384 GB (12 x 32 GB)! But then the problems began when I attempted to install CentOS 8.2. After several failed attempts, I reached out to the CentOS community support forum, where I learned that the Dell hardware was now too old for CentOS 8 and no longer supported.
Huh?
Okay, I’m going to try to be understanding here, but it isn’t easy. One of the supposed benefits of Linux, at least to my understanding, is its excellent compatibility with older hardware. Yes, I get it: Red Hat wants to be at the cutting edge of technology, to keep its offering current and powerful, but Linux has shown a remarkable adaptability to different chip architectures, storage, networking, and other technologies. It seemed the “offending” component in my case was the disk drive controller, something Dell refers to as its PowerEdge RAID Controller (PERC). Like so much else, improvements have been made to this part of the computer, and Red Hat decided to remove support for the older versions.
What to do? After looking at the product support matrix, I decided to look for another Linux, and settled on Ubuntu once again. Another major player in the Linux marketplace, Ubuntu has probably done more to make Linux mainstream than any other distribution. And I learned they’d just released their latest Long Term Support (LTS) version, 20.04. So, I downloaded it and attempted to install it.
Nope.
After a bit of hair-pulling and research, I gathered that once again the problem was the disk controller, or rather the way Linux could (or could not) handle the disk array. The solution I found was to partition the disk so that Linux’s core directories lived on a slice the installer could handle, with the remaining space set aside purely for storing data.
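For the curious, here is a rough sketch of the kind of split I mean. The mount points, sizes, and filesystems below are hypothetical rather than my exact layout; the point is simply that the system directories get a modest, installer-friendly slice while the bulk of the array is left as a plain data partition.

```python
# Hypothetical partition layout illustrating the idea: a small, installer-friendly
# slice for the core system, and the bulk of the 2 TB array reserved for data.
partitions = [
    {"mount": "/boot/efi", "size_gb": 1,    "fs": "fat32"},  # boot loader files
    {"mount": "/boot",     "size_gb": 2,    "fs": "ext4"},   # kernels and initrds
    {"mount": "/",         "size_gb": 100,  "fs": "ext4"},   # the core directories
    {"mount": "/srv/data", "size_gb": 1800, "fs": "ext4"},   # everything else: just data
]

for p in partitions:
    print(f"{p['mount']:<10} {p['size_gb']:>5} GB  {p['fs']}")
```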
So, that’s what I did. Overall, I’m happy that I had the opportunity to learn much more about Linux, file systems, disk controllers, computer hardware and a host of other items. But it seems that while I wasn’t looking, Linux grew up. And the result isn’t a golden swan. It isn’t an ugly duckling, either, but installing and maintaining Linux has become a whole lot “geekier” than it used to be.