Friday, October 21, 2011

Linux vs. Windows... redux redux redux

A colleague sent me a link today to an article on a ZDNet blog discussing a particular failing of Linux (and implying, without really supporting the argument, that Windows somehow accomplishes this better). The author seems to be trying to make the following points:
  1. Keeping up with all of the latest-and-greatest developments in Linux takes a lot of time, arguably more time than he can spend on it.
  2. "Bleeding edge" is highly dependent on exact versions of packages under development, and getting those versions wrong breaks everything.
  3. Linux distributions are moving targets and commands (or command lines) that work on one version may not work on successive versions.
  4. Updating the operating system can break custom-compiled software that you install on the system.  He claims it makes the system unbootable.  I am skeptical about this.
  5. His ISP claims they never update their CentOS machines, because it breaks them.
First, let me say that perhaps Mr. Gewirtz is earnest about what he did and what the effects were, but descriptions of things he supposedly did, like "recompiling the package manager," make no sense, so it's difficult to be certain which parts of his post are fact and which are exaggeration.  I've tried to give him the benefit of the doubt.

His first four points are absolutely true.  But they're not really "points" because they are obvious, and the solutions are equally obvious.  If you don't have time to keep up with all the latest developments in Linux, then don't!  I've arguably used Linux almost as long as anyone (since around 1992, my first kernel version was in the 0.96pl series, and I installed my first Linux on a 386SX from floppies -- it was the MCC distribution, which predates Slackware!) and I certainly don't have time to keep up with all the Linux trivia.  So I don't.  It doesn't stop me from running a few Linux boxes and knowing what I need to know to run them.  99% of the arcane details that might be interesting about Linux are not actually necessary to use it.

Likewise, there have always been "bleeding edge" versions of everything on Linux, and if you want to run them, there's generally some pain involved.  So if you don't want to put in the effort, don't run the bleeding edge!  Wait a bit for it to get stabilized and tested and sorted out, and you'll be in a much better position to have it "just work" like you're hoping it will.

The complaint that commands stop working between distribution versions is sort of silly to me.  It's true, but it's true of everything.  Solaris 10 doesn't support a lot of the commands that worked on SunOS 3, for obvious reasons (although admittedly, Sun does a remarkable job of making it work well, with the /usr/ucb tree of SunOS-type commands to complement the /usr/bin SVR4 versions).  Even Windows doesn't solve this -- how many complaints have you heard over the years about Microsoft changing the UI in Windows?

Updating the OS definitely has the potential to break custom software.  This is equally true of Windows, IMHO, although admittedly Linux is a faster-moving vehicle, so it happens more often.  Also, coming from the open-source paradigm, it's easy for Linux aficionados to feel that simply recompiling the software against the upgraded OS is easy, since most things have available source.  I have a mail/web server that I originally built in 1998 that has been running Red Hat 6 since it came online.  I custom compiled the mail server, and the web server, and the SSL libraries and the PHP modules and the Perl modules, etc., etc., ad nauseam.  I literally cannot upgrade this server because everything will break if I do.  I've lived with that for over 10 years.  I've hardened it as much as I can, firewalled it, don't let many people log into it, and it's been okay for that long.  The operating system has outlasted 2 PCs and 2 hard disks.

One day, I will have to build a new server to replace that one, and when I do I will do it differently.  When I built this server, there was not really any such thing as Linux security updates.  If you wanted the latest SSL holes patched, you compiled your own SSL.  Today I'd never do this.  Every major distribution has a mechanism for distributing security (and other) updates, and if you update within the distribution's own software, it's not going to break.  If I had to rebuild my server today, I'd put Debian on it.  I'd apt-get install apache and something for mail (I am a long time qmail user but I recognize that there are alternatives that didn't exist in 1998 when I chose qmail).  And I'd painlessly take updates from the vendor, easy peasy.
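A sketch of what that rebuild would look like, assuming current Debian package names (apache2 for the web server; postfix stands in here for whatever mail server I'd actually end up choosing, since qmail isn't packaged the same way):

```shell
# Install the web and mail servers from the distribution's own archive,
# so security fixes arrive through the normal update channel
apt-get update
apt-get install apache2 postfix

# From then on, staying patched is one line (or a cron job)
apt-get update && apt-get upgrade
```

The point is that everything on the box comes from the distribution, so the distribution's security team is effectively maintaining it for you.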

Finally, regarding updates for CentOS breaking the system, it's definitely unfair to paint all Linux distributions with the same brush because of something that happened on one.  I have been updating Debian and Ubuntu for years and years and while I've had some problems (trust me, trying to figure out why your apt-get dist-upgrade failed and what sort of messed up state it left you in is no picnic) it's gotten much better every time I've tried it and I've had no problems for several years.  I don't run CentOS or Fedora or Red Hat so I can't speak to them, but claiming you can't update Linux because CentOS sucks is like saying word processors are crap because you don't like Google Documents.
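For what it's worth, when a dist-upgrade does leave the system in a messed-up state, the usual recovery sequence on Debian/Ubuntu is something like this (a sketch, not a guarantee -- what's actually broken varies):

```shell
# Finish configuring any packages that were unpacked but never set up
dpkg --configure -a

# Ask apt to repair broken or unmet dependencies
apt-get -f install

# Then retry the upgrade
apt-get dist-upgrade
```

Tedious, yes, but recoverable -- which is rather different from "updates break the system, so never update."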

And, I must say that if an ISP told me that they never applied updates to their systems, I would find a new ISP.  The only exception would be if, as I suspect, they skip updates not because updating is dangerous or risky, but because they don't maintain the servers at all -- and so of course they never update them.

Either way, it sounds to me like the author wants to use Windows rather than Linux, and I'm gracious enough to say that for a lot of things Windows is very capable.  But don't make the mistake he made and confuse the quality of an operating system with your personal measure of its ease of use.
