Time for a change
I rarely write about computers. I use computers for a significant portion of every day. I use computers to follow the news in fields that interest me, to keep in touch with far-flung friends and acquaintances, and - once in a while - to update this web site.
I also use a computer at work where, for forty hours or so every week, I write software to automatically classify the text of web sites. These classifications are ultimately used by ISPs to offer content-filtering services for parental controls and for workplace safety and productivity. Some of the software that I wrote at the height of the dot-com boom is still working behind the scenes on behalf of hundreds of millions of web surfers.
I've mentioned before that I have almost no interest in using a computer for music production; that's still true. I'll use a computer for ancillary tasks like burning CDs, uploading files and occasionally attempting to notate a score or lead sheet, but my day-to-day musical activities are largely computer-free.
The articles on this site, if nothing else, have served as a catalog of changes and observations related to things of importance to me. Over a ten year period, for example, I've gone from doubting that digital modeling could replace traditional gear to fully embracing the technology. I've noted changes in musical performance, recording and distribution that have paved the way for the wider adoption of new musical technologies.
I don't lament the passing of the "good old" days. I'm not one to romanticize the past. Tube amps were great in the days when they were the only game in town. Tube amps were still great when the engineers who designed solid-state guitar amplifiers failed to understand the common use cases: amps pushed to the point of distortion, and speaker cables maintained so carelessly that short circuits routinely destroyed the output transistors. Tube amps recovered from a minor setback in the late '70s and early '80s - during a brief reign of clean power - to become the preferred form of amplification throughout the '90s and the '00s.
Digital modeling, through many of its early generations, lacked authenticity. Despite the fact that many performers embraced modeling technology - their audiences largely unaware of the change - tube amps continued to dominate.
Today, a change is afoot. The market for live music is fragmenting. Big-budget stadium shows will never disappear; there's something strangely appealing about the visceral thrill of witnessing a performance along with thousands of other enthusiastic fans. There's a lot of growth, though, in small and mid-sized performance venues and house concerts. Most of these alternative venues are located in communities where the neighbors won't tolerate the aural assault of a tube amp cranked into its "sweet spot".
The biggest future growth in MI gear purchases will come not from performers who tour with dozens of supporting staff, but from amateurs and small-time professionals alike who must satisfy family, neighbors and venue operators - all of whom insist that music be made at a sane volume.
While tube amps won't disappear tomorrow, or even next year, the market will eventually give up on unsatisfying attenuation devices, inflexible small amplifiers, and unreliable tubes. It's not a question of whether; it's a question of when. It's time for a change.
I'm not very good at predicting change. I'm skeptical, and often resistant. But I do eventually pick up on trends that make a difference to the things in which I'm interested. And today I'd like to tell you something about computers that I never thought I'd say...
My involvement with computers goes way back. I built my first computers from components purchased from electronics distributors, before there was even such a thing as a computer kit. I once paid a thousand dollars for a used five megabyte drive, at a time when that was a bargain and five megabytes was an almost unimaginably large amount of storage for a home computer. I designed and built numerous devices around early microcontrollers from Intel, Zilog and others. I spent a couple of years working in the home video game industry in its early days, when the state-of-the-art consoles had 128 bytes of RAM and 4K bytes of ROM.
As the years rolled on, so did the magnitude of my computer projects. I ran a tools support organization for several hundred programmers. I've written mission-critical software for use by ISPs, RBOCs, large corporate networks, and mortgage lenders.
I've had some manner of functional (as opposed to experimental) computer in my home continuously since 1980. That includes some kind of network connection. I've been on "the internet" since the days when one could list the host names of every attached computer on a single sheet of paper.
In 1985 I bought my first Apple Macintosh computer. It had, IIRC, a nine-inch screen, one floppy disk drive and a half-megabyte of RAM. I was younger, then; I kind of enjoyed being an early adopter of new computer technology.
I've admired all things Macintosh since I sat in front of one of the first production Macs in late 1984. There has always been a certain elegance of design and function in a Macintosh that hasn't been seen in other vendors' systems. Yes, there have been associated costs and inconveniences. But I bought into the Macintosh ecosystem and had been largely happy with my choice for over twenty-five years.
Over the last several years, though, Linux has caught my attention. Linux is not new. Not by a long shot. Linux had its beginnings twenty years ago. Its progenitor, Linus Torvalds (whom, despite his being only two degrees of separation away in my social circles, I've met only twice) borrowed from the work of Andrew Tanenbaum, an academic who had based some of his writings on the study of Unix, which in turn originated at Bell Labs way back in 1969. So when you hear about Linux and think of it as a latecomer to the party, you're only partly correct. Linux has deep roots.
The reason you might think that Linux is a latecomer is that, when it comes to desktop computers, it is a latecomer. Linux is widely used in embedded systems from appliances to network hardware to smart phones. In all of those cases, Linux is invisible. You neither know nor - unless you're a technology geek - care that it exists.
For those of us who seek out new computer technologies, Linux has been there for the asking for twenty years. You really had to want Linux in the early days, and were likely to learn more about its underlying operation than you cared to know in the process of making it work for you. Once (and often, if) you overcame the installation hurdles, there were still lots of rough edges related to its daily use. That's gravy for techies who like to tinker, but a showstopper if all you really wanted to do was to surf the web, listen to music, and balance your household budget.
Linux is packaged in "distributions", or "distros". Distros are a way of delivering Linux components that's supposed to be relatively easy to use; everything's prepared in a way that a few clicks or commands should install a working system or - once the system has been installed - an additional application. Of course, there are the inevitable problems caused by the presence of unanticipated hardware or incompatible combinations of hardware. These are the same problems that enable a thriving service industry for Microsoft products, and the same problems that Apple neatly disposed of by the obvious expedient of eliminating unanticipated hardware variables.
Over the years there have been a large number of distros, competing on features, support services, intended use, and other attributes. Few, if any, of these distros would have appealed to - and I say this with both love and recognition - anyone but IT professionals and dedicated propeller-heads.
In late 2004 Canonical Ltd. introduced Ubuntu Linux, named after the Southern African philosophy of humanity towards others. If you'd expect that philosophy of humanity to somehow translate into a computer operating system that's easy to use, you'd be right. Ubuntu has become the most popular Linux distro for desktop computers, with (as of this writing) more than twelve million users accounting for fully half of the desktop Linux market share.
I tried Ubuntu numerous times in the early years, running it on retired Macintosh computers. From day one it was apparent that Ubuntu might become the free OS that would appeal to long-time Macintosh users like myself. The installation and setup was close to being trivially simple, and the user interface was always pretty good, deftly combining some of the best features of Mac OS and Windows.
But the early days also saw a bewildering array of post-installation choices with respect to the application software ecosystem; it was too easy to put your system into a compromised state by installing conflicting applications or, more specifically, applications that required conflicting changes to the underpinnings of the system. Some of the hardware on my old Macs simply couldn't be made to work under Linux; that was less a failing of Linux than a consequence of Apple's secrecy, which left developers to puzzle out drivers without any cooperation from the hardware's maker. Updates, while automatic, were risky. More than once in the early days I made my Linux-on-Mac systems unbootable by accepting recommended software updates.
Despite all those growing pains, Ubuntu Linux showed promise. Frankly, the experience was still better than I had encountered while working with the first two decades of Windows releases...
When Sun Microsystems made VirtualBox freely available, I started looking again at Ubuntu Linux releases. VirtualBox emulates a standard hardware platform on top of a "host" operating system. The host can be Mac OS, Windows or Linux... whatever's convenient and available to you. The emulated "guest" can run any other OS of your choosing. In my case, the host was a Macintosh and the guest was Linux.
At the beginning of this year I started using Ubuntu Linux full-time at work. At home, I started running Ubuntu Linux inside VirtualBox on my MacBook. The more I worked with Ubuntu, the more I came to appreciate its unique features.
Eventually I found myself starting up Ubuntu in VirtualBox, making it fill the MacBook's screen, and ignoring the Mac OS almost completely. At that point, I knew that it was time for a change.
My intent was to migrate from Mac OS X to Ubuntu Linux. The question was: how? I knew that I didn't want to run Linux on top of Mac hardware; Apple still doesn't cooperate with the Linux development community by providing hardware specs. Attempting to run Linux directly on my Mac (without VirtualBox as a shim) was expected to be a lot of work. Besides, there's nothing wrong with the Mac OS; it's a great match to the hardware. May as well keep them together...
My attention turned to seeking pre-packaged Linux systems. I learned two things from my search: (1) the choices are extremely limited, and (2) one pays dearly for the privilege of having someone else install a "free" OS for you. In some cases, the upcharge was an additional 50% over the nominal "street price" of the hardware. In other cases, the markup was less but the system was delivered "as is". Linux installs and runs well on a lot of commodity hardware, but if I'm going to pay someone to deliver a fully-functional system, I'd like them to stand behind it for a year or more.
My next tack was to find a suitable hardware system and install Linux on my own. This turned out, again, to be more difficult than it would seem. I wanted to find a system that I knew was going to work with Linux. Obtaining that compatibility information proved neither easy nor fruitful. One can get an idea of which hardware components are supported, but in attempting to match that against information supplied (or, more accurately, not supplied) by hardware vendors, it was too complicated to account for all of the variables. This, I suppose, is why the vendors of pre-packaged Linux systems can get away with charging so much for their expertise.
Eventually, I came to appreciate some of the necessary constraints. Bleeding-edge hardware is always risky. Linux developers get their hands on new hardware only after it has been qualified and shipped for use with the dominant OS; then they have to play catch-up if there are any surprises. Bleeding-edge hardware is also, by definition, bloody expensive compared to its less-glamorous older siblings. Going back even a year or two on the technology timeline offers reasonable assurance that the Linux developers have smoothed the rough edges of hardware integration, and saves the buyer a bundle. The performance sacrificed by going back only a year or two is modest, and can be countered by buying a higher-end machine than you could have afforded at the bleeding edge.
But that's where my understanding ran out... After twenty-five years of buying hardware from Apple, where one's choices are limited to "small, medium or large" (or, perhaps more perspicaciously, how much one can afford to pay), I was lost when it came to understanding the bewildering array of choices in commodity hardware. After a week and a half of thrashing, I decided to turn the hardware choices over to an expert.
I took my wish list to Alex at Old Town Computers, a small but well-recommended shop inside Backspace in Old Town Portland. I told Alex that I wanted a kick-ass software-developer's machine loaded with RAM, a multicore CPU and a decent-sized cache. I knew that I wanted to avoid the Sandy Bridge architecture; the Ubuntu developers expect to catch up with it toward the end of the year. I wanted two 1 TB drives, a CD/DVD R/W drive, a media card reader, WiFi, and multiple USB ports in a quiet mid-sized tower enclosure. I didn't care much about graphics horsepower - as I'll never play games on this machine - but I did want a good 24" monitor, having become accustomed to the quality of Apple displays over the years.
Over the space of an hour and a half, Alex asked some clarifying questions, explained various options and drafted two proposals while I visited a downtown music store and later sat and nursed a beer at Backspace.
To say that I was pleased with Alex's proposal would be a vast understatement. I ended up with a screaming system for about what I would have paid for a low-end MacBook plus AppleCare.
A week later (allowing for delivery of components), David at OTC assembled my system, installed Ubuntu 11.04 and ran a 24-hour burn-in.
Upon arriving home, system setup consisted of plugging in the peripherals and applying power. Everything "just worked"...
Fair warning: this is a theme that will be repeated over and over: everything "just works". This is something I have come to expect from Apple systems over the years. With Apple's closed hardware ecosystem, one is rudely surprised - if not indignant - when something doesn't "just work". I was prepared to set my sights somewhat lower for an open-source OS running on commodity hardware; it's tough for the developers to cover all the bases when the ecosystem is so diverse.
After powering up the system, my first and only challenge was to make the WiFi work. This was a crucial first step since I don't have a wired connection between my work room in the basement and the "networking closet" (which is an admittedly grandiose term for the corner of the sewing room closet that hosts a cable modem and an Airport Extreme) upstairs. With the MacBook Pro by my side, I found a web page that described the necessary one-line configuration file addition; this took all of thirty minutes to find and implement.
Next up was importing my personal files from the MacBook Pro. I thought I'd be able to pull these in from my Time Machine backups. That turned out to not be as easy as I'd thought. I had assumed that Apple ran rsync under the covers of Time Machine, making incremental backups with hard links. But I found data files where I expected hard links. Maybe they were thinly-disguised soft links. Regardless, my thoughts of importing directly from the Time Machine backup evaporated. Instead, I copied files from the MacBook Pro to a USB hard drive, then from the USB hard drive to the Linux machine.
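My hard-link hunch was easy to check, by the way: a file that's genuinely hard-linked between snapshots reports a link count greater than one and shares an inode with its twin. Here's a minimal Python sketch of the test I had in mind, using a synthetic two-snapshot layout (not real Time Machine data) built the way an rsync-style incremental backup would build it:

```python
import os
import tempfile

# Build a miniature "backup" layout: two snapshot directories in which
# the unchanged file is hard-linked rather than copied - the layout an
# rsync-style incremental backup would produce.
root = tempfile.mkdtemp()
snap1 = os.path.join(root, "snapshot-1")
snap2 = os.path.join(root, "snapshot-2")
os.makedirs(snap1)
os.makedirs(snap2)

original = os.path.join(snap1, "budget.txt")
with open(original, "w") as f:
    f.write("household budget\n")
os.link(original, os.path.join(snap2, "budget.txt"))  # hard link, not a copy

# A hard-linked file reports a link count > 1 and the same inode number.
st1 = os.stat(original)
st2 = os.stat(os.path.join(snap2, "budget.txt"))
print(st1.st_nlink)              # → 2
print(st1.st_ino == st2.st_ino)  # → True
```

Had Time Machine worked the way I'd assumed, the files in successive backup directories would have passed exactly this test; mine didn't.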
Oh, did I mention that plugging a Mac-formatted hard drive into the Ubuntu box just works? It does. It's mounted as a read-only file system, but that was all I needed in this case.
Importing my photo and music libraries from the data imported from the Mac: just worked.
Reading Canon Raw photos: I installed Gimp and ufraw; they just worked.
Printing to my Epson photo printer: just worked.
Recognizing a Wacom tablet: just worked.
Recognizing a scanner: just worked.
Playing various kinds of media files: I used the software center to load a package of non-free codecs; they just worked.
Ripping an audio CD: just worked.
Writing a data CD: just worked.
Setting up a backup system: one click to install from the software center; another few clicks to configure; it just works.
The list of things that "just work" goes on and on and on... The only thing so far that hasn't worked is syncing files to my iPhone 4. Until the developers crack Apple's proprietary protocol, the iPhone 4 (as well as the iPad and latest-generation iPods) will remain a read-only device. However, I had no problem syncing my first-generation iPod Touch. (Now I'm glad I held on to that...)
Most of the work in setting up the new machine has been related to data migration. I've always been fairly cognizant of the risks of committing important data to a proprietary file format, so I restrict my use of applications that don't offer cross-platform support. Still, I fell prey to the sexy good looks of Apple's spreadsheet - Numbers - for a few frequently-updated lists. I was fortunate that I could export those files in a format readable by Libre Office on Ubuntu.
Apple's Safari web browser stores webloc files (created by dragging an URL from the browser to the desktop) in a binary format. With a bit of searching, I found a Perl program designed to convert those files to a text format, then subverted the program to instead launch Firefox upon clicking a webloc file. In the two hours that it took me to do that, I probably could have manually translated most of the webloc files. But writing the program was more fun and a lot easier on my hands...
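For the curious, the idea behind the conversion is simple. My actual program was in Perl, but here's a hedged Python sketch of the same idea (the sample file below is synthetic, not a real Safari artifact): a .webloc file is a property list whose payload is a single "URL" key, and a plist reader can pull that URL out whether the plist is stored as XML or in Apple's binary format.

```python
import os
import plistlib
import tempfile

def webloc_url(path):
    """Extract the stored URL from a Safari .webloc file.

    A .webloc is a property list (XML or binary) whose payload is a
    single "URL" key; plistlib reads both encodings transparently.
    """
    with open(path, "rb") as f:
        return plistlib.load(f)["URL"]

# Demonstration with a synthetic .webloc (real ones come from Safari):
sample = os.path.join(tempfile.mkdtemp(), "bookmark.webloc")
with open(sample, "wb") as f:
    plistlib.dump({"URL": "https://example.com/"}, f)

print(webloc_url(sample))  # → https://example.com/
```

From there, handing the URL to the browser is a one-liner - something like subprocess.call(["firefox", url]).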
Setting up a connection to a Cisco VPN was only slightly more fussy than it needed to be. The software center sometimes doesn't provide clear guidance for less common applications.
All in all, I'd have to say that - for my needs - setting up an Ubuntu Linux system was actually easier than setting up my MacBook Pro. Apple, in particular, doesn't really support hardware more than a couple of years old; on the Mac, I'd had to seek out unsupported software packages from disparate sources for my printer, tablet and scanner.
As a software developer who works on Linux first, I've discovered that Darwin - while Linux-like - is not a good substitute for the real thing. There are the usual differences that arise from their distinct Unix lineages. But the trouble extends beyond that. At its core, Apple software - even the parts derived from Open Source software - is intended to be a closed ecosystem. An Apple customer is encouraged - if not expected - to obtain all of their software and media from Apple and its developers and partners.
Have you ever looked into installing a Linux package on Darwin? It's not pretty. If you want to use one of the package managers supported on Darwin, you have to install a lot of unsupported cruft and sometimes even mess with permissions and ownership in portions of the file system that should be reserved to the system. Failing that, you can find a source package and hope that it builds on Darwin. Some do; some don't. The whole point of having a package system is to avoid chasing and resolving all of the dependencies that crop up while building software from source.
And then there's the performance issue... While burning the midnight oil for a work project, building a compute-intensive piece of software, I discovered that my code ran twenty percent faster on Ubuntu Linux inside VirtualBox on my MacBook Pro than it did running natively under Darwin. Same code, same compiler, same options. A limited amount of research on that phenomenon led me to complaints about poor performance of Darwin's underlying Mach micro-kernel architecture. Whether that's the whole story, I don't know. But it's another reason to - as a software developer - seriously consider Linux.
Sometimes change is good. In this case - switching from a recent MacBook Pro to a fire-breathing Ubuntu Linux desktop machine, I ended up with a much more capable machine for relatively little money. I now have a machine that does everything I need - better than my old MacBook Pro - and is, so far, as free from drama as I have come to expect from the fine products that Apple has produced over the years.
That said, Ubuntu probably isn't for everyone. Despite Linux's ubiquity in embedded systems and its ascendancy in the data center, it remains a fringe player on the desktop. If you're tech-savvy - even if you're primarily a Mac user - you can almost certainly overcome the few obstacles that a recent release of Ubuntu might place in your path. If you've ever set up your own Windows machine from scratch, Ubuntu will be a picnic in comparison. (BTW, the UI gestures on Ubuntu are closer to Windows than to Mac OS X...) But if you're the kind of person who needs extensive hand holding when it comes to using your computer, whether it's the perky person at the Apple Genius Bar or your cousin who works in IT, then you'd be advised to ascertain your support network before attempting a switch to Ubuntu. Ask around; you may be pleasantly surprised to find someone nearby who can help you.
If you've made it this far: thank you for putting up with my rambling account. I know that some folks would like for me to skip all of the exposition and cut to the chase. There are plenty of people on the `net who will do that for you; I'm not one of them. I believe that it's important to question what one thinks one knows. Unfortunately for the TL;DR crowd: that process requires a considerable amount of introspection. That said, my introspection is just that: introspection. I can tell you what I'm thinking and what you may want to consider for yourself, but the journey and resulting choice is ultimately yours.
At the very least, you owe it to yourself to get out there and do some research. I'm sure you know that things change quickly in the tech world. What may have escaped your attention, though, is that the players' motivations change along the way.
Apple, for example, introduced new paradigms of computing to the mass market, became the darling of both creative and fashion-conscious consumers, and fostered a legion of followers who firmly believe that the company can do no intentional wrong. Apple broke down much of "old media's" resistance to electronic content delivery, changing the way that we listen to music, watch TV and movies, and even read books and newspapers. But Apple is undergoing a subtle transition from being a computer company to being a media conglomerate.
It's true - Apple's product placement is still all about powerful, sleek, cool computers and lifestyle applications. But the real product - driven by the corporate machinery behind the facade of cool - is the delivery of potential consumers to content providers, with Apple taking a cut of every transaction. The iAd API isn't there for you, the consumer; it's there for the benefit of Apple's content-provider partners. Yah, I know... it's not a life changing revelation. Still, it's something that a savvy consumer should keep on their radar for the years ahead.
And with that, this chapter (novella?) is done. But if you care to read on, I'd like to leave you with a well-deserved plug for Old Town Computers.
I've already described how happy I was with the expertise provided by OTC and with their delivery of a system built from components that have - in their own experience - been found to be both robust and cost-effective. What I'd like to share with you, in closing, is a brief anecdote about how they dealt with the one glitch in my system build.
As I noted, I had a difficult time figuring out what components to combine to create a working system. That's why I contracted OTC for my system build. Based upon my assertion that graphics performance was a tertiary concern beyond the basic ability to display the Ubuntu UI and watch the occasional movie, Alex recommended the cost-effective expedient of relying on motherboard graphics. What neither of us realized until the system build was about to start is that the particular CPU chosen for the build was unique within its product family by not having an onboard GPU to drive the motherboard graphics.
Surprising, yes. Unexpected, yes. But I certainly can't fault Alex for having missed it. It took me a week of reading the invoice and poring over the product data sheets before I became aware that there might be a problem. The fact that Alex missed it over the space of an hour and a half, while also dealing with other customers, is completely forgivable.
When I became aware of the potential problem I emailed OTC, authorizing them to add a graphics card at my expense if necessary to make the system work. I knew it wouldn't be a lot of additional money, and had room in my computer budget to cover the anticipated expense. I heard nothing more until I picked up the system and was informed that the build did indeed require the addition of a graphics card.
What happened next floored me. I pulled out my debit card and asked "What did that add to the cost?" I really wasn't expecting the answer I got: "It was our mistake and we're covering it."
Over the years I've lost track of how many businesses have overpromised and underdelivered, leaving me with no reasonable recourse but to pony up hidden charges or cut my losses and take my business elsewhere. Kudos to OTC for doing the morally responsible thing, in contravention of what has seemingly become standard operating procedure for American businesses everywhere!
If you have tech needs - whether it's a new system build, an old system upgrade or repair, data recovery, or any of the other services or products they sell - I hope you'll give OTC your consideration. They've treated me well and will get my future business.