On my rounds of the interwebs this morning, I came across an excellent rant by Ken Rockwell.
While there are some specific points that I don’t agree with, I think on the whole he is absolutely right. New stuff really does suck.
An example (not that a single example proves anything, but hey—this is a rant, not a proof): I bought my first CD player in 1987, and I used it for about twelve years, until I replaced it with my first DVD player. That DVD player started having tracking problems after about five years, meaning a replacement was needed. I felt cheated. What I didn’t realize at the time was that in the years since 1987, generally accepted standards of quality in electronic components had fallen substantially. In the mid 2000’s, a DVD player lasting five years was considered exceptional.
My second DVD player purchase was fairly painless, due to the low price (only about 15% of what I paid for the first one), but the fucking thing only lasted about a year before it started having the same problems as the first one, and then one day it just died completely. What’s more, based on reviews I read of that brand, a one-year life span was actually better than average. Typical reviews went something like this: “Um yeah, mine lasted about a week past the 90-day warranty, but hey, it’s only a $50 player so it was no big deal. I went out and bought another one just like it.” Translation: He bought a cheap piece of crap DVD player which failed after three months, then rewarded the company that perpetrated this travesty by going out and buying another one exactly like it. There were a lot of reviews like that. It was very disheartening.
Today, I am on my third DVD player, and, while it is lasting a much more respectable amount of time than that second one, it’s not as good as the first one was in its heyday. And the remote control sucks (more on that below).
Back to Rockwell: He also rants about cars, and I find I’m in complete agreement. I drive an older car, and to be perfectly honest, I would rather not replace it. I’m probably going to have to, though, because it’s so old now that keeping it running is becoming an issue. Important components are difficult to find when the originals fail, and the rust on the body has gotten to the point where trying to abate it would be pointless. However, I’m not looking forward to replacing it with a newer car because, frankly, all the newer models suck by comparison. What’s more, this old, 1980’s model (which doesn’t even have fuel injection!) gets gas mileage comparable to a lot of modern day hybrids, meaning whatever I upgrade to, I’m also going to have to budget about a 50% increase in my gasoline costs. Ridiculous! What I’d really love is if I could just buy a car identical to the one I have, except brand spanking new. Anyone got a time machine I can borrow?
Rockwell also bitches about how cars and appliances don’t have enough knobs on them and how the few buttons they do have are grossly overused (a sentiment with which I completely agree), but then he raves about how his Apple remote control and iPod only have four or five buttons that do everything. Um, ok. :)
On the other hand, his comments about poorly designed TV remotes are right on the mark. The people who design these things seem to assume that I am actually going to be looking at them during operation, so they fill them up with exactly even rows and columns of exactly identical buttons. This is stupid. When I’m watching something on the TV, there are two facts which need to be considered: 1) I am not interested in taking my eyes away from the screen. 2) More importantly, even if I was, it’s usually too dark to read tiny little letters on the damn remote control anyway! Once past the initial learning curve, a properly designed remote operates entirely by feel. This means different buttons have to be different shapes and sizes. I’ve had remotes like this (on an old Toshiba VCR, for instance), and they were a joy to use.
The other critical feature of a good remote is not having to point it exactly at the sensor! A remote should work perfectly well even if you are pointing it as much as 45 degrees in the wrong direction. I’ve used remotes that were even better than that: They worked fine when pointed at the ceiling or exactly in the wrong direction (i.e. the infrared signal would bounce off the opposite wall or ceiling and still be strong enough to control the device). By contrast, my current DVD remote is one of the most finicky remotes I’ve ever used, requiring me to point it exactly at the front of the DVD player, with no obstructions whatsoever, and even then I often have to press a button twice to get it to work. What’s more, some of the most useful buttons on it are small, tactilely identical rectangles, making it hard to operate by touch alone. (This remote, however, is one of only two flaws on an otherwise excellent player. It’s a Panasonic, in case you’re interested.) Why don’t I just buy a universal remote? Well, that’s a whole other subject. ;)
Rockwell’s essay also covers the loudness war, in which the audio quality of compact discs has deteriorated over the years. Rather than actually utilizing some of the huge dynamic range possible on compact discs (which was a key selling point of the format back in the 1980’s), mastering engineers compress the dynamic range into a narrow band and amp it up until the top of the waveform clips. This is so the recording sounds better when played on cheap-shit five-dollar headphones, junky computer speakers, narrow bandwidth radio stations, and elevator PA systems. For these limited purposes, the technique works fairly well. It’s a disaster, however, for getting an actual hi-fi experience out of the disc on even a mediocre quality full-fledged audio system.
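To make the “amp it up until it clips” idea concrete, here’s a minimal sketch in Python. It’s not a model of real mastering (which uses multiband compressors and limiters, not naive hard clipping), just a toy demonstration: a quiet sine wave is over-amplified and hard-clipped at full scale, and the peak-to-RMS ratio (“crest factor,” a rough stand-in for dynamic range) shrinks toward that of a square wave.

```python
import math

def crest_factor(samples):
    """Peak-to-RMS ratio; a rough proxy for dynamic range."""
    peak = max(abs(s) for s in samples)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return peak / rms

# One second of a quiet 440 Hz sine wave, peaking at 0.25 of full scale.
quiet = [0.25 * math.sin(2 * math.pi * 440 * t / 44100) for t in range(44100)]

# "Loudness war" mastering, crudely: crank the gain, then hard-clip at full scale.
loud = [max(-1.0, min(1.0, 8.0 * s)) for s in quiet]

print(crest_factor(quiet))  # ~1.414, the crest factor of an unclipped sine
print(crest_factor(loud))   # lower -- the clipped wave is flatter and "louder"
```

The clipped version sounds louder at the same peak level, which is the whole point of the technique, and exactly why it destroys the dynamics.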
So I don’t buy CDs anymore, unless I’m confident I’m getting a version that wasn’t mastered recently. I sold a bunch of discs back in 1993 because I needed some cash. Nowadays, I seriously regret doing that. Many of them, I assume, will have been remastered since I originally bought them, and it’s a safe bet that the newer version will be inferior to the old. What I need is a place to buy old, used compact discs. Or a time machine.
There’s more. Rockwell doesn’t like Snow Leopard, for instance, although his complaint about it only pertains to aesthetics. My own beef with Snow Leopard is that Apple decided to further obfuscate the definition of “gigabyte” by siding with hard drive manufacturers who, for years, have been defrauding the public with devices that don’t have as much capacity as advertised. They weasel out of committing actual, legal fraud with language in the fine print stating that, according to their definition, a gigabyte is one billion (1,000,000,000) bytes. This is a lie, and they only get away with it because there is no legal definition of “gigabyte.” In practice what it means is that you hook up your “one terabyte” hard drive to a Leopard or older system, and it shows that you only have 931 usable gigabytes. Windows users face a similar problem. Snow Leopard has “fixed” this by simply repeating the lie and reporting that the drive is a full terabyte. Meanwhile, RAM continues to be defined correctly, in powers of 2, like it always has, meaning an Apple system now has two different kinds of gigabytes: the kind on the hard drive, which is 1 billion bytes (with a terabyte being 1 trillion), and the RAM kind, which is 1,073,741,824 bytes (that’s 2 to the 30th power). Furthermore, a gigabyte on an Apple Snow Leopard system will be different from a gigabyte on Windows, which should be fun for those of us who actually need to interact with the Windows world in a more-than-trivial way.
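The 931-gigabyte figure falls straight out of the arithmetic. Here’s the same “one terabyte” drive counted both ways:

```python
# The same "one terabyte" drive, reported two ways. The drive maker counts
# decimal bytes; Leopard and Windows count in powers of two.
drive_bytes = 1_000_000_000_000          # what the box calls "1 TB"

binary_gb = drive_bytes / 2**30          # gigabytes as Leopard/Windows count them
decimal_gb = drive_bytes / 10**9         # gigabytes as Snow Leopard counts them

print(f"{binary_gb:.0f} GB")   # 931 GB  -- the "missing" capacity
print(f"{decimal_gb:.0f} GB")  # 1000 GB -- the full advertised terabyte
```

Nothing is actually missing, of course; the two numbers describe the exact same quantity of bytes using two incompatible definitions of “gigabyte,” which is precisely the problem.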
What annoys me more than anything about this problem, though, is that the average computer geek seems to believe Apple’s gigabyte trick is actually a good idea, because the “definition of giga is one billion, according to the SI standard.” This is the same type of pedantic, know-it-all mentality which came up with the “proof” that the 21st century started in the year 2001 because “the definition of a century is 100 years, and there was no year zero.” These people fail to understand that having two different, incompatible measuring systems is a much greater problem than having a few noobs complaining about their terabyte hard drives being smaller than expected, and a vastly greater problem than the utter triviality of the SI unit prefixes being used “incorrectly.”
The real solution would have been congressional regulation: legally defining each unit as 1024 of the next smaller one, i.e. 1 terabyte = 1024 gigabytes, 1 gigabyte = 1024 megabytes, 1 megabyte = 1024 kilobytes, and 1 kilobyte = 1024 bytes. However, not only is that politically impossible in a nation whose main legislative body has stubbornly avoided its constitutional duty to “fix the standard of weights and measures” as it pertains to computer systems, but it is already too late. For such regulation to have been truly effective, it would have needed to happen back in the 1980’s, or the early 1990’s at the latest. Now that Apple has further muddied the waters, and geek-laden organizations like the IEC, IEEE and ISO have thrown their weight behind using two completely separate systems, meaningful regulation is never going to happen.
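It’s worth noting just how much the two systems diverge, because the gap compounds with each prefix. A quick sketch of the 1024-based units above against the vendors’ power-of-ten units:

```python
# Each binary prefix is 1024x the last; each decimal prefix is 1000x.
# The discrepancy therefore compounds: 2.4% at kilo grows to ~10% at tera.
prefixes = ["kilo", "mega", "giga", "tera"]
for i, name in enumerate(prefixes, start=1):
    binary = 1024 ** i
    decimal = 1000 ** i
    gap = (binary / decimal - 1) * 100
    print(f"1 {name}byte = {binary:>19,} bytes ({gap:.1f}% more than {decimal:,})")
```

So the bigger drives get, the bigger the apparent shortfall: a few percent was easy to shrug off in the kilobyte era, but at terabyte scale the two definitions are a full ten percent apart.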
One thing that would really help would be if Microsoft would make the same change in their next version of Windows. That, at least, would solve the difficulty of using two different measuring systems depending on which OS platform you use, and it would keep the “kilo means 1000 not 1024” pedants happy. It would not resolve the inconsistency between hard drives and RAM, nor the fundamental inconvenience of having two very similar but not-quite-the-same systems in use simultaneously, but it would at least be an improvement over the current mess.
Anyway, this rant has been off the rails for some time now, so I’d better stop. :)