People have been complaining about computer glitches since the punch card days. Honestly, it's 2010. Why don't we have simple stuff like drivers and software compatibility fixed yet? Can we blame it all on Bill Gates?
Well, the bad news is that there are really good reasons your PC doesn't work quite right, even now. The worse news is that it's probably never going to get better.
You can blame ...
Have you ever wondered why the newest computers still start up with a screenful of text and a "Speak-and-Spell"-esque beep? While pretty much every component has been replaced with new standards at least once since then, the core of modern computer design is still based on the design of the 1981 IBM PC.
An elegant computer, for a more civilized age.
The reason the IBM PC design was so popular was the same reason the actual IBM PC wasn't -- IBM took a page from Eli Whitney's book and allowed each part of the computer to be interchanged with a part made by a competitor, meaning that you could (and today, almost definitely do) have an entire PC made from non-IBM parts.
So, say that one day the Taiwanese company that made your sound card finally goes bankrupt or is captured by China. If you want to use it with the next version of Windows, you're shit out of luck, as anybody with a sound card more than a year old was when Creative refused to write less-crappy drivers for Windows Vista until Microsoft did it themselves.
Vista was to Windows what The Phantom Menace was to Star Wars.
If you're using Windows 7, you probably see the "blue screen of death" less often than you used to -- it's a lot better about not letting badly written programs crash your system. But when you do see it, you can bet it's due to drivers written by the aforementioned hardware designers, who may not speak English, which, unfortunately for them (and by extension, us), is the language all the books on how to program drivers are written in.
Every version of Windows attracts complaints about old programs not working with the new version. People assume this is because Microsoft is stupid and lazy and didn't bother to accommodate the perfectly reasonable design of older programs.
Steve Ballmer, CEO of Microsoft.
Microsoft deserves a break on this one. The insane work Microsoft does to keep backward compatibility is like one of those games where shit just falls from the sky endlessly, and as you keep catching it, it just falls faster and faster until you miss one, and then the game calls you a loser.
It all seems so very pointless.
There's a lot of responsibility involved in programming, such as making sure the program lets go of resources it's done using, and not making assumptions that, if they turn out to be wrong, will make the user's computer shit its pants. Some programmers take this as a dare.
For instance, the original SimCity was really sloppy about how it used memory (i.e., it told DOS it was finished using a bunch of memory and then immediately started using it anyway). This was fine as long as the system wasn't using much of its memory anyway (and under DOS, it wasn't). But then Windows came along and started needing the memory that SimCity was tying up. So the game crashed.
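The bug class SimCity hit is called "use-after-free," and you can sketch it with a toy allocator (purely hypothetical Python, not anything from DOS or the actual game): freeing memory and then reading it works fine right up until somebody else gets handed the same block.

```python
# Toy allocator (hypothetical) illustrating use-after-free, the SimCity bug:
# a program releases a block, keeps its handle, and everything seems fine
# until another program is handed the same block.
class ToyAllocator:
    def __init__(self):
        self.heap = {}        # block id -> contents
        self.free_list = []   # blocks available for reuse

    def alloc(self, value):
        # Reuse a freed block if one exists; otherwise mint a new one.
        block = self.free_list.pop() if self.free_list else len(self.heap)
        self.heap[block] = value
        return block

    def free(self, block):
        self.free_list.append(block)  # may be handed out again at any time

    def read(self, block):
        return self.heap[block]

mem = ToyAllocator()
city = mem.alloc("SimCity's data")
mem.free(city)                      # "DOS, I'm done with this memory"...
print(mem.read(city))               # ...still reads fine, since nobody reused it

other = mem.alloc("Windows' data")  # Windows reuses the freed block
print(mem.read(city))               # SimCity now reads Windows' data -> crash
```

DOS never reclaimed the block, which is exactly why the sloppiness went unpunished until Windows started reusing that memory.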
Games didn't tend to outsell Hollywood blockbusters back then.
Microsoft fixed this by writing code so Windows would detect if SimCity were running and do special tricks to make it work. Then Microsoft did that for the next piece of software that fucked up. Then the next. Here is the list of "we found this program doing stupid shit and have to work around it" applications Windows currently has to look for. There are 6,520 of them.
If you got a funny image in your head of a Microsoft employee running down some store's software aisle with his arms out, knocking every stupid product his arms hit into his shopping cart so he could look at them ahead of time rather than wait for complaints to roll in, good, because that's exactly what he did. If you're Microsoft, you can't just let somebody else's software crash the user's PC (the user will blame you), and you can't just wait for the software manufacturer to fix it (hell, it may not even be in business anymore).
Microsoft's only other choice is to ban all software it hasn't given explicit permission for ahead of time -- but that's what has people furious at Apple. This is why you hear people talking about "jailbreaking" their iPhones -- they're trying to circumvent exactly that kind of system.
Get busy living, or get busy downloading third party apps.
It's a no-win situation.
Not that we're letting Microsoft off the hook here. They screw up, too, and what's worse is that they get locked into their screwups.
While we hate to go back to the same well twice ...
If something in Windows is broken, people writing programs for Windows have to work around the broken thing, like a tailor making a pair of pants for a man with only one leg. But that program, like the pants, now only works with the broken system. If Microsoft then updates Windows to fix the previous screwup, that software no longer works, same as if our one-legged man got a donor second leg and then tried to wear his old pants. They either have to leave Windows broken, or break the software.
And then you have to account for the human factor -- people who just got used to the glitch. For example, we have this fucker:
The most hated pixelated character since the Duck Hunt dog.
The little irritating animated paperclip who'd show up every time you touched your mouse in Microsoft Office. Or maybe you used the dog:
Make no mistake: Microsoft knew it had a failure on its hands right out of the gate on this one. When it released Office XP, which came with Clippy disabled, it created an "eX-Paperclip" series of (tragically no longer online) Flash cartoons to promote it with Clippy's voice provided by none other than god of annoyance Gilbert Gottfried.
Nevertheless, it took three versions of Office to get rid of this little bastard. Why? Because every feature, no matter how terrible, will have users relying on it. When one company rolled out the 2003 version of Office, the corporate help desk almost instantly received a call from a distraught user over the loss of the dog on her computer.
Likewise, thousands of you reading this are doing it with Internet Explorer 6. As recently as a few months ago, 10 percent or so of Internet users were still on it, purely because it's the browser that originally came with Windows XP.
Some of you may have noticed that the Web has changed a bit since 2001 (people were just barely starting to use Google back then). And when the Web changes, the technology used to make the Web has to change with it. But because so many users are still running a browser that's a decade behind, Web developers can't advance too far forward.
We're betting it took folks a long time to advance from "sharpened rock" to "sharpened rock on stick."
Selling your PC or software in more and more territories means dealing with more and more governments and somehow trying to anticipate what they're going to be sensitive to. For instance, India nearly banned Microsoft from selling Windows 95 there because of 12 pixels on a map.
To help users figure out which part of the world their computer lived in, Microsoft included a 180-pixel-high map of the world that would highlight the region you had selected. The thing is, there's a small stretch of land you may have heard of -- if you've ever looked at a news source -- that India and Pakistan are constantly warring over, and each country claims it as part of its own time zone. When Microsoft made the images for the map, it included that area in Pakistan's time zone, at which point India threatened to ban Microsoft from the country, a move that would have really screwed Microsoft 10 years down the line when it outsourced all its tech support.
Outsourcing. Because Americans are lazy.
This sort of scuffle is the reason you'll find the ambiguous phrase "Country/Region" all over everything in Windows today. Using the term "country" would mean explicitly recognizing Taiwan as a separate country from China, which the famously permissive and open-minded Chinese government expressly forbids by law.
Oh China, you and your wacky high jinks.
Also, you may recall a certain antitrust scuffle Microsoft got into around the turn of the millennium. The results of that ruling have been a bit of a mixed bag: Microsoft can't include its free anti-virus software with Windows, because that would stop Norton and McAfee from making users pay $100 a year for their own anti-virus programs that declare critical system files to be viruses and stop the entire computer from running in response.
Kind of like how chugging cyanide probably prevents gout.
And if that weren't bad enough...
One of the things proponents of free open-source software like to push is the idea that any regular Tom, Dick, or Harry with an idea can add his knowledge and experience to the programs he uses, send it upstream to the creators and have the fruits of his efforts enjoyed by everyone. But in the real world (America), trying to do this will get your ass sued, because anybody could claim that you're stealing his brilliant ideas, such as clicking things without having to click on them again.
The answer to many of life's problems seems to be "because lawyers love money."
In 1991, the patent office lost its taxpayer funding, at which point its patent criteria changed from "Any new and useful art, machine, manufacture or composition of matter" to "Any idea whatsoever for which applicable patent fees are paid" (as Cracked has covered before).
For example, Sega holds a patent on the concept of "using a floating arrow to direct the player," meaning that any game that wants to point at a door, such as Bioshock, has to pony up money to Sega.
Yet, somehow, they manage to avoid paying royalties to the Rand Estate.
You don't even have to have invented the thing you're suing people for. Companies known as "patent trolls" make a business out of buying the assets of bankrupt software companies -- specifically, the rights to patents on things similar to something a bigger company is already doing -- then forcing the bigger company to either pay up or remove whatever it is the troll claims to have a patent on.
For example, in 2003 Microsoft changed Internet Explorer so that you had to click inside any part of the page using a plug-in (such as Flash) before you could use it. This wasn't because Microsoft wanted to -- it had to do this to sidestep a patent on "not clicking on things before you can use them" that a patent troll called Eolas claimed was being infringed.
Which ironically sounds more like a patent elf.
In 2007, Microsoft restored IE to the way it had worked before -- after paying Eolas several million dollars to "license their technology."
A whole lot of the innovation you're using now -- including the basics of your operating system, which were developed not by Microsoft or Apple but by Xerox -- happened before the era of software patents. Otherwise, those innovations might never have happened at all. Who can afford to pay for every little facet of a system that happens to be similar to what someone else invented?
Hey, remember the "Y2K Bug"? The programmers of the world's software had been using only the last two digits of the year to represent it, meaning that come 1/1/00, all the computers of the world that hadn't been upgraded would instantly go into panic mode. Society was doomed. People everywhere were partying like everybody had a bomb.
The earliest predictors would attempt to live off the grid by replacing their names with indecipherable symbols.
As it turned out, the computers were much less panicky than the people hiding from them in concrete bunkers were. But bugs like Y2K will never go away -- there's another similar bug we'll have to fix by 2038. Countless little ones will happen between now and then; for instance, on March 1, 2010, a bug in leap day calculation caused all but the newest PlayStation 3 systems to lock users out of playing their own games.
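Both bugs come down to the same mistake: cramming time into a field that's too small to hold it. A minimal Python sketch (hypothetical, not any real system's code):

```python
from datetime import datetime, timedelta, timezone

# Y2K: keep only the last two digits of the year, and 2000 suddenly
# sorts *before* 1999, so "is this record newer?" checks go haywire.
def two_digit_year(year):
    return year % 100

print(two_digit_year(1999))  # 99
print(two_digit_year(2000))  # 0 -- "earlier" than 99, as far as sorting goes

# 2038: Unix systems classically count seconds since Jan 1, 1970 in a
# signed 32-bit integer, which tops out at 2**31 - 1 seconds.
INT32_MAX = 2**31 - 1
epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
print(epoch + timedelta(seconds=INT32_MAX))  # 2038-01-19 03:14:07+00:00
```

One second after that moment in January 2038, a 32-bit counter wraps around to a negative number -- which the computer reads as a date in 1901.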
This is why you should never work hard on anything ever.
The problem is that timekeeping itself isn't tidy. The Earth's trip around the sun doesn't take a whole number of days, and its rotation isn't even constant, which is why we have such weirdness as leap days (which were the cause of the aforementioned Y2K+10 clusterfuck) and leap seconds (when an international group of scientists decides to add or remove a second from an upcoming minute).
And then we have time zones. If you thought time zones were divided up into 24 equal sections with neat vertical lines, you're wrong. Check the time zone settings on your computer clock -- there are 91 time zones on there, because different governments insist on having their own. In fact, several countries just changed their daylight saving time rules. And yes, your computer needs to know the time and date. Just ask the F-22 fighter pilots who saw their multimillion-dollar flight control systems lock up in midair when the plane crossed the international date line and got confused.
That's all the plane you get for $150 million.
Here's what Minesweeper looks like for most of the world:
Flowers. If you're using Windows Vista (or 7) in America, you'll still see mines when you first start Minesweeper. The flowers show up by default only in other countries, because games about not stepping on land mines aren't so fun when you're doing it with one arm, because you lost the other one to a land mine.
This is the kind of thing you have to think about now, as personal computers have gone from a geek tool to an everyday appliance used on every land mass on earth. But you don't want to have to write and sell 800 different versions of your software. You need it to be adaptable to people all over the world, some of whom have disabilities. Doing that successfully is next to impossible.
Land mines: the bane of the software industry. And also people.
For instance, that hypothetical guy with one arm can still use his computer, because there are devices to help him (there's even a gadget that lets you control the mouse cursor using only your eyes), but you have to write your software knowing your user might be using it that way. Now think about how many different scenarios and disabilities you have to account for. Ask the guys who made BioShock 2 -- they found out only after release that the hacking minigame is unplayable if you're colorblind. This isn't some rare condition -- that's one in 10 people. And it never occurred to them.
And then you have the nightmare that is human language.
Anyone else ever notice that the Tower of Babel looks sort of like Minas Tirith?
The text in early computers used American Standard Code for Information Interchange (ASCII), which defined 128 characters it could display, including the 26 letters of the English alphabet.
But when the rest of the world started using computers, they needed their umlauts and their Russian backward R's and their yen signs. To fix this, the industry created Unicode, which increased the range of characters to more than a million, allowing languages from Chinese to Farsi, and symbols like a hammer and sickle, and an exclamation point dotted with a heart. Fixed, right?
Not quite. To support actually drawing all these things on your screen, you have to build your system to allow for all sorts of other variations -- languages that write text from right to left, or with letters that can have rows and rows of extensions above them. Each new attempt to accommodate everyone adds more complication and more opportunity for something to go wrong.
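You can see the jump from ASCII to Unicode straight from Python's encoders (an illustrative sketch; the specific characters are just examples):

```python
# ASCII covers code points 0-127, so plain English fits in one byte each.
print(ord("A"))                  # 65 -- fits in ASCII's 7 bits
print("A".encode("utf-8"))       # b'A' -- 1 byte

# Anything outside that range needs multiple bytes in UTF-8...
print("é".encode("utf-8"))       # b'\xc3\xa9' -- 2 bytes (umlauts and friends)
print("中".encode("utf-8"))      # b'\xe4\xb8\xad' -- 3 bytes (Chinese)

# ...and simply cannot be represented in ASCII at all.
try:
    "é".encode("ascii")
except UnicodeEncodeError:
    print("no room in 128 characters")
```

And encoding the characters is the easy part -- actually drawing them, in the right direction, with the right marks stacked on top, is where the real complication lives.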
In conclusion, diversity makes us weak.
We're not saying this will never get fixed. We're just saying that it isn't going to get fixed until all the people of the world start speaking the same language and sharing the same values, and all disabilities are cured.
At which point we will buy the world a Coke.
Stuart P. Bentley is a programmer with binoculars peeping through Microsoft's bedroom window in Redmond, Washington. He has a website at TestTrack4.com