I broke my phone the other day. It had a problem reading the battery voltage and would spontaneously shut down with low-battery warnings, even though the battery was nearly fully charged. The second time it did that in an hour, it met the floor with a bit more force than it should have. Gorilla Glass is tough, but it makes the most interesting patterns when shattered.
Since it was almost two years old, I decided to upgrade to the new Samsung Galaxy S3, which has many improvements over my older Epic 4G (Galaxy S) and runs the newest Android operating system. Upgrading when a device is broken is a perfectly understandable, even expected, thing to do. But these days, it seems we are expected to upgrade everything even when we don’t need to, or even want to.
The morning after I brought home my new smartphone, my wife’s Galaxy S2 upgraded itself to Android 4.0.4 (AKA Ice Cream Sandwich, or ICS). That is not a bad thing in itself, but it necessitated upgrading several of her apps as well. Later that same day, I was working on my old Power Mac G3 at work and got several errors when I tried to watch videos on some tech and news sites. It seems Adobe has upgraded Flash, and the Flash player on my G3 is no longer compatible with the codecs being used these days; Adobe is not making the updates compatible with Motorola-based G3 processors like the one in my Mac.

Adobe is also on the hook for another serious annoyance that pesters me virtually every single day: Adobe Acrobat Reader updates. I get nagged to upgrade that software almost every time I sit down at the computer. Java is almost as bad. When I turn on my phone, I get anywhere from 4 to 40 alerts from Google Play (formerly the Android Market) to update the apps on my phone, and periodically my computer prompts me to install new updates to Windows. It seems software designers have never heard the old adage “If it ain’t broke, don’t fix it.”
(As I typed that line, Android popped up with “updates are available” again, and I just updated last night.)
Once upon a time, software developers wrote programs (apps, in today’s vernacular) and spent a lot of time and money debugging and tweaking them to make them as good as they could be. When a program was released, it came packaged in a box with a user’s manual and the disks or CDs in jewel cases; a user took it home, installed it, and that was that. If a program was updated, it was a significant change in the user experience, with new features that justified the change, and it would more than likely cost money. There were rarely “bug fixes” because most bugs were discovered before publication. Oh, sure, a few got through, but those were dealt with. With the advent of the internet, software companies found they could handle the few bug fixes with a “patch” that could be downloaded, thus reducing expense. This led to more software being released with more bugs, since deploying the patches was easier and cheaper.
Now software is released and then updated within days.
Upgrades are not necessarily a bad thing; many programs get significant new features or improvements in upgrades, and antivirus software must update regularly to handle the ever-changing threat of viruses and hackers. However, some updates actually worsen or even break programs by removing features or options with which users are familiar. Apple removed the ability to format filenames in an earlier version of iTunes, and Microsoft is altering Media Center in Windows 8 so that it will no longer be included in the release and must instead be downloaded as an add-on. That does not even touch on the number of apps that break completely and stop working when an update is downloaded and applied. Android wins the contest on this issue, since developers have to try to make their apps work on so many different hardware platforms.
Back in the old days, when the hardware makers built faster machines, software makers wrote programs that took advantage of the faster processors and larger memory. If users had an older machine and wanted to run the newer programs, they needed to buy a newer machine (or at least upgrade the old one if possible). This was known as the forced upgrade path, and Microsoft and Intel kept each other in business for years doing just that. App developers don’t seem to be using that logic. In fact, they don’t seem to be using any logic with these updates.
With the ever-increasing number of apps available on so many different platforms, one has to wonder why developers can’t leave well enough alone. If it works, leave it alone. Most updates make no appreciable difference in the user experience, add no new features, and are indistinguishable from their predecessors. If a developer makes a significant improvement to an app, fine, push the update; otherwise, stop pestering me to update.