|20 Feb 2009 12:58||
Pay and upgrade?
© 2009 James Kanjo
If you're always on the lookout for the latest news in the Apple community, then it will be no big surprise to you that thousands of people are anticipating Apple's next operating system (OS): OS X 10.6 Snow Leopard. The Microsoft community is doing the same thing with Microsoft's next OS: Windows 7.
But even when it comes out, we will still be challenged by the question that's been on our minds: Do we upgrade?
Traditionally, the wisdom in the computer community is not to rush out and buy the latest OS as soon as it's released. Why? It's called bugs galore. All these glitches start to appear, and you become frustrated with what's supposed to be "the most advanced OS on the commercial market". Don't be fooled: this problem is faced by all OS developers (Apple, Microsoft and [insert Linux distribution here]). I say this because you get foolish fans of particular OSs claiming that their favourite OS is perfect.
These OSs pretty much have everything their users need. Their features are pretty much standard across competing OSs. The only thing developers have left to do is work on the security of the OS. Or so it seems…
OS developers are going through what I will describe as the "OS revolution". They all have their OS the way they want it, but they can still improve it by optimising it. Optimising essentially means modifying the programming code to be more efficient: performing the same tasks while using fewer resources, such as memory and processing power.
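To make that idea concrete, here's a minimal sketch (in Python, with function names invented for this example, nothing to do with any actual OS code) of "the same task using fewer resources": both functions compute the sum of the first n whole numbers, but the second replaces a loop of n additions with a single closed-form calculation.

```python
# Two ways to compute 1 + 2 + ... + n.
# Illustrative only; the function names are hypothetical.

def sum_naive(n):
    """The 'obvious' approach: loop n times, adding as we go."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_optimised(n):
    """The same result via Gauss's closed-form formula:
    constant work, no matter how large n gets."""
    return n * (n + 1) // 2

print(sum_naive(1000))      # 500500
print(sum_optimised(1000))  # 500500
```

Both produce identical output; the optimised version simply does far less work to get there.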
If they can do that, why didn't they do it in the first place? The answer is simple: it was never a priority at the time of development. Let's assume for a moment that the developers had no finite limit on resources. With unlimited memory and the fastest processor imaginable, there would be little point in writing optimised code, because the gain would be negligible (perhaps a microsecond's difference).
Now, snap back to reality, where our resources are finite. Memory is not a problem for us; we have oodles and oodles of it. But our processors are no longer getting dramatically faster with each year. Clock speeds have pretty much plateaued around 3GHz. The current speed of Windows Vista HP on my 2.0GHz laptop is what I would describe as "acceptable". It's not mind-dazzling. Suddenly, OS developers can no longer rely on the ever-increasing speeds of processors. So what are they left to do? Optimise their OSs. This is the best way to "speed up" a computer without using a faster processor.
You may ask the same question: if they can do that, why didn't they do it in the first place? This time, the answer lies in the process of optimising itself. How do you optimise programming code? It is incredibly hard to do. It requires immense concentration and a deep understanding of logic. Code is originally written in a way that makes perfect logical sense to humans. But this "human logic" is not the best logic; it is slower and takes up more memory. The procedures can be made smaller, faster and more efficient. Once this process has been taken to the maximum, the resulting code is almost totally incomprehensible to humans, albeit completely logical and beneficial to machines. This is why developers didn't do it in the first place: it is incredibly difficult to do.
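As a tiny illustration of that trade-off (again a Python sketch with made-up names, not code from any real OS), compare a "human logic" test for whether a number is a power of two with the well-known bit-twiddling version: both give the same answer, but only one reads like plain reasoning.

```python
def is_power_of_two_human(n):
    """Human logic: keep halving n; if we end on anything
    other than 1, it wasn't a power of two."""
    if n < 1:
        return False
    while n % 2 == 0:
        n //= 2
    return n == 1

def is_power_of_two_optimised(n):
    """Machine logic: a power of two has exactly one bit set,
    so n & (n - 1) clears that bit and leaves zero. No loop,
    one operation -- but far harder for a human to read."""
    return n > 0 and (n & (n - 1)) == 0

print(is_power_of_two_human(64), is_power_of_two_optimised(64))  # True True
print(is_power_of_two_human(96), is_power_of_two_optimised(96))  # False False
```

The optimised version is faster and shorter, yet without the comment most readers would struggle to say what it does, which is exactly the kind of code the "human logic first" approach avoids.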
To put this into perspective, imagine a Rubik's cube. Humans solve it logically, layer by layer. This takes a long time, and can easily exceed 100 moves/rotations. The fastest method would be "God's Algorithm", which (as far as has been proven so far) doesn't exceed 22 moves/rotations. The OS developers are essentially looking for "God's Algorithm" for their OS.
So why did I go through all of that? Because OSs are no longer gaining major new features or changes to the user interface, developers are left to make them more efficient. If the OS is currently bug-free, and they are gradually optimising the code line by line to improve it, then it is unlikely that new bugs will arise at all. How can they? So the new OS will not be the usual "glitchtopia" that OSs typically start out as. Perhaps it would be worth investing in such a release if it increases productivity from day one (with faster speeds, less power used, etc.).
Perhaps, unlike with its predecessors, it is not such an unwise choice to buy an OS when it comes out.
~ James Kanjo