Recently my daughter decided that she would have to buy a Mac computer. See When you are forced to buy a Mac. The next decision, after choosing between a desktop and a notebook (see Desktop or Notebook computer), was which model. For a Mac, because there is only a small range, this is a bit easier than for a PC, but the determining factor was how powerful the computer needed to be for the tasks she wanted to do.
Of course if you go to any place selling computers they will inevitably say that you will get your work done faster if you have a more powerful computer. But the more powerful computers cost more because they have faster processors and usually have what is called a higher specification. I'm not sure, but I suspect this is more a marketing strategy by computer and component manufacturers, because other than the initial research and development costs I cannot see how it costs any more to manufacture a faster processor than a slower one. Putting aside the reasons for price differentiation, how can you tell what level of power you need to buy?
There are many benchmark tests written about that supposedly rate the performance of a processor in a computer. But what they don't show is, for the specific task you are going to do, how long it will take you to do it.
The slowest part of doing anything on a computer is what you, the user, do. Typically that is keying in information, using the mouse and navigating through the application. On reflection, the slowest part is really thinking about what you want to do and how you are going to do it using the keyboard, mouse and application. Other than the human aspect, what is going to be so slow that you are willing to pay more money to have it go faster?
There are two different kinds of delay where the slowness of a computer is apparent: interactive delays and batch delays. Interactive delays are typically where you do something and do not get an almost instant response, like pressing a key and the letter not appearing on the screen, or clicking on a button and nothing happening. These can be very frustrating, especially if you are in a hurry or under pressure to have something completed. These types of delays are not very common on modern computers and can be a sign that something is wrong with your computer. But where they do exist, having a processor that is twice or even four times as fast is only going to reduce a 2 second delay to 1 second or half a second, and considering the human aspects described above it is going to have very little effect on speeding up the overall task at hand.
Batch delays are typically where a message is displayed telling you that something may take some time, or a progress bar shows how much of the process has been completed and how much remains. In some cases an estimate is given of how long the task will take to complete. All other things being equal, a faster processor will make a greater difference to long batch tasks. A computer with a processor twice as fast as another could reduce a 4 hour task to 2 hours. Provided that the 2 hour saving can actually be used, and the cost of the faster processor is less than the cost of the extra two hours, the faster processor could be worthwhile.
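As a rough sketch of that arithmetic (the 4 hour task is just an illustration, and the split between compute-bound and other time is an assumption), a faster processor only speeds up the part of a batch task that is actually waiting on the processor:

```python
def task_time(compute_hours, fixed_hours, speedup):
    """Total time when only the compute-bound part benefits from a faster CPU.
    fixed_hours covers disk, network and other delays the CPU cannot shrink."""
    return compute_hours / speedup + fixed_hours

# A purely compute-bound 4 hour task halves with a 2x faster processor.
print(task_time(4.0, 0.0, 2.0))  # 2.0 hours
# But if 1 of those hours is disk- or network-bound, the saving shrinks.
print(task_time(3.0, 1.0, 2.0))  # 2.5 hours
```

The second figure is the catch: the bigger the non-processor portion of a task, the less a faster processor buys you.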
Generally, after performing a particular task the batch delay times are known and can be factored into the overall tasks at hand. Often, if you know a particular task is going to take say 6 hours, you may set it running at night so the results are available the next morning. However, if the result of one batch task has to be used in subsequent batch tasks requiring further human interaction, then a faster processor may be an advantage.
Often delays, batch or interactive, have nothing to do with your computer's processor. A classic example is clicking on a link on a web site and having to wait quite some time for the page to load. It could be that the web server is overloaded with requests and so is slow to respond, or that you have a slow internet connection, or that there is congestion on an otherwise fast connection. In all these cases having a faster processor in your computer will make no difference, because the delays are not within your computer.
Because software developers tend to use high performance, fast processor computers, they sometimes develop applications in a very inefficient way, such that the application runs slower than it should. Because they are using fast computers they don't see this slowness. I have done this myself and seen it in a number of applications. In one case a task that took minutes, when re-written in a later release, took less than a second.
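A hypothetical example of the kind of inefficiency that hides on a fast development machine (this is a generic illustration, not the actual application mentioned above): repeatedly scanning a list where a set lookup would do. Both versions feel instant on small test data; only the first one crawls when real users feed it large inputs.

```python
# Each "x in b" scans the whole list, so the loop is O(n^2) overall.
# On a developer's fast machine with small test data this looks fine.
def common_items_slow(a, b):
    return [x for x in a if x in b]

# Building a set first makes each lookup constant time, O(n) overall.
def common_items_fast(a, b):
    b_set = set(b)
    return [x for x in a if x in b_set]

print(common_items_fast(range(5), [1, 3, 9]))  # [1, 3]
```

Same result, hugely different behaviour as the data grows, and nothing on a fast machine with small inputs warns the developer of the difference.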
I have also noticed that a lot of applications tend to have built-in waiting delays. Often in batch tasks used to install applications there can be long periods where nothing is happening. It appears to me that the developers put in delays of say 20 seconds so that they can be sure the previous function has completed before another function, relying on the first being complete, is started. Regardless of how fast your processor is, the 20 second delay will always be present. There are other techniques developers can use that avoid these delays, but they require more effort on behalf of the developers.
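One such technique is to poll for the actual completion condition instead of sleeping for a fixed period. A minimal sketch (the 20 second figure and the installer scenario are from the text above; the function names here are made up for illustration):

```python
import time

# The lazy approach: a fixed sleep always costs the full delay,
# no matter how fast the machine is.
def wait_fixed(delay_seconds):
    time.sleep(delay_seconds)

# The better approach: check whether the previous step is actually done,
# returning as soon as it is. A faster machine really does finish sooner.
def wait_until(is_done, poll_interval=0.01, timeout=5.0):
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if is_done():
            return True
        time.sleep(poll_interval)
    return False

# If the previous step has already completed, this returns almost instantly.
print(wait_until(lambda: True))  # True
```

The polling version takes a few more lines and needs a real completion check to test against, which is presumably why the fixed sleep gets shipped instead.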
Unfortunately many of us computer consumers were forced to purchase a faster computer simply to be able to run Windows Vista. Vista must hold the record for the world's most commonly used slowest operating system. As was evidenced by the corporate non-acceptance of Vista, it can be more efficient and cheaper to stick with an existing operating system, or move to a faster one, than to buy a computer with a faster processor.
My experience is that the latest, greatest, fastest computers in the workplace tend to be given to those in the higher executive positions. Often these executives do not use any applications or perform any tasks that require such computing power, but have these computers as a status symbol.
In the workplace, those who could utilise the power of a faster processor are often denied such computers because they are not high enough up the hierarchy.
Forty years ago many of the functions that are commonly done on today's computers were already being performed. An example is word-processing. Yet those computers of forty years ago had only a fraction of the power of today's computers. So if you were only going to do word-processing or something equivalent, why would you need a computer more powerful than one of forty years ago?
Bloatware, or software bloat, is the basic answer. Whether you want certain functionality on your computer or not, you get it. And to be able to run that software, with functionality you may never use, you have to have a computer with processing power much greater than the computers of forty years ago.
Bloatware http://en.wikipedia.org/wiki/Software_bloat is where successive versions of a computer program accumulate functionality that end users do not use, and use more system resources than necessary, while offering little or no benefit to those users.
Rather than having one faster, more powerful computer, in some situations where batch processing is performed two or more less powerful computers may be able to complete the task faster than the single faster computer. In these situations the multiple computers need to be interconnected via a local area network or via the internet.
An example is some video rendering applications that allow multiple computers to render video. Where one powerful computer may take say 10 hours to render a large video source file, and a less powerful computer would take 15 hours to do the same task, two of the less powerful computers may complete the task in 8 hours.
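A rough sketch of why that works (the 15 hour and 8 hour figures are the illustration above; the overhead factor is my assumption, since splitting and merging the work is never free):

```python
def combined_hours(single_machine_hours, machine_count, overhead_factor=1.0):
    """Ideal completion time when identical machines share the work evenly.
    overhead_factor > 1 models coordination cost; 1.0 is the ideal case."""
    return single_machine_hours / machine_count * overhead_factor

# Two 15-hour machines with a perfectly even split: 7.5 hours.
print(combined_hours(15, 2))  # 7.5
# Allowing ~7% coordination overhead lands near the 8 hours mentioned above.
print(round(combined_hours(15, 2, 1.07), 1))  # 8.0
```

So the pair of slower machines still comfortably beats the single 10 hour machine, as long as the task can actually be divided up.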
Another, larger scale project is http://setiathome.berkeley.edu/. This project, the Search for Extraterrestrial Intelligence (SETI), takes the unused capacity of thousands of home computers, only while they are idle, via a screen saver application, and uses that otherwise wasted capacity to search through gigantic amounts of information acquired from radio telescopes.
Yet another example is Google. The Google search that gives you very fast results for any search you do is not performed by one or a few very powerful computers but rather by thousands of computers working in unison, each most likely less powerful than the computer you are reading this on.
See also http://boinc.berkeley.edu/
True cloud computing http://en.wikipedia.org/wiki/Cloud_computing is not only the storage of information on external computers but the processing of your information on computers other than your own. It is like purchasing electricity from the electricity company rather than generating your own.
As in the examples above, these computers in the cloud could number in the thousands, thus in some situations being able to dramatically outperform any one home computer.
This cloud processing, combined with virtualisation, is application specific but has the potential to dramatically change computing in the future. Corporates are already seeing the benefits of cloud computing in being able to better utilise the computing capacity they have and are buying. Only time will tell if these benefits will filter down to home users, such that they will be able to use very low cost ($20) mobile devices to communicate with the cloud and get processing power greater than the average home computer of today.
Have your say by clicking on the link below.