Has Moore’s Law finally come to an end, or has computer progress just slowed down? After all, if you look at the last 40 years of computer history and run the same experiment at any point, comparing a computer to one from 10 years earlier, you’ll see what I’m talking about.
Try going back to 1990 and comparing a computer from that year with one from 10 years earlier. They aren’t even in the same league. So I started doing some research, looking at various aspects of computers. I started with memory, checking at five-year intervals starting in 1975. At that point the typical computer had 1 or 2 kilobytes of RAM. Five years later, that amount had increased by a factor of 16, and five years after that it increased by a factor of 16 again.
Surely this trend of exponential growth couldn’t continue, right? Well, to go forward I’m gonna have to shrink this chart down some. There, that gives us some more room. Over the next five years, memory doubled. That’s still nowhere near the growth we saw before. Let’s shrink the chart down again and look at the next five years. This time it quadrupled. Let’s shrink the chart again, and wow, another 16-fold increase in RAM. Let’s shrink again, and check this out, and this, and look where we are today. So I think we can safely say that as far as RAM goes, there’s been no slowing down.
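Tallying up the growth factors just described gives a quick sanity check. This is a minimal sketch; the 2 KB starting point is my assumption, since the transcript only says “1 or 2 kilobytes” for 1975.

```python
# Back-of-the-envelope tally of the RAM growth factors described above.
# Each entry maps a year to the growth factor over the previous 5 years.
factors = {1980: 16, 1985: 16, 1990: 2, 1995: 4, 2000: 16}

ram_kb = 2  # typical RAM in 1975, in kilobytes (assumed starting point)
print(f"1975: {ram_kb} KB")
for year, factor in factors.items():
    ram_kb *= factor
    print(f"{year}: {ram_kb:,} KB")
# The 2000 line comes out to 65,536 KB, i.e. 64 MB -- a 32,768x
# increase over 25 years from those factors alone.
```

Even with the slower stretch around 1990, the compounding still lands in the tens of megabytes by 2000, which is roughly where typical home computers actually were.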
OK, so what about processing power? Let’s just look at clock speeds, starting in 1975, when the typical speed was 1 MHz. Five years later there was no real change. In 1985 we made some progress, and that progress continued into 1990. We’ll need to resize the chart again. You can see that CPU speed really started to take off by the year 2000; we were already 400 times faster than we were back in 1980. Now we’ll have to resize again, and you can see continuing exponential progress in clock speed. Oddly enough, you’ll notice it peaked, and by 2015 it looks like we went backwards. Well, there are multiple reasons for that. For one thing, clock speed is not really the best measurement of a CPU’s total power. Another thing to consider is that these computers started off as 8-bit machines; by the mid ’80s everyone was using 16-bit machines, 10 years later everyone was using 32-bit machines, and only fairly recently did we move to 64-bit machines. So keep in mind that beyond just the clock speed, these things can process a lot more data with each clock tick.
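The “more data per clock tick” point can be illustrated with a rough proxy: bytes moved per second if a machine handled one full word per cycle. This is a toy metric of my own for illustration, not a real benchmark, and the example clock speeds are assumed round numbers.

```python
# Toy proxy for the point above: a wider machine moves more bytes per
# clock tick, so throughput grows faster than clock speed alone.
def bytes_per_second(clock_hz, word_bits):
    return clock_hz * (word_bits // 8)

old = bytes_per_second(1_000_000, 8)        # 1 MHz, 8-bit (mid-1970s)
new = bytes_per_second(3_000_000_000, 64)   # 3 GHz, 64-bit (modern)
print(f"{new / old:,.0f}x")  # 24,000x, versus only 3,000x in raw clock speed
```

So even before counting cores, the jump from 8-bit to 64-bit words multiplies the clock-speed gain by another factor of 8.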
And not only that, but most modern computers have more than one core, and a core is essentially its own CPU. Most computers have anywhere from 2 to 4 cores, and some have as many as 8. So you can see this chart is not really reflective of CPU power. One way you might measure the difference in raw power is with something like Geekbench. The original MacBook, released in 2006, gets a score of 2,287, whereas the latest MacBook gets a score of 6,350. So if I were to modify this chart to be more reflective of raw CPU power, it would probably look more like this. I also compared things like graphics, resolution, and hard drive capacity, and I found the same exponential growth we’ve seen in the other aspects of computers, continuing up to the present day.
I even compared the average cost of a home computer in this chart, adjusted for inflation. You can see the cost dropped quite a bit for a while, then kind of leveled out, and it’s even going back up a little. That’s probably a result of most people switching from desktops to laptops. So the question I’m trying to answer is: why is a 10-year-old computer today still usable, when not too far in the past a 10-year-old computer was always obsolete?
If computer progress has not slowed, then what’s the explanation? Well, I have one possible theory, and it actually has less to do with the computer’s hardware and more to do with its software. It goes something like this. Now, this chart is not scientific, but hopefully it makes sense. If this bar represents the amount of CPU power a computer has, then this part represents how demanding the software is, using up almost all of the computer’s power. Once a new computer comes out that’s a little more powerful, the owners of that computer can enjoy a really speedy computing experience.
That is, until the next year, when new software comes out that requires a faster computer. Then the next year the cycle repeats, and repeats again. This game of cat-and-mouse has been going on for decades. However, I think what may have finally happened is that, with each successive year, software is still getting bigger, just not at the same rate that computing power is increasing, which leaves older computers capable of running modern software: maybe not as fast, but it still works.
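The cat-and-mouse theory can be sketched as a toy model. The growth rates below are made-up illustrative numbers, not measurements: the question is simply what fraction of current software’s demands a never-upgraded, 10-year-old machine can still meet.

```python
# Toy model of the cat-and-mouse idea: a year-0 machine has fixed power
# 1.0, while software demands start at 0.9 and compound yearly.
def usable_fraction(sw_growth, years=10, start_demand=0.9):
    """Fraction of current software demand a year-0 machine can meet
    after `years` of software growth (machine power fixed at 1.0)."""
    return 1.0 / (start_demand * (1 + sw_growth) ** years)

# Old era: software demands ballooned ~40% a year, chasing new hardware.
print(f"old era: {usable_fraction(0.40):.2f}")  # ~0.04 -> hopelessly obsolete
# Today: software still grows, say ~10% a year, slower than hardware.
print(f"today:   {usable_fraction(0.10):.2f}")  # ~0.43 -> slow but usable
```

When software demand compounds as fast as hardware, a decade-old machine ends up with a few percent of what’s needed; when software compounds more slowly, the same machine keeps a workable fraction, which matches the “maybe not as fast, but it still works” observation.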