I'm writing this on a 13-inch, Mid 2011 MacBook Air.
Recently I started looking for a new laptop, so I compared the specs of the newest 13-inch MacBook Air with mine. Here they are (default configurations):

MacBook Air (13-inch, Mid 2011):
1.6GHz dual-core Intel Core i5 with 3MB shared L3 cache
2GB or 4GB of 1333MHz DDR3 onboard memory
Intel HD Graphics 3000
64GB or 128GB flash storage
MacBook Air (13-inch, Early 2015):
1.6GHz dual-core Intel Core i5 (Turbo Boost up to 2.7GHz) with 3MB shared L3 cache
4GB of 1600MHz LPDDR3 onboard memory
Intel HD Graphics 6000
128GB PCIe-based flash storage
You basically get the same laptop with a better graphics card.
Sure, you now get Intel's Turbo Boost technology, but that's about it.
What about the benchmarks?
So I looked at the benchmarks, using the Geekbench Browser to check the aggregated benchmark data:
MacBook Air Mid 2011: averages about 1800*
MacBook Air Early 2015: averages about 2400*
That works out to a 33% increase in performance.
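As a sanity check, here's that percentage worked out in a quick Python snippet (1800 and 2400 are the rounded Geekbench averages quoted above):

```python
# Back-of-the-envelope check of the Geekbench numbers quoted above.
old_score = 1800  # MacBook Air Mid 2011 (rounded average)
new_score = 2400  # MacBook Air Early 2015 (rounded average)

increase = (new_score - old_score) / old_score
print(f"Performance increase: {increase:.0%}")  # → 33%
```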
Whatever happened to Moore's Law?
Moore's Law states that the number of transistors on a chip doubles roughly every two years. From 2011 to 2015, the count should therefore have quadrupled. Of course, due to the increased complexity of connecting all those transistors, that would not translate into a 4-fold increase in CPU power, but a 2-fold increase wouldn't be bad. Pollack's Rule (regarding microprocessor advances) points the same way: performance scales roughly with the square root of complexity, so 4 times the transistors should buy about twice the performance.
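To make the expectation concrete, here is a back-of-the-envelope sketch of that reasoning. It assumes Moore's Law as one transistor doubling per two years, and Pollack's Rule in its usual form, performance proportional to the square root of complexity:

```python
import math

# Moore's Law: transistor count doubles every 2 years.
years = 2015 - 2011              # 4 years between the two models
doublings = years / 2            # 2 doublings
transistor_factor = 2 ** doublings   # 4x the transistors

# Pollack's Rule: performance grows ~ sqrt(complexity).
perf_factor = math.sqrt(transistor_factor)   # ~2x the performance

print(f"Expected transistor growth: {transistor_factor:.0f}x")
print(f"Rough performance expectation: {perf_factor:.0f}x")
```

So even under the conservative Pollack estimate, four years should have delivered roughly double the performance, not a third more.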
However, what I'm seeing here is not a 300% increase, not even a 100% one, but a 33% increase in performance after 4 years.
What has happened?
Could someone enlighten me as to what has happened? Why is performance lagging?
Have the transistor shrinks been spent on lower power consumption rather than speed? Or has the GPU's power increased significantly instead?
Let me know what you think below.
*Higher scores are better, with double the score indicating double the performance.