Now that we're in the era of people wondering whether Moore's Law is ending, it's probably worth looking a little at what Moore's Law really is.
Way back in 1965, Gordon Moore (then at Fairchild Semiconductor; he would go on to co-found Intel) published a paper showing that, as the years went by, the number of transistors you could economically cram onto a piece of silicon kept going up. The more transistors per chip, the cheaper it was to make each chip, but at the same time the greater the chance that one of those transistors would have a defect and the whole chip would have to be thrown out. Between those two forces there was a happy medium number of transistors that would get you the most compute for your buck, and Moore observed in 1965 that this happy medium number of components tended to double every year as technology improved. He also observed that, with decreasing device size, the power used per transistor would decrease, so overall power consumption would remain manageable despite the explosion in the number of circuit elements.
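Moore's cost argument can be sketched numerically. This is a toy model with made-up numbers, not Moore's actual figures: assume a roughly fixed cost to fabricate a chip and a yield that falls as components are added, and the cost per working component bottoms out at an intermediate integration level.

```python
# Toy sketch of Moore's 1965 cost argument (all numbers are illustrative):
# adding components amortizes the chip's cost over more devices,
# but the chance that the whole chip works falls with every component added.

def cost_per_component(n, chip_cost=10.0, defect_rate=0.001):
    """Expected cost per working component on an n-component chip."""
    yield_fraction = (1 - defect_rate) ** n  # chance all n components are defect-free
    return chip_cost / (n * yield_fraction)

costs = {n: cost_per_component(n) for n in [10, 100, 500, 1000, 2000, 5000]}
best = min(costs, key=costs.get)
print(best)  # the "happy medium" integration level for these parameters
```

As fabrication improves (the defect rate falls), the optimum shifts toward more components per chip, which is the mechanism behind the doubling Moore observed.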
Of course, nobody called it Moore's Law back then. Moore continued talking about the future of the integrated circuit industry, especially about adding more, smaller transistors to integrated circuits, and it wasn't until 1975 that people began to really pay attention and started throwing around the term "Moore's Law." This confuses matters because it was around the same time, in 1974, that another famous paper was published by a group headed by Robert Dennard. It laid out some of the consequences of shrinking transistors, including the observation that smaller transistors could be clocked faster over time.
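The textbook form of Dennard's result can be summarized in a few lines. Under ideal constant-field scaling by a factor k, dimensions and voltage both shrink by k, gate delay shrinks by k (so clock frequency can rise by k), and power density stays constant even as you pack in k² more devices per unit area. A minimal sketch of those scaling relations:

```python
def dennard_scale(k):
    """Ideal constant-field (Dennard) scaling by factor k > 1.
    Returns the multiplier applied to each device quantity."""
    return {
        "dimensions (L, W, t_ox)": 1 / k,
        "voltage (V)":             1 / k,
        "current (I)":             1 / k,
        "capacitance (C)":         1 / k,
        "delay (CV/I)":            1 / k,      # so clock frequency rises by k
        "power per device (VI)":   1 / k**2,
        "power density":           1.0,        # k**2 more devices, each at 1/k**2 power
    }

print(dennard_scale(2))  # halving feature size: 2x clock, constant power density
```

The "power density stays constant" line is the one that mattered: it meant chips could get faster every generation without melting, right up until leakage broke the assumption.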
When "Moore's Law" came into being as a phrase, the size of transistors, how many it made sense to put on a chip, how fast they could be clocked, and how much power they used per calculation all improved in proportion to each other, and nobody really bothered distinguishing between them. At one point someone asked Moore about this and he issued a memo endorsing this broader usage of "Moore's Law." Sadly I don't know where to find it online, but I received a photocopy in my MIT course notes on the subject.
Everything went along quite well until 2005. It had long been well known that some charge leaked through transistors, but the quantities leaked were so small that nobody worried about them; the amount of charge used in switching bits from 0 to 1 and back was far larger. But as the generations of transistors went by, leakage kept growing relative to active switching power, and in 2005 the two curves intersected and people had to start changing how they designed their transistors.
With that, the relationship between transistor size and speed that Dennard et al. had discovered broke down. While transistors continued to shrink, they could no longer be clocked as aggressively, and clock rates have only progressed incrementally since then. We have continued to add more, smaller transistors to our chips, and we've managed to keep making them use less power. But one of the four elements is gone.
As I've written before, Moore's Law won't last forever. You can only shrink a transistor so much. But I wonder if I might not have been precisely right. Yes, transistors will eventually stop shrinking, but will the price of a transistor stop going down at the same time? Will we stop being able to make transistors more and more reliably? I think those improvements will end too eventually, but that part, the original part, of Moore's Law might be the last to run out of steam.
Thankfully, again, we know the fundamental limits on how efficient we can make a computational process, and we're only at 0.00001% of the theoretical limit. We've also been making computation more power efficient for much longer than we've been making transistors: through mechanical computers, then electromechanical ones, then vacuum tubes, and finally the transistor. So it makes sense that progress along that axis will continue even after Moore's Law is completely over, though who knows what form it will take.
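The fundamental limit in question is the Landauer limit: irreversibly erasing one bit of information must dissipate at least kT·ln 2 of energy, where k is Boltzmann's constant and T the temperature. At room temperature that works out to a few zeptojoules per bit:

```python
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K (exact SI value)
T_ROOM = 300.0          # room temperature, kelvin

# Minimum energy dissipated per bit erased, per Landauer's principle
landauer_limit = K_B * T_ROOM * math.log(2)
print(f"{landauer_limit:.2e} J per bit")  # ~2.87e-21 J
```

Real logic operations today dissipate many orders of magnitude more than this, which is why there's so much headroom left for efficiency gains even after transistor scaling ends.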