An interregnum is a gap in governance, most commonly when a monarch dies without a child old enough to take over. For decades the world has grown used to the idea that computers would get better and better year by year as engineers were able to fit more and more transistors onto a piece of silicon economically in a process described by Moore's Law. Sadly, that law looks to be running out of steam. So we're going to have to go through an interregnum while people discover some other substrate for computation that can be pushed further than silicon transistors could.
Transistors have come a long way since the field-effect transistor was first patented by Julius Lilienfeld in 1926. Nobody at the time could actually build one, but in 1947 Bardeen, Brattain, and Shockley at Bell Labs figured out how to make a practical transistor for use in electronics such as radios, and then in 1954 the first transistorized computer was developed: TRADIC. It was an amazing device at the time because computers were usually room sized, whereas TRADIC was only the size of a refrigerator. It also consumed only 100 watts of power instead of many kilowatts and was nearly as fast as the best computers built with the then-standard vacuum tube. Sadly, transistors were still much more expensive than vacuum tubes.
Then, in 1959, people built the first silicon chip with more than one transistor on it. People started putting more and more, smaller and smaller transistors on pieces of silicon. In 1965 Gordon Moore noticed what was happening and predicted that by 1975 people would be able to fit over 65,000 components on a single chip. Sure enough, transistors continued to shrink exponentially, and by 1975 people were starting to talk about "Moore's Law."
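Moore's 1965 extrapolation was simple compounding arithmetic, which can be sketched as below. The starting count of 64 components and the one-year doubling period are the commonly cited readings of his original paper, not figures from this post:

```python
# Moore's 1965 extrapolation, sketched: component counts roughly
# doubling every year, starting from about 64 components in 1965.

START_YEAR, START_COMPONENTS = 1965, 64

def projected_components(year, doubling_period_years=1.0):
    """Project component count assuming steady exponential doubling."""
    doublings = (year - START_YEAR) / doubling_period_years
    return START_COMPONENTS * 2 ** doublings

# Ten annual doublings from 1965 to 1975: 64 * 2**10 = 65536,
# matching the "over 65,000" prediction.
print(projected_components(1975))
```

Ten doublings from a few dozen components is all it takes to reach tens of thousands, which is why the prediction sounded so audacious at the time.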
And ever since then the number of transistors on a chip has doubled more or less every 18 to 24 months. For a very long time, until 2005 or so, shrinking transistors also brought faster clock speeds. How much a transistor leaks is governed by its operating voltage and the size of its gate, and until we hit 90 nanometers around 2005 the amount that transistors leaked was tiny compared to the power required to flip them from a 0 to a 1, so voltages could keep scaling down along with the transistors. Since then leakage currents have become comparable to switching currents, so voltages have largely stopped shrinking, and clock speeds have stopped rising with them. A piece of silicon can only dissipate so much heat per square centimeter.
And now the shrinkage of the transistors themselves looks like it will begin failing. Here's a Nature article from last year predicting its demise. Here's Sophie Wilson, the genius behind the original ARM processor, saying she doesn't think there's much time left. And Intel has repeatedly delayed moving off of 14 nanometers, with constantly slipping deadlines for Cannonlake, its first 10 nanometer chip. The end is not here yet, but it looks like it will certainly arrive by 2025.
Thankfully there's good reason to believe that this isn't anything like the end of progress in computation. For a long time steam engines grew steadily more efficient, but eventually that stopped, because there's a fundamental limit to how efficient a heat engine can be, bounded by an ideal called the Carnot cycle. When engines got close to that bound, progress slowed down.
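The Carnot bound itself is a one-line formula: no heat engine running between a hot and a cold reservoir can beat an efficiency of 1 − T_cold/T_hot (temperatures in kelvin). A quick numerical sketch, with example temperatures that are my own illustration:

```python
# The Carnot bound: maximum efficiency of any heat engine operating
# between two reservoirs, temperatures in kelvin.

def carnot_efficiency(t_hot_k, t_cold_k):
    """Upper bound on heat-engine efficiency between two reservoirs."""
    return 1.0 - t_cold_k / t_hot_k

# An engine fed 500 K steam, exhausting to 300 K surroundings,
# can never convert more than 40% of the heat into work:
print(carnot_efficiency(500.0, 300.0))  # -> 0.4
```

Real engines approached but never reached this ceiling, which is exactly the pattern the post is drawing an analogy to.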
Luckily we understand the firm physical limits on how efficiently computation can be performed, and we still have a ways to go before we hit them. Gates in current high-end silicon take around an attojoule to perform a simple 'and' or 'or' computation, but Landauer's principle says that it's possible to do it for 2.75 zeptojoules, roughly 360 times less. And thankfully, by reducing the ambient temperature we should be able to do even better. Regarding speed there's another limit, Bremermann's, that we're even further from.
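Landauer's limit comes straight from thermodynamics: erasing one bit costs at least k·T·ln 2 joules, where k is Boltzmann's constant and T the ambient temperature. A minimal sketch of the headroom argument, taking the attojoule gate-energy figure from above as a round number:

```python
import math

# Landauer's principle: erasing one bit costs at least k*T*ln(2) joules.

BOLTZMANN = 1.380649e-23  # Boltzmann's constant, J/K

def landauer_limit(temperature_k):
    """Minimum energy in joules to erase one bit at a given temperature."""
    return BOLTZMANN * temperature_k * math.log(2)

room_temp_limit = landauer_limit(290.0)  # ~2.77e-21 J, a few zeptojoules
current_gate_energy = 1e-18              # ~1 attojoule per gate operation

print(room_temp_limit)
print(current_gate_energy / room_temp_limit)  # hundreds-fold headroom

# Cooling helps: the limit scales linearly with temperature.
print(landauer_limit(4.0))  # near liquid-helium temperatures
```

Since the bound is linear in T, cooling the substrate lowers the theoretical floor proportionally, which is why the post notes that reducing ambient temperature buys additional room.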
So what could take the place of Moore's Law? Off the top of my head: different substrates such as diamond or carbon nanotubes. Computation through magnetic spin. Computation through photons. Quantum computers. Ballistic electrons. Nano-mechanical systems. Nano-electro-mechanical systems. Pure chemical systems like DNA computing. There are quite a few options, and all are far from ready for commercial use. Still, there's no reason to think we won't be able to make one of them work eventually. In the meantime maybe we'll face stagnation, or maybe we'll have a golden age of computer architecture where we learn to do more with the transistors we have. Only time will tell.