Saturday, December 17, 2011

Resistive RAM

It seems that there's now something new under the sun in the world of computer memory. Way back in 1971 Leon Chua noticed that there ought to be a 4th type of basic circuit element, which he called the memristor, alongside the traditional resistors, capacitors, and inductors. Then, just a few years ago in 2008, a group from HP was actually able to create one of these things that Chua had dreamed up. I was expecting that someday, maybe by 2018 or so, people would be starting to make things with memristors that were actually usable.

I've recently found that I was way too pessimistic, though: it seems that people have already made cells of resistive RAM (RRAM), and that once they can be made cheaply in bulk it's going to bring huge changes. If everything works out we'll have memory that is just as fast as RAM but which is non-volatile, that is, it can continue to store information even when it loses power.

For many years computers have had separate memory and storage. There are your disk drives, which you store all your information on. They have a lot of capacity, but they're very slow. You also have your RAM, your main memory, which is fast, but when you turn off the power to your computer all of the information stored in it goes away. It's also more expensive per gigabyte than hard drives.

Recently there's been a trend to use flash memory instead of hard drives. This started with portable media players like the iPod, but now people use flash memory in their phones, their tablets, and some people even use it in their laptops. The advantages of flash are that it is a lot faster than a traditional hard drive and it takes less power to use. The disadvantage is that it's more expensive than a hard drive for a given amount of storage; still, if you don't actually need more than a hundred gigabytes in your computer it can make a lot of sense.

Why don't people just replace their RAM with flash? Well, there are two reasons. First, it takes a relatively long time to write information to a block of flash memory. Second, flash eventually wears out. It takes quite a while to wear out, and most people don't write to their storage that frequently, but main memory is written to far more often and flash wouldn't last long in that role.

But RRAM can get around all these disadvantages. Writing to an RRAM cell is more complicated than reading from it, but only by a little bit; it's not nearly the huge production that writing to a flash cell is. Also, while RRAM apparently does wear out, it does so much more slowly than flash does. Flash can survive somewhere between a few thousand and a million writes before it wears out, depending on how it's manufactured. The researchers at IMEC say that their cells can last for a billion write cycles, and the HP/Hynix industry consortium says that the cells they're starting to produce can last for a trillion cycles. Because modern computers have caches between their main memory and their processors which absorb repeated writes to the same place, a trillion writes ought to be more than enough to make RRAM practical as main memory.
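To get a feel for why a trillion cycles is enough, here's a quick back-of-the-envelope sketch. The per-cell write rate is a hypothetical worst case I picked for illustration (assuming the caches have already absorbed most repeated writes), not a measured figure:

```python
# Back-of-the-envelope endurance comparison: flash vs. RRAM as main memory.
# Assumption (hypothetical): even after the CPU caches absorb most repeated
# writes, a "hot" cell still gets rewritten 1,000 times per second.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_until_wearout(endurance_cycles, writes_per_second):
    """How long one cell lasts under a constant per-cell write rate."""
    return endurance_cycles / writes_per_second / SECONDS_PER_YEAR

print(f"flash (1e6 cycles):  {years_until_wearout(1e6, 1e3):.6f} years")
print(f"RRAM  (1e12 cycles): {years_until_wearout(1e12, 1e3):.1f} years")
# Flash at a million cycles dies in minutes at this rate;
# trillion-cycle RRAM lasts for decades.
```

Even under this deliberately pessimistic write rate, the trillion-cycle figure leaves a comfortable margin over the lifetime of the machine.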

This isn't the first time a type of memory has been both fast and non-volatile, but previous attempts have been too expensive. Memory like MRAM has stored bits in cells that were far larger than those used by DRAM or flash. Eyeballing the picture of the RRAM cell that IMEC showed off earlier this month, it looks like their cells are about 2,000 square nanometers, the same area you would expect from the DRAM cells that are currently in early production.

So we've got this new technology that can feasibly replace both computer memory and computer storage, and unify them into the same substance. But will it? We seem to be past the stage where manufacturing difficulties are a huge risk, but there are other possible problems. For starters, it seems that HP, the company which developed memristors in the first place, has the whole technology wrapped up in patents. If they get greedy and charge more for their technology than people are actually willing to pay, it wouldn't be the first time a patent holder hobbled its own invention. Look at what happened to RAMBUS.

The RRAM cells used by IMEC seem to use hafnium, which is expensive. It's not so expensive that people who make processors are reluctant to use it, however, and it's unclear whether the cells that HP is planning to take into production are also hafnium-based.

And finally there's the question of re-engineering our computers. The distinction between memory and storage has existed for almost as long as there have been computers. Our system architectures, operating systems, and software are all built assuming these are separate things. Of course, this is a problem that people have confronted before. Back in the 1970s people didn't realize how much cheaper disk drives were going to get, and so planned for a transition similar to the one it now looks like we're going to go through. IBM developed operating systems using single-level storage, where the differences between RAM and disks were abstracted away. Unfortunately, only one fairly obscure operating system still uses it. Luckily, every common operating system today has a notion of specific files being memory mapped, that is, read and written as if they were ordinary memory while the operating system shuttles the data to and from disk as needed. Perhaps we'll see a transition period where operating systems simply leave files stored in RRAM in place when they are memory mapped, and software writers start to use memory mapping more and more. This would mean that the transition could be gradual, with existing software working the same way even as the systems it runs on change underneath it.
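For the curious, here's what memory mapping looks like from a program's point of view, a minimal Python sketch (the filename is made up for the example). The program writes through the mapping exactly as if it were RAM, and the operating system persists the bytes to the backing file; on a hypothetical RRAM system, that backing file could simply live in the same physical medium as main memory.

```python
# Minimal demonstration of a memory-mapped file: slice assignments into
# the mapping behave like ordinary memory writes, and the OS makes them
# durable in the backing file.
import mmap
import os

PATH = "example.dat"  # made-up filename for the demo

with open(PATH, "wb") as f:        # create a 4 KiB backing file
    f.write(b"\x00" * 4096)

with open(PATH, "r+b") as f:
    with mmap.mmap(f.fileno(), 4096) as mem:
        mem[0:5] = b"hello"        # plain slice assignment, like memory
        print(bytes(mem[0:5]))     # b'hello'

with open(PATH, "rb") as f:        # the write survived in the file
    print(f.read(5))               # b'hello'

os.remove(PATH)
```

Software written this way wouldn't need to change at all if the "disk" underneath the mapping quietly became RRAM.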

In any event, it looks like RRAM is going to be the future.


Saturday, April 30, 2011

Book Review: The Myth of the Rational Voter

Recently I read The Myth of the Rational Voter by Bryan Caplan and I feel like I got a lot out of it, but from a very surprising direction. Like many people who are interested in economics, I'd long been aware of the idea of rational ignorance: because it takes time and effort to become informed, many people won't bother in situations where there is no direct benefit to them. Because the effect any given person has on national elections is utterly tiny, it doesn't make much sense for individuals to invest much time and effort in learning about political issues, since any good they do is spread out among 300 million of their fellow citizens. This was my explanation for what problems there are in the US political system before reading the book, and though it came strongly recommended(1) I figured it would just be an elaboration on this theme.

I was very wrong, because Caplan spends a few pages near the start of the book utterly demolishing this argument as a sufficient explanation for why democracies sometimes do stupid things. The short version is that if the ignorant majority votes randomly their votes will tend to cancel each other out, and if there is a minority which votes rationally it will end up dominating the outcome, since after all Aumann's Agreement Theorem says that rational people should always agree with each other. Ok, but more seriously, you would still expect that in areas where there is fairly strong expert agreement (global warming is man-made, minimum wage laws hurt the worst off, etc.) we would see large-number effects give victory to the experts.

So it requires systematic bias for the public to disagree with the experts, and Caplan spends most of his book talking about biases and why in economics we should believe the experts instead of the lay public. He also mentions toxicology, and how people tend to want to treat danger as a matter of contagion rather than "the dose makes the poison". I'm sure I could come up with more examples from other areas, but I'm also sure that you, gentle reader, can come up with your own, and even if we can't cite statistics we might as well go on.

In the last part the book discusses some ways that this affects politics. Politicians want to be elected, and it's easier to do that by confirming the biases of the voters than by arguing with them. But once elected they want to be reelected, which means that they have to ensure that things go well, which means that once in office they have to talk and listen to experts, and then maybe reverse themselves on things they promised in order to get elected.

So the upshot of the book seems to be that lying politicians are better than the alternative: politicians who actually carry out all the promises they made to get elected. Not a pleasant conclusion, and one I'll probably blog more about later. But despite my summary here you should all still read the book.

(1) At least Megan McArdle and someone else had said good things about it at some point, if I remember correctly.

Sunday, March 27, 2011

Nukes and artificial leaves

Ah, it seems my original ambitions fell a bit short and it's been a year since my first post. Oops. Let's hope I can do better in the future.

My thoughts have been turning to energy this last week, mostly due to the Fukushima nuclear power plant problems. I'd briefly considered putting "power plant disaster" there, but the scope of the damage from radiation seems so relatively small that it feels like doing so would trivialize the tsunami. Some back-of-the-envelope calculations make me think that even if you assume a linear no-threshold dosage model for how radiation causes cancer (i.e. the absurdly conservative assumption) the plant meltdown will end up causing some tens of cancers.
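That back-of-the-envelope calculation looks roughly like this. The risk coefficient is the standard linear no-threshold figure of about 5% excess cancers per person-sievert, but the collective dose below is a hypothetical round number chosen for illustration, not a measured value for Fukushima:

```python
# Linear no-threshold (LNT) sketch. RISK_PER_PERSON_SV is the standard
# LNT coefficient (~5% excess cancers per person-sievert); the collective
# dose is a HYPOTHETICAL round number, not a measured Fukushima figure.
RISK_PER_PERSON_SV = 0.05
assumed_collective_dose_person_sv = 1_000  # hypothetical illustration

excess_cancers = RISK_PER_PERSON_SV * assumed_collective_dose_person_sv
print(round(excess_cancers))  # 50, i.e. "some tens of cancers"
```

The point is just that even multiplying a generous collective dose by the most conservative risk model yields a toll dwarfed by the tsunami itself.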

I'm generally a supporter of nuclear energy. It's a lot better than, say, coal, which in the US kills tens of coal miners each year and over ten thousand people through pollution. It's certainly possible to do better than that, but it's possible to make nuclear plants much safer as well. Nuclear power does produce radioactive waste, and it's impossible to make that waste perfectly safe... but it's not very hard to make the waste far safer than the uranium ore that was mined to create it. I mean, uranium ore is just sitting near the surface in Colorado, where it can get into groundwater and everything.

But realistically, I suspect that after Fukushima there is little hope for the development of new nuclear reactors in the US any time soon. But really, I'm starting to think that renewable energy might just end up being good enough in the long run. And by long run I mean fifteen years or so. Here is a very nice article from Scientific American about how much solar electricity has been improving. They say that the cost of generating solar electricity will be as low as what consumers pay by 2021. This is misleading for a couple of reasons: first, the cost that consumers pay includes things like transmission as well as generation, and second, it's when solar gets as cheap as what we have now that it becomes economical to start building solar instead of oil or whatever, not to rip up all our existing fossil fuel infrastructure. But the point stands that even if photovoltaics are impractical at the moment, it seems likely that their time will come.

In the shorter term, natural gas is the least bad of the fossil fuels, and we've recently discovered that the US has much more extractable gas than previously thought. Wikipedia mentions a hundred years' worth, but if we're going to cut down on coal use that number is going to have to drop. Still, things are looking much better there than they were a few years ago.

And then there's the artificial leaf that some researchers at MIT announced recently. It is apparently "10 times more efficient...than a natural leaf", which would put it somewhere around 30-60% energy conversion, since Wikipedia says that photosynthesis is usually 3-6% efficient. That isn't great compared to solar cells, until you realize that this elides all the inefficiency involved in storing the energy a photovoltaic produces. This isn't the only technology we need to develop before hydrogen-powered cars become feasible, but it was the one I would have been most skeptical about. It still remains to be seen how economically these things can be produced in bulk, but in the long run I'm optimistic.
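The efficiency figure is simple multiplication, taking the quoted numbers at face value:

```python
# "10x a natural leaf", checked against Wikipedia's 3-6% figure for
# photosynthesis. Pure arithmetic; the only assumption is taking the
# quoted numbers at face value.
natural_leaf_low, natural_leaf_high = 0.03, 0.06   # photosynthesis efficiency
leaf10_low = 10 * natural_leaf_low
leaf10_high = 10 * natural_leaf_high
print(f"{leaf10_low:.0%} to {leaf10_high:.0%}")    # 30% to 60%
```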
