Monday, February 27, 2012

Dark Silicon Followup

Something I saw this past week that prompted me to think about dark silicon again was a project Ubuntu is working on, where you have a phone that is mostly a normal Android phone, but when you plug it into a computer screen and keyboard it can act as a full desktop operating system.  Since a modern smartphone is much more powerful than the desktops of ages past, and since we tend to do more and more things on far away servers, this seems like it might be a model for the future of computing.  Your computer fits in your pocket and you use it as a phone most of the time, but then you plug it into some sort of dock and it becomes your desktop.

But it strikes me that in a future like that you're likely to have a bunch of computational resources sitting there in your phone that only light up when the phone has access to a hefty power source.  Heat dissipation out of the phone will still be a hard limitation since there's no way you can fit a fan in a phone form factor, but there's still room for quite a bump.

Pushing people off bridges, and consequences.

Pushing people off of bridges - but only to save the lives of others, of course - has long been one of the staples of debate in moral philosophy.  One famous formulation of the Trolley Problem is more or less "Suppose you see an out of control trolley about to run over five people.  Is it moral to push a fat person under the wheels if it means that only he will die instead of the five others?"  This started out as a debate among philosophers, then became a tool for cognitive scientists, who ask people about this topic in surveys, often with some variation.  What if the one person is your mother?  What if you throw a switch instead of having to push the person yourself?

Researchers have found out many fascinating things about how people respond to moral problems, or at least say they would respond, using this problem.  I'll plug Thinking, Fast and Slow here as an excellent overview of modern cognition research in a lot of areas, including this one.  Strangely for people who try to describe morality in terms of simple abstract rules, almost everyone would throw a switch to divert the trolley from hitting five people onto a course where it would hit only one, while very few would actually push someone themselves.

I'd tend to explain that with two forces.  First, in real life I can be much more certain that pushing someone will result in harm than I can be that, further down the track, the trolley will cause harm by itself.  It could be diverted by someone else, I could be wrong about where it was going, etc.  Second, there is the notion of blame.  I wasn't to blame for the out of control trolley, so I can't be blamed for the five deaths, but if I push someone onto the tracks I'm now the proximate cause of their death.  In matters of abstract morality you can talk about partial responsibility, but for lynch mobs it's usually just a matter of finding the one person who is "to blame".

However, I recently read a blog post discussing the finding that people are much more likely to push if all the participants are related than if they are strangers, going from about a quarter to about half.  The blog post author, Robin Hanson, suggests that being related makes the stakes higher, enough so that more people are willing to violate social norms.

That's possible, but I think a stronger consideration would be that in the case of strangers the relatives of the person you killed are mostly disjoint from the people whose relative you saved, while in the case where everyone is related, most of the people most upset by the killing will also be most relieved by the outcome.  I'd suggest running a new experiment, where the person pushed and the five on the tracks are all related to each other but not to the person answering the question.  My bet would be that closer to half than to a quarter would end up saying that they would push.

Wednesday, February 8, 2012

Is there no such thing as bad publicity?

Recent events surrounding the Susan Komen breast cancer charity and Planned Parenthood have gotten me thinking about the idea that all publicity is good publicity. It seems to me that there are times when bad publicity can actually be bad and others when it cannot, and that this mostly has to do with the nature of the recipient rather than with the details of the scandal.

As you might expect, Wikipedia has a long list of instances where scandal has brought people great success, and it seems that these cases have a lot in common.

If your primary opponents are obscurity and apathy, then the rush of attention a scandal brings can be of huge value. Even if the majority of the people who hear about you are so offended that they don't want to have anything to do with you, some won't be, and through them you'll gain from your newfound notoriety. This is why some early 20th century artists explicitly tried to attract the ire of morality crusaders. The crusaders weren't the target audience anyway, but they'd let people who were interested know what was going on.

On the other hand, if you're already well known, your potential to gain from notoriety is much smaller. And if you can suffer from disapproval, you might be even worse off. The worst that people who disapprove of a business can do is often just boycott the product, but if a politician or political movement has to appeal to a majority, then turning people against it can be a serious negative consequence.

So where does this leave Susan Komen and Planned Parenthood? Both experienced a huge surge in donations with the recent fracas, but both have more than just the short term to worry about. If Susan Komen has attracted a large number of conservatives who just like to see someone stick it to Planned Parenthood, but who don't care as much about breast cancer, then this will just be a flash in the pan. That would come down to how much social conservatives care about breast cancer versus social liberals. But, of course, Susan Komen soon reversed course, so now we're left asking how much the appearance of inconstancy will hurt them, versus more widespread public knowledge of their mission. I'd guess that their revenues will be up a year from now, but I'm not totally sure. Remind me in a year, and we'll see how right I was.

Monday, February 6, 2012

Dark Silicon (dun dun dun!)

For many years, the major limitations of CPU design have been about power. Back in the day, this wasn't much of a problem. Every year transistors got smaller, which meant that you needed less current to flip them from one state to another, which meant that you could flip them more frequently without doing any more (thermodynamic) work than you had done with the last generation of chips.
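The old scaling regime described above can be sketched numerically. Dynamic switching power goes roughly as capacitance times voltage squared times frequency, so shrinking each transistor dimension by a factor s (the specific numbers below are illustrative, not measurements of any real process) let you raise the clock while power density stayed flat:

```python
# Illustrative sketch of classic (Dennard) scaling. Dynamic switching
# power is roughly P = C * V^2 * f. Shrinking a transistor's linear
# dimensions by a factor s cuts capacitance C and voltage V by s, which
# lets frequency f rise by 1/s while power per transistor falls.

def dynamic_power(c, v, f):
    """Dynamic switching power: capacitance * voltage^2 * frequency."""
    return c * v * v * f

s = 0.7  # linear shrink factor per generation (illustrative)

base = dynamic_power(c=1.0, v=1.0, f=1.0)
# Smaller, lower-voltage transistor clocked 1/s times faster:
scaled = dynamic_power(c=s, v=s, f=1.0 / s)

# Power per transistor drops to s^2 of the old value -- and since the
# transistor's area also shrank by s^2, power *density* is unchanged.
print(base, scaled)
```

The punchline is the last comment: power per transistor and area both fall by s², so the chip as a whole never got hotter, generation after generation. Leakage is what broke this bargain.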

But then something changed. Leakage current reared its ugly head. As transistors became smaller they also became less substantial. Whereas once the trickle of current that would leak through a transistor when it was off would be tiny compared to the stream it would take to shift it from off to on, as transistors became smaller they became more and more permeable. And to make it even worse, shrinking transistors meant that even if overall power usage remained the same, that power was concentrated in a smaller and smaller area.

Something had to give, and it did. After the ill-fated Pentium 4, Intel figured out that using more, slower transistors spread the power usage over a larger area, and the lower voltage those transistors operated at meant that leakage retreated and was again less important than the fundamental power needed to drive the transistors from one state to another.

Adding more transistors to execute more instructions at once only goes so far, however, because often one instruction depends on the previous one, and finding places where they don't gets harder and harder the more you try to do it. So one way which CPU designers tried to use all these extra transistors to speed things up was by duplicating the entire CPU, creating more cores. But what happens when you have more cores than you can really use? That's when we start entering the realm of dark silicon.
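The diminishing returns described above are usually summarized by Amdahl's law: if only a fraction p of a workload can run in parallel, n cores buy you at most 1 / ((1 - p) + p / n) speedup, with a hard cap of 1 / (1 - p) no matter how many cores you add. A minimal sketch:

```python
# Amdahl's law: the speedup from n cores when only fraction p of the
# work parallelizes. The serial remainder (1 - p) dominates as n grows.

def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# Even a 90%-parallel workload tops out at 10x, so piling on cores
# yields less and less -- the dark silicon predicament in miniature.
for n in (2, 4, 8, 64, 1024):
    print(n, amdahl_speedup(0.9, n))
```

Each doubling of core count recovers less than the last, which is exactly why the marginal transistor eventually does more good as something other than another core.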

It's a well-known fact that specialized hardware is almost always faster than general purpose hardware. This is why we have graphics cards: a chip designed explicitly to do 3D graphics can make assumptions about what it's doing, neglect areas that aren't needed, and encode rules directly into silicon instead of having to fetch abstract rules from memory. Given a graphics chip with a certain number of transistors and a certain power budget, it would take many general purpose CPUs of the same size to do as much work.

But why stop with graphics? If specialized circuits can do a task faster and more efficiently, why not use them for as many things as possible? As Moore's law keeps working, the number of transistors you can cram onto a chip keeps going up, while the number of general purpose CPUs you can usefully put on that chip doesn't, so it makes more and more sense to put specialized circuitry on board. Now mind you, those same transistors could be used for more cache, but as caches get bigger, each marginal transistor spent on cache becomes less and less valuable. Sooner or later a specialized processor for media encoding or encryption starts to look like a better use for those marginal transistors.
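The cache argument can be made concrete with a common rule of thumb: miss rate falls roughly with the square root of cache size. The exponent and baseline numbers below are assumptions for illustration, not measurements of any particular chip:

```python
# Illustrative diminishing returns on cache, using the rule-of-thumb
# power law miss_rate ~ size^(-alpha) with alpha = 0.5 (an assumption,
# not a measured value for any real workload).

def miss_rate(cache_mb, base_miss=0.10, base_mb=1.0, alpha=0.5):
    return base_miss * (base_mb / cache_mb) ** alpha

# Each doubling of the cache recovers fewer misses than the last one,
# so the marginal transistor spent on cache keeps losing value.
for mb in (1, 2, 4, 8, 16):
    print(mb, miss_rate(mb))
```

Going from 1 MB to 2 MB removes far more misses than going from 8 MB to 16 MB, which is the sense in which a specialized encryption or encoding block eventually wins the argument for those transistors.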

And sure enough, that's already happening. Intel's newest chips have specialized circuits for both encryption and media encoding. AMD is planning similar things, and it looks like the age of the co-processor is about to begin. This is called "dark silicon" because most of the time it just sits there, dark and unpowered, not using any power at all. But when you need to do something it's suited for, it thrums to life and tears through whatever task you throw at it before growing silent once again.

As time goes on we're likely to see more and more such accelerators come into being. What will be next? More Java accelerators? Physics accelerators? Haskell accelerators? Configurable blocks that can become whatever is needed? In any event, operating systems will have to evolve to dispatch tasks to whatever specialized hardware can handle them, or else emulate those specialized functions in software. It looks like Apple might be leading the way here, but the final result might end up looking very different.
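The dispatch-or-emulate idea can be sketched as a registry that prefers a hardware-backed implementation when one exists and falls back to software otherwise. Everything here is hypothetical scaffolding, not any real OS interface; the software fallback just uses Python's standard hashlib:

```python
import hashlib

# A minimal sketch of accelerator dispatch: a runtime keeps a table of
# hardware-backed implementations per task and falls back to software
# emulation when no specialized circuit is present. The registry and
# task names are hypothetical, not a real OS API.

def sha256_software(data):
    """Software fallback, always available."""
    return hashlib.sha256(data).hexdigest()

class Dispatcher:
    def __init__(self):
        self._hw = {}  # task name -> hardware-backed implementation
        self._sw = {}  # task name -> software fallback

    def register(self, task, impl, hardware=False):
        (self._hw if hardware else self._sw)[task] = impl

    def run(self, task, *args):
        # Prefer the specialized circuit; emulate in software otherwise.
        impl = self._hw.get(task) or self._sw[task]
        return impl(*args)

d = Dispatcher()
d.register("sha256", sha256_software)  # software path always registered
digest = d.run("sha256", b"dark silicon")
```

On a machine with a crypto accelerator, the same `run("sha256", ...)` call would hit the hardware entry instead, with callers none the wiser, which is roughly the evolution the paragraph above expects from operating systems.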
