Saturday, December 14, 2013

The Mill

Enough public policy, time for more shiny technology.  For a while now I've been following a series of lectures that Ivan Godard has been giving for a group called Out Of The Box Computing, which has some seriously ambitious plans to totally re-think how computers work.  I'm afraid those videos (and the rest of this post) are going to get a bit technical, so fair warning.

It's sort of interesting to see how little the base abstractions governing how we interact with computers have changed over the last 50 years.  Back then instructions executed one by one, taking however many clock cycles they took before the processor passed on to the next instruction. 

First people pipelined computers: instructions were still started one at a time, but a new instruction could start before the previous one had finished.  Then people created computers that could issue more than one instruction with each cycle of the clock.  Finally, people created computers that could safely re-arrange the order of instructions on the fly, looking down the instruction stream for instructions they can execute early, out of order, without hurting the correctness of the program.  But while the compiler can see the complex web of interdependencies between instructions before it serializes them into a program, and while the CPU can reconstruct them later, the program itself says nothing about this.
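As a toy illustration of what that web of interdependencies looks like, here's a sketch in Python (just pseudocode for the idea, not anything resembling real machine code):

    x, y = [5], [7]

    a = x[0] + 1   # depends only on x, can start immediately
    b = y[0] * 2   # independent of a, could run at the same time
    c = a + b      # depends on both a and b, must wait for them

A compiler can see that the first two operations are independent, and an out-of-order core can rediscover that at runtime, but the serialized instruction stream itself records only one of the several legal orders.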

All of this has costs, and it's those costs the Mill is trying to get around by making actual binary programs preserve these interdependencies.  They aren't the only people who have thought about this.  There was a school of instruction set design called VLIW that tried to approximate this by encoding which instructions could be issued at the same time.  But while that is part of the important information, the model it follows assumes that each group of instructions finishes before the next one begins, so there's still a considerable amount of information being lost.
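To make the VLIW idea concrete, here's a hypothetical schedule sketched as Python data (the bundles and mnemonics are made up for illustration):

    # Each inner list is one issue bundle: operations the compiler
    # has declared safe to issue on the same clock cycle.
    schedule = [
        ["load r1, x", "load r2, y"],   # two independent loads issue together
        ["add r3, r1, r2"],             # modeled as starting after bundle 0 is done
    ]

The encoding says which operations issue together, but nothing about how long each one takes, which is exactly the latency information the Mill wants to keep.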

Now, what the Mill is purporting to do comes at a cost.  The same CPU you have in your laptop today can execute code that was written decades ago for a machine that was entirely different inside.  By exposing latency and execution width in the instruction set, the Mill forces programs to be recompiled for each new generation of processors.  For something like your desktop PC, where you expect to buy software in pre-compiled form and use it for years, that is simply not workable.  So I don't think the Mill has any chance of ever conquering the world of PCs.  But while the world of PCs looms large in our minds, since those are what we interact with every day, the world of computers is much larger than that.  The high-end embedded processors you would find in a cell phone tower, or the servers inside Google or Facebook, only run a very limited set of software, and that software is often written directly for the task at hand and could be recompiled if need be.

On the other hand, the history of computing is littered with designs that were too radical, that tried to change too many things at once and failed because the task of making each and every innovation work was just too much.  Then again, many of the innovations the team is proposing, such as backless memory or their load semantics, are wonderful ideas that ought to work quite well even apart from all the other new ideas they're trying to bring forward.

So if you've studied programming or computer architecture, go ahead and watch their videos.  Each one blew my mind at least once.

Fukushima vs Coal

This is the result of a comment I left on Hacker News a while ago, which I thought might be worth expanding into a blog post.  We all know about the disaster at the Fukushima nuclear plant, and have probably heard about the many things that have gone wrong there.  I've often seen people use Fukushima as an example of why nuclear power is too dangerous for us to use and why we have to move to other methods of generating electricity.  Some people have claimed that Germany is replacing its nuclear power generation with coal after Fukushima, while others say that's wrong: Germany is merely replacing nuclear with renewables that would otherwise have replaced coal, with the growth in its use of coal being merely incidental.

But at the same time, how things get talked about by the media is often a poor guide to how dangerous things really are in practice.  So how bad was the Fukushima disaster, really?  The tsunami that hit Japan was a humongous disaster, killing well over 15,000 people by itself.  Clearly we can't blame those deaths on the nuclear disaster, though.  Nobody died from radiation poisoning as a result of the Fukushima disaster, but with something like this most of the damage is in the form of people developing and dying from cancer potentially many years later.  A WHO study looked for changes in the incidence of cancer in people affected by the disaster but couldn't find any.  But that merely puts a ceiling on the strength of the effects, since it might be that not enough people have yet gotten cancer from the disaster to be statistically significant in the WHO study.  Another study, by Ten Hoeve and Jacobson, estimated around 120 excess deaths from cancer caused by the Fukushima disaster.

Now, there are also other deaths associated with the disaster besides those directly caused by radiation.  A large number of people were evacuated from the area affected by radiation, and it's likely that the harm caused by the radiation released would have been far worse if the evacuation hadn't taken place, so it makes sense to count the harm inflicted by the evacuation.  Roughly 60 people died from disruption of medical care or from suicide resulting from the evacuation, though it can be a bit difficult to tell given the disruption caused by the tsunami.

So it was a disaster that killed roughly 180 people.  That is certainly more than one would ever want to see, but how does it compare to other forms of energy?  It's almost impossible to imagine a single wind turbine killing 180 people, but you have to have many more wind turbines in operation if you're going to replace a nuclear power plant with them.  If we're worried about how dangerous a source of electricity is, the best way to compare is probably to measure how many people died in the generation of each Gigawatt-Year of power.

Over its lifetime Fukushima generated 877,692 Gigawatt-Hours of electricity, or almost exactly 100 Gigawatt-Years.  So overall it killed 1.8 people for each Gigawatt-Year of power it produced.  How does that compare with other sources of electricity?  Whenever I want to look up statistics on power, the first place I look is Sustainable Energy Without the Hot Air, an excellent book, available free online, about energy and sustainability.  In chapter 24 it has an excellent table of the dangers of various ways of generating electricity.  Or rather, it has numbers from two wide-ranging studies on the topic, which generally produce similar numbers but look at different sets of countries over different time frames.  Sadly for us, the two studies disagree most wildly about how dangerous coal power production is, with the EU estimate being that coal kills 2.8 people per Gigawatt-Year and the Paul Scherrer Institute estimating that it kills only 0.4.
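As a quick sanity check on that arithmetic, here are a few lines of Python using the figures above:

    lifetime_gwh = 877_692              # lifetime generation in Gigawatt-Hours
    hours_per_year = 24 * 365.25
    gw_years = lifetime_gwh / hours_per_year
    print(gw_years)                     # ~100 Gigawatt-Years

    deaths = 120 + 60                   # cancer estimate plus evacuation deaths
    print(deaths / gw_years)            # ~1.8 deaths per Gigawatt-Year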

So in the end, we can't say whether Fukushima killed more people over the course of its lifetime than an equivalent coal power plant; we just know that the numbers are comparable.  Of course, this isn't the same as comparing nuclear to coal.  To do that you would have to include the Chernobyl explosion, which killed many more people than Fukushima, but you would also have to include the hundreds of nuclear plants that complete their lifetimes without any problems.  On average nuclear plants seem to result in 0.1 deaths per Gigawatt-Year, which is much better than forms of fossil fuel that emit a lot of particulates into the air to give people lung cancer, but roughly equivalent to wind power or natural gas.

Update:  In the years since this was published, the estimates of the number of deaths caused by the evacuation listed on Wikipedia have gone way, way up, to around 600, which makes Fukushima clearly worse than the normal operation of a coal plant.  It looks like a Fukushima disaster sans evacuation would have resulted in far fewer deaths overall, but that doesn't really matter since we can probably expect similar responses in the future regardless.  See here.

Sunday, November 3, 2013

Cambridge Council Elections

People tend to spend a lot of time thinking about national politics, and that only makes sense.  We, or people in my social circles, spend most of our time consuming national news, and the New York Times or similar papers aren't going to concern themselves with little old Cambridge.  Even the Boston Globe has barely mentioned the election.  Which makes sense, in a way, since any given local election will only affect a relatively small number of people.

But there are some ways in which the cumulative effect of city council elections from around the country has large impacts on the world.  The most significant of these is probably the set of decisions taken by local governments regarding building density.

People frequently talk about the local effects of decisions by the town or city government regarding new development.  It might increase tax revenue or increase the supply of housing on one hand, but it might overload transportation infrastructure or change the character of neighborhoods on the other. 

What gets left out of these discussions is concern for the interests of potential future residents, and for the environmental consequences of more development.

Compared to the rest of the country, Cambridge has a fairly low unemployment rate and jobs that pay relatively well.  And there's good reason to think that at least some of the extra pay is because workers in cities are genuinely more productive than those in other places, through network effects.  Corporations don't believe in "to each according to their need", and while scarcity of labor can drive up wages for necessary positions, you wouldn't find an entire city of high wages without correspondingly high production.

And while denser cities have a higher environmental cost per square mile than suburbia, the ecological cost per capita is far lower.  People living densely have a shorter distance to travel to work or to shop, and larger buildings are more efficient to heat in the winter.  Someone living in a city will generate about half as many greenhouse gases as someone living in suburbia.  And since nobody is planning any mass euthanasia or involuntary birth control, it's per capita environmental costs that we all have to worry about.

So I hope I can persuade those of you who live in Cambridge to vote for more density this coming Tuesday.  After all, if you're going to be selfish with your voting you might as well just spend your time sleeping in instead; your individual vote doesn't matter that much unless you multiply it by all the people it affects.

A Better Cambridge has a nice guide to development friendly Council candidates. 
