Wednesday, November 29, 2017

RISC-V is doing well

Back in 2010 some researchers at the University of California, Berkeley started work on an instruction set architecture (ISA) that was going to be both open for anybody to use and built around modern ideas.  All computers run by performing a series of operations, like loading a 16 bit value from memory, adding two 32 bit numbers together and returning the highest possible 32 bit value if the result can't be represented in 32 bits, or taking the cosine of a 32 bit number.  An ISA defines which operations are basic to the computer and which have to be assembled out of other instructions.  It also tells you how you represent your instructions as sequences of 1s and 0s in memory.  And it specifies various other things such as how memory accesses from different cores interact.
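
To make that basic-versus-assembled distinction concrete, here's a saturating 32 bit add written in C.  This is just a sketch I'm adding for illustration: some ISAs (DSP-flavoured ones in particular) provide this as a single instruction, while on a plain RISC-style core the compiler has to build it out of an ordinary add plus a comparison and a branch or conditional select.

```c
#include <stdint.h>

/* Saturating 32 bit unsigned add, as an illustrative sketch.  Some ISAs
 * offer this as one instruction; on a base RISC-V core it compiles to an
 * ordinary add followed by a compare and a branch or conditional move. */
uint32_t sat_add_u32(uint32_t a, uint32_t b)
{
    uint32_t sum = a + b;      /* unsigned add wraps around on overflow */
    if (sum < a)               /* it wrapped, so the real result doesn't fit */
        return UINT32_MAX;     /* clamp to the largest 32 bit value */
    return sum;
}
```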

You might have heard of the great RISC versus CISC wars of the 1980s.  For a long time it was very expensive to move data from memory to the computer core (there was only ever one back then) and back.  Transferring instructions is part of that, so to save on instruction memory designers wanted to use as few instructions as possible and made each instruction as powerful as they could.  And since programming back then was usually done by programmers writing individual instructions rather than using compilers this also meant less work for the programmer.

But as time went on the amount of data that computers worked on grew faster than the number of instructions they used, making code size merely important rather than critical.  Programmers started to use compilers more.  And some researchers at Berkeley and Stanford realized that there were a lot of advantages to using a less complex instruction set.  If you simplified your ISA you would have an easier time doing things like starting one instruction before the previous one had finished because there were fewer complicated interactions.  Fewer instructions meant less work for designers.  And in the early 80s you could fit an entire RISC core on a single silicon chip rather than having to spread it across multiple chips.  That made it cheaper and faster.

A lot has changed since the 80s.  Some aspects of the RISC philosophy have fallen by the wayside but others are embraced by everyone designing a new ISA for general purpose computers.  And RISC-V is, of course, firmly in the RISC camp.

Other people have created ISAs that are open for anybody to use and free of patents but none of them ever really took off.  I'm not familiar with them so I'm not going to speculate on why.  In contrast RISC-V has gotten a lot of people interested.  There are a number of concrete processor designs adhering to the architecture, created at Berkeley and other places, that have also been released for people to use freely, which may be part of it.

When I first heard of these efforts a couple of years ago I was impressed.  Back when I was doing my thesis I could see how an open chip design would have been useful for me to modify and try out my ideas on.  Now that these designs were out there, free to modify and with working compilers and other software to go with them, lots of academics working on processor design were going to have a very powerful tool.  So RISC-V clearly had a bright future in academia.

In the outside world there were certain benefits.  RISC-V makes it very easy to add your own new instructions for any special purpose you might have.  So companies with special purposes in mind would have a reason to look at it.  I wasn't optimistic about a wider impact, though.
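
As a sketch of what that looks like in practice (the operation here is made up, and I'm assuming a GNU toolchain whose assembler understands the .insn directive), you can wrap an instruction from one of the reserved "custom" opcode spaces in a bit of inline assembly and call it like a normal C function:

```c
#include <stdint.h>

/* Hypothetical custom instruction exposed to C.  RISC-V reserves the
 * "custom-0" major opcode (0x0b) for vendor extensions, and the GNU
 * assembler can emit an R-type instruction in that space via .insn.
 * What the instruction actually computes is up to the core designer. */
static inline uint32_t my_custom_op(uint32_t a, uint32_t b)
{
    uint32_t result;
    __asm__ volatile (".insn r 0x0b, 0x0, 0x0, %0, %1, %2"
                      : "=r"(result)
                      : "r"(a), "r"(b));
    return result;
}
```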

Well, it now looks like I was underestimating it.  At the seventh RISC-V Workshop yesterday Western Digital announced that they were moving to RISC-V for the microcontrollers in their hard drives, the ones which tell the drive head where to go, communicate back to the motherboard, etc.  That's potentially billions of RISC-V cores shipped in commercial products every year.

A while ago NVidia also announced that they were looking at RISC-V for microcontrollers orchestrating things in their graphics cards while the GPU cores did the computational heavy lifting.  They mentioned that the ability to add their own extra instructions was a big draw.

So that's some success in embedded microcontrollers.  That makes sense for people who want more customization or who don't want to pay licensing fees to, say, ARM.  A few days ago I certainly wasn't expecting people to seriously consider RISC-V for application cores running all sorts of different programs, such as in a phone or laptop.  If you're receiving applications from third parties they can't make use of any special extra instructions you have, so the RISC-V flexibility isn't a factor.  And nobody has created applications for RISC-V, though you can always compile existing code for it if you have access to the source.

Well, I still think that, but another of the talks at the Workshop was about a fairly hefty 4 core chip that would do pretty well inside a laptop.  I'm not sure anyone is going to put it there but I'm sure people will be using it for servers, where you're running a narrower selection of software.  Support for RISC-V is being added to Linux, though it isn't complete yet.

The whole thing is moving faster outside of academia than I would have expected and I'm interested in seeing what the future brings.

Sunday, November 19, 2017

Expertise, the president versus congress

Since writing a post way back about the way complexity is a problem for Congress I've been happy to discover that the ideas in it aren't at all original and that these are the sort of things people write papers on.  Here's a good article on one recent paper.  I suppose I should have seen that coming.  Possibly I got the idea from somewhere initially and then forgot about reading it.

But anyways, figuring out what you need to know to write legislation is hard.  It would be cool if Congress had a big budget to hire outside experts, but instead they have to make do with listening to what lobbyists tell them and trying to decide which to believe.  Of course there is one part of government that has a huge budget to hire people with specialist knowledge and which has tons of them on staff.  That is, the executive branch.

That's an angle on this whole situation I'd completely overlooked.  A president proposing legislation can use the Department of Education to draft school reform bills, use the EPA and Department of Energy to draft climate legislation, etc.  People talk about the imperial presidency.  I expect that this is a pretty big factor in how we got that.

There's also some cause for hope here.  We got the Congressional Budget Office from Congress wanting to push back against having to rely on the White House when budgeting.  To quote Wikipedia:
Congress wanted to protect its power of the purse from the executive. The CBO was created "within the legislative branch to bolster Congress’s budgetary understanding and ability to act. Lawmakers' aim was both technical and political: Generate a source of budgetary expertise to aid in writing annual budgets and lessen the legislature’s reliance on the president's Office of Management and Budget."

What got me thinking about this power dynamic was watching the recent floundering of Congress on their health care plans and other matters.  Partially this is an issue of leadership, but part of the problem also seems to be that the executive is just not interested in the matter.  Or possibly that with so many political appointments still unfilled he's not able to be.

The limitations of blindsight

Blindsight, made famous in science fiction circles by Peter Watts's book of the same name, is a disorder caused by damage to the primary...