Exclusive Interview, Supercomputing Frontiers 2015

Error-Free Computing: Unums Save Both Real and Virtual Battles

VRW: So then why won’t the IEEE adopt this? What are some of the public reasons they won’t adopt this?

JG: I’ve never actually been in the committee meetings, I’ve only heard about them afterwards. It’s insane what they debate and the conclusions they come to. The debates have so little to do with what is the right answer, and so much to do with ‘well, this is hard for our company, this costs too much money’. They are usually trying to decide between several wrong answers: ‘this is the least bad’, ‘no, this is the least bad’.

If you get something right, you do not need IEEE standards committees. Is there an IEEE standards committee for integers? What would they do, debate whether 2 + 2 = 5? It would be absurd. If I am successful there should be no IEEE standards committee for unums, because there will be nothing to debate.

VRW: It seems that before you convince vendors to integrate Unums, it would make sense to convince universities to include it on their curriculum for first-year computer science students. Will that ever happen? Have you had any luck doing this?

JG: Universities don’t always lead the way. Sometimes they follow. They certainly didn’t come up with Java, and the world adopted Java like a whirlwind. It’s amazing how they latched on to it. Eventually universities switched to making it the language that they taught. So sometimes it happens backwards.

A number of universities want to get a hold of unums and start experimenting with them as soon as possible, but you can’t exactly give people homework assignments with unums unless you give them a little more than what’s out there now.

VRW: If big companies will reject this on the basis that it’s too hard to do, are there smaller, more nimble companies that will pick up the slack?

JG: Absolutely. The first people to adopt it and start putting it into hardware are not going to be Intel, AMD, or even ARM. It will be people with nothing to lose and everything to gain. Of course, if it comes in the form of software libraries that run on x86, then who cares?

If we start using it [in software libraries], then pretty soon someone will come along and say ‘let’s build an accelerator’. That’s a lot faster than the software; then five years later someone comes along and says ‘aw heck, let’s just put this in the processor’. And then everyone wins.

VRW: In your keynote you mentioned that you’ve recently completed porting the Mathematica library to Unum. What’s next?

JG: C. It shouldn’t take more than six months to build a library with at least a low-precision set of unums. You want to do it without extending the language or creating a new language. The simplest thing possible, just to give people a taste of what they can do. It might only go to 10 decimal places, but it will have a huge dynamic range of 10^12000 or something like that. I can do 32 bits of fraction, I can do 16 bits of exponent (which is bigger than quad precision), but it will still track all these things and it will still have the metadata to track ‘how am I doing’. And that all fits in a 64-bit register.

VRW: So for CPU vendors there will no longer be separate integer units alongside the floating-point units. Is that correct?

JG: Oh, there already is. Right now floats can do integers, but they still have a separate unit. I suspect people will still want a separate integer unit. It’s just too efficient. The idea will be a language that says ‘integers = real numbers’. This is a real data type, this is an integer data type. You don’t say whether it’s a two-byte integer or a four-byte integer… it’s just an integer. When you start to overflow the integer, the computer says ‘I think I need to go to a bigger size’. The computer should manage its own size. The computer should have the responsibility of doing that.

VRW: You mentioned the idea of doing a unum FPGA, which would be a good reference template for any CPU vendors. Where are you right now with this? What’s the timeline?

JG: I’ve probably run into a dozen people who say they want to start work on it right away. But then, when I explain to them what’s involved, they wonder if it would be better to get the C library defined first.

VRW: Thanks for your time.


This interview has been edited and condensed.