Last year we saw the passing of Dennis Ritchie. I was one of the lucky students who used the K&R book as a textbook in college. Actually it's pretty amazing that the textbooks for CS first years were K&R and The Mythical Man-Month. There was pretty much no excuse for me to F up after being taught from those two books. The untimely passing of Ritchie made me recollect that I never really finished K&R. My professor used many of the chapters as reference material, and of course at some point I figured that I already knew everything so well that I didn't need no stinkin' book. So I decided then that I would re-read the book, this time working through all of its problems.
Of course I didn't start re-reading K&R right then as I should have. But I did get around to it recently. The very first problem in chapter two asked the reader to calculate the minimum values of signed chars, shorts, ints, and longs, and the maximum values of those plus their unsigned cousins. These are all very straightforward things to calculate for anyone with an understanding of modular arithmetic. The book then mentions a more challenging problem: calculating the minimum and maximum floats. No problemo, I thought.
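What I tried was basically this (a minimal sketch of that kind of brute-force attempt; the exact listing may have differed): keep halving a float until the next halving rounds to zero, then print both FLT_MIN from float.h and the computed value.

```c
#include <stdio.h>
#include <float.h>

int main(void)
{
    /* A sketch of the naive approach: keep halving until the next
       halving rounds to zero, and call that the minimum float. */
    float min = 1.0f;
    while (min / 2.0f > 0.0f) {
        min = min / 2.0f;
    }

    printf("%e\n", FLT_MIN);
    printf("%e\n", min);
    return 0;
}
```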
Obviously that doesn't work, or this would be a pretty boring blog post. Here's where the awesome comes in. On my MacBook Air, this prints 1.175494e-38 for FLT_MIN, but the second line prints 1.401298e-45 for the "calculated" minimum float. That's right, it calculates a value that is smaller than FLT_MIN.
Experienced C developers will surely recognize what is going on here, but I was flummoxed. It was only upon looking at some of the other constants defined in float.h that I started to figure out what was going on. In particular I noticed that on my MBA, FLT_MIN * FLT_EPSILON = 1.401298e-45. FLT_EPSILON is "float epsilon" and is defined, roughly, as the smallest x for which 1 + x != 1. This is sort of the relative rounding error of the system, i.e. any single floating point operation can be off by a relative factor of about FLT_EPSILON. The value my loop produced, FLT_MIN * FLT_EPSILON, is in fact the smallest denormalized float; FLT_MIN is only the smallest normalized float, which is why a smaller value can still be represented. I decided to calculate epsilon in my program and then to also use it to calculate FLT_MAX.
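Roughly like this (a minimal sketch, assuming the epsilon is found by repeated halving and the maximum by repeatedly scaling up by 1 + eps until the next step overflows to infinity):

```c
#include <stdio.h>
#include <float.h>
#include <math.h>

int main(void)
{
    /* A sketch of one way to do it, not necessarily the original listing. */

    /* Float epsilon: halve until adding the next halving to 1.0f
       no longer changes it. */
    float eps = 1.0f;
    while (1.0f + eps / 2.0f != 1.0f) {
        eps = eps / 2.0f;
    }

    /* Maximum float: scale up by (1 + eps), i.e. by roughly one ulp at a
       time, until the next step overflows to infinity. This is the slow
       part -- it takes hundreds of millions of iterations. */
    float max = 1.0f;
    while (!isinf(max * (1.0f + eps))) {
        max = max * (1.0f + eps);
    }

    printf("calculated epsilon = %e (FLT_EPSILON = %e)\n", eps, FLT_EPSILON);
    printf("calculated max     = %e (FLT_MAX = %e)\n", max, FLT_MAX);
    return 0;
}
```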
The FLT_MAX calculation takes a long time, since it uses such a small increment. I'm sure a bigger increment could be used to speed it up.
Saturday, March 03, 2012
The JavaScript assembler
Earlier this week, the third part in a series of articles I wrote was published on developerWorks. The series is on CoffeeScript, a language that I first learned of about a year ago. Last summer I read CoffeeScript by Trevor Burnham over the course of a week while on a cruise ship. Later that summer I was at the Strange Loop conference, and serendipitously wound up at a dinner table with several of the guys who develop Clojure. A lot of the conversation that evening was on ClojureScript.
As a long-time web developer I am not surprised by the growing interest in programming languages that compile into JavaScript. The browser has never been more important, and thus JavaScript has never been more important. Most programmers that I know find that JavaScript leaves a lot to be desired. This is not just a matter of taste: JS is fucking weird. In fact for years, a lot of the more "serious" programmers I knew would turn up their noses at JS. They were too good for that trash, so they kept to the back-end. It's become tougher and tougher to avoid JS, so ... use a different programming language that compiles to JS. Brilliant!
This idea is hardly new. Personally, I can't help but think of Google Web Toolkit. For years GWT has let you write in Java and compile to JS. So why bother with CoffeeScript, ClojureScript, and probably some other awesomes that I'm leaving out? One obvious thing is that Java is an existing language and is statically typed. So it has its own baggage (which includes excellent tooling, but still) and of course many developers do not like statically typed languages. In fact, many argue that web development with a statically typed language is an impedance mismatch of the highest order. Personally I think GWT was also hurt by initially defaulting to producing highly obfuscated "production ready" JS. I always thought that was developer unfriendly. I think that may have changed over the years, but sometimes it's hard to overcome an initial impression. I think one of CoffeeScript's great strengths is that it produces very developer-friendly JS by default. I've never built any huge apps in either language, but in my limited experience I found that debugging was easier in CoffeeScript. I could usually figure out what I had done wrong just by looking at the error message.
Any discussion like this would be incomplete without mentioning Dart. It gives you dynamic and static aspects, in a way that reminds me of ActionScript. Personally I think optional typing works best when you use variable:type style declarations, like ActionScript and Scala, but that's a minor nit. Of course when you consider that "native Dart" is supported in a version of the world's most popular browser, the possibilities start to get really interesting. I attended a Dart meetup recently where the Chrome developer tools allowed breakpoints and the like directly in the Dart script that was running. The SourceMap proposal would allow this kind of debugging of other higher level languages like CoffeeScript and ClojureScript.
The future seems bright for languages targeting a JavaScript runtime. I haven't even mentioned node.js and how that makes these languages viable for both client and server applications. To me CoffeeScript, Dart, and ClojureScript are all promising. Each seems to have a large target audience. If you like dynamic languages like Python and Ruby, CoffeeScript fits right in. If you prefer statically typed languages like Java and C++ and all of the nice tools that come with them, then Dart is attractive. If you like functional languages like Haskell and Lisp, then ClojureScript is a fun way to get your lambda calculus kicks in the browser. Of course there is a lot of anti-Lisp bias out there, so maybe there's room for another functional language that targets a JS runtime, something like an F#-script or ScalaScript.