Friday, January 30, 2009

Wall Street Deserves Its Bonuses

All of America is upset about Wall Street's bonuses. Even The President is pissed. Guess what. All of America, including President Obama, is wrong. It's easy to say oh these already-super-rich CEOs are just lining their own pockets with taxpayer money, but that's a knee-jerk reaction with no thought behind it. All of those billions in bonuses did not just go to CEOs. They went to hundreds of thousands of American workers. They went to traders and portfolio managers and secretaries and HR people. It was money that was promised to those workers as part of their compensation. It was money that many probably used to pay for their kids' college tuition or pay off their credit card bill from Christmas or maybe just for the basic staples of life. Maybe they used it to pay their mortgages. If you just think a little, then maybe you won't be so outraged. Maybe you'll realize that it shouldn't even be any of your business.

But of course it is everyone's business because of the government bailouts of the banks. Worse, we have all been brainwashed into blaming things on the greed of Wall Street. That is the great red herring of this depression. Politicians want you to think that all of our problems are the result of the greed of Wall Street. If you have a scapegoat to blame things on, then you do not have to take any responsibility.

So if it's not Wall Street's fault, then whose fault is it? There is an easy and a hard part to that. The easy part is that it is the government's fault. It is the fault of the Federal Reserve for using its unconstitutional power to amplify business cycles. The Fed created a climate of malinvestment by fixing interest rates at unsustainably low levels. And guess what, they continue to do this. The seeds of the next recession are already being sown today.

That's the easy answer. The harder answer is that it's your fault. It's the fault of every American who bought houses they could not afford. It's the fault of every American who constantly refinanced their house to "cash out" their equity. This carpe diem approach has led to historically low savings in America, historically high debt, and historically high rates of loan default and foreclosures. Those are the real reasons why any bank would be a fool to loan out money right now. No matter how much money the government gives them, it will never make any sense to loan it out. The people who would borrow are not likely to be able to pay it back.

Back to reality. Nobody is going to accept personal responsibility. It's the government's job to save us from ourselves, right? I actually thought for a while that maybe people would (rightfully) blame the government, but that has not happened either. The politicians have cleverly manipulated the masses to put the blame on banks and Wall Street greed. Now we are pissed that those banks are paying the salaries of their employees instead of lending money. It's obvious what is next. We take over those banks and force them to lend money to everybody. Then we will get what we deserve.

Wednesday, January 28, 2009

How Google Makes The Net Suck

Some people like to compare developers to artists. When it comes to web development, some people say there's always a man behind the curtain. Whether you agree or not, there are definitely certain freedoms that web developers enjoy. As a web developer, what are the greatest limitations and obstacles in your way? Once it may have been browser quirks. Now maybe it's all those annoying users who still use IE6. However, I think the greatest obstacle to progress is Google.

Now Google would have you believe just the opposite. I do not think they are disingenuous. In a large organization, it's all too easy for different groups to have different motivations. But ask yourself this: how much money does Google make from Chrome? What does Google make money from? That's easy: advertising on search. And that is what is holding us all back.

If you have endured my purple prose to this point, I will finally cut to the chase. One of the most important aspects of any web page is its PageRank. If your web page is all about deep sea diving, where does it surface when somebody searches Google for deep sea diving? The black art of making your page get a higher PageRank has given birth to an entire cottage industry known as Search Engine Optimization (SEO).

As a developer I have never given much thought to SEO. I always thought that SEO was about the content of the page, and web developers are not responsible for the content. We are responsible for retrieving/generating that content from all kinds of sources, as well as creating applications that are easy and intuitive for the user to interact with in a meaningful way. But, if we go back to the deep sea diving example, we're not responsible for providing information about deep sea diving. Heck, you are lucky if most developers even know how to swim, but I digress.

But I was wrong. SEO is not just about content. It is about structure. If you want a good PageRank, then quality content about deep sea diving will lead to other people linking to your page and that will increase your PageRank. But there are much more instantly gratifying things you can do. For example, your page should have a title and it better contain the term deep sea diving. No big deal, right? The title is really just part of the template outside of the main contents of the page. Its value has little effect on anything, besides PageRank that is. However, it gets worse.

To maximize your PageRank, immediately after your page's body tag you should have an H1 tag whose contents contain the term deep sea diving. Oh, maybe you put the phrase on the page, but you put it in a div that is styled quite nicely? Not good enough. It needs to be in an H1 tag. Maybe you used some JavaScript to create the H1 tag? That is no good at all. Why? Because The Almighty Googlebot does not understand how the page looks to a user. It only understands basic HTML constructs. That's right, it's time to party like it's 1999.

Oh, maybe your organization hired an artist who created a killer deep sea diving logo and you load it on to the page as an image? Not good enough. If you put deep sea diving as the alt text, that will win you some bonus points from the Googlebot, but it is still dwarfed by the rewards you could receive by busting out the H1. Nothing compares to the mighty H1 tag. And don't just put that H1 tag anywhere on the page. Heck you might even get penalized for having more than one! Nope only one, and it better appear (in the HTML source code) as close to the body tag as possible.

Ok, so maybe you give in and put the catch phrase in an H1 tag. That wasn't too bad, right? Now back to your regularly scheduled hacking? Not so fast. Do you have some hierarchical information on the page? Sections, headings, menus, etc? How are you going to do those? Again you better not even think about using things like JavaScript to create them dynamically. Nope, they have got to be static on the page. Back to divs, spans (maybe a table or two), along with some oh-so-clever CSS? Forget about it. Let me introduce you to H1's other friends: H2, H3, H4, H5, H6. That's right, if you want that damn Googlebot to "understand" the hierarchy of concepts on your page, then you better put away your divs and spans.

Maybe you think that's going overboard, but it's not. Do you have a section on your deep sea diving page called "Gear"? Then if you want to show up on a search for deep sea diving gear, you better have the term Gear wrapped in an H2 or an H3 tag.
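To make this concrete, here is a minimal sketch of the kind of markup the Googlebot rewards. The page content is hypothetical, but the structure follows the rules above: a descriptive title, a single H1 as close to the body tag as possible, and sections like Gear wrapped in H2s instead of cleverly styled divs.

```html
<!-- Hypothetical deep sea diving page, structured the way the Googlebot likes it -->
<html>
  <head>
    <!-- The title better contain the key phrase -->
    <title>Deep Sea Diving Guide</title>
  </head>
  <body>
    <!-- One H1, and only one, right after the body tag -->
    <h1>Deep Sea Diving</h1>
    <!-- Sections as static H2s, not divs built by JavaScript -->
    <h2>Gear</h2>
    <p>Regulators, wetsuits, dive computers...</p>
  </body>
</html>
```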

What about RIA technologies? Again, if you dynamically create things with JavaScript, it will get picked up, but it is non-optimal. You have to do things the way that the Googlebot wants it done to get best results. What about Flash or Silverlight or JavaFX? Flash will get you screwed on about the same level as JavaScript. Silverlight or Java might as well be black holes. Whatever is in there, is never getting out.

There are tricks you can employ like progressive enhancement. There you do things the way that the Googlebot wants them, then dynamically obliterate that garbage and replace it with rich content that your users actually want. This can backfire. If the Googlebot figures out that you are tricking it, then it will banish you to purgatory.

What if you just make a great web application that users will love and don't bother to worry about the Googlebot? That's fine of course, it just means that people will not find your application by searching for it. Are your business model and marketing efforts robust enough to not need SEO? Yeah, I didn't think so.

So now do you understand? The Googlebot ties your hands, or at the very least makes you jump through all kinds of hoops. There are all these great technologies you could use to make your site as interactive as any desktop application, but The Googlebot does not like this. You've got to play his game whether you like it or not.

Wednesday, January 21, 2009

Scala Duck Typing

Last night I did a talk on Scala with David Pollak and Bill Venners for the San Fran JUG. This was a win-win situation for me. I got to talk about Scala, and I got to learn from David and Bill. One of the highlights for me was Bill's talk about how Scala allowed you to adopt techniques from dynamic languages like JavaScript, Ruby, and Python: monkey patching and duck typing. I knew some of this was possible, but did not understand it well until Bill explained it.

I made it a point to do some more reading on one of Scala's features that enable duck typing: structural types. I found another great explanation of this by Debasish. Good stuff. If you look at the comments to Debasish's post, then you see a lot of debate about how using such a technique can be harmful. I empathize with this sentiment.
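To make the idea concrete, here is a minimal sketch of a structural type in Scala. The class and method names are mine, not from Debasish's post:

```scala
// Needed in modern Scala to acknowledge the reflective dispatch involved
import scala.language.reflectiveCalls

object DuckTypingDemo {
  // A structural type: any object with a quack(): String method qualifies.
  // The check happens at compile time; the call is dispatched via reflection.
  type Quacker = { def quack(): String }

  class Duck   { def quack(): String = "Quack!" }
  class Person { def quack(): String = "I can quack too" }

  // Duck and Person share no common interface, yet both are accepted
  def makeItQuack(q: Quacker): String = q.quack()

  def main(args: Array[String]): Unit = {
    println(makeItQuack(new Duck))
    println(makeItQuack(new Person))
  }
}
```

The reflection-based dispatch is part of why the commenters worry: it has a runtime cost, and it makes the "where does this method come from?" question harder to answer.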

I have worked with dynamic languages, particularly JavaScript and Ruby, where I look at the code and say "how in the hell does this work?" That question does not scale. If I have to give code to a multitude of other developers to work on and they each constantly find themselves asking that question, then I have failed. I don't buy the argument that you can succeed on a large scale with fewer, smarter programmers who are oh-so-clever enough to never have to ask that question.

However, I don't think you necessarily wind up with that question in Scala. Static typing once again comes to the rescue. Between Scala's static typing and desugaring, that question can be answered rather quickly. All you need is proper tooling. Right now the best Scala tools are all plugins for the major Java IDEs. This seems like a reasonable route, but at some point these plugins will need to do some things that you would never see in Java. For example, the debugger needs to be able to "make it obvious" what kind of implicit conversions are going on or how some object complies with a structural pattern. Better yet would be to provide this information at design time. For example, if I typed 1 until 100, I could get immediate insight into the implicit conversion from Int to RichInt.
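The RichInt conversion below is exactly the kind of invisible desugaring I mean; this is a sketch of what the tooling would need to surface:

```scala
object ImplicitDemo {
  // Int has no until method. The compiler implicitly converts 1 to a RichInt
  // (via Predef), and RichInt's until method returns a Range.
  val r = 1 until 100

  def main(args: Array[String]): Unit =
    println(r.length)  // the range 1 through 99 has 99 elements
}
```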

Tuesday, January 20, 2009

An Introduction to Scala

Tonight I am speaking at a San Francisco Java User Group meeting about Scala. Here are the slides of the presentation.

Sunday, January 18, 2009

How To Fix The Economy

I like to fix things. This should be no surprise, as I am an engineer. Of course some engineers prefer to only build new things, and do not like to fix existing things, but not me. Sometimes this works against me. Whenever I encounter a problem, my first reaction is to try to figure out how to fix it. Sometimes that is not appropriate, and sometimes the problem just cannot be fixed. That is a tough thing to admit: that a problem cannot be solved. Even tougher is to spot such problems without first repeatedly failing to solve them.

What does this have to do with the economy? The economy is a problem that cannot be fixed. There is no one thing or series of things that can fix this problem. Giving money to banks has already proved to be ineffective, as has eliminating interest rates. Now some folks want to take even more drastic measures. But it is not going to work.

We like to think that we are so smart that we can understand anything, and thus solve any problem. But we are not that smart, not even close. Some things are simply too complex, and macroeconomics is one of them. If it was possible to understand macroeconomics enough to control it, then the Soviet Union would still be running along smoothly. The Great Depression would have only lasted a few years. You get the idea.

Not convinced? Look at the causes of the Great Depression. The Fed was established by the Federal Reserve Act in 1913. It really got to work after the end of World War I. Starting in 1921, The Fed used a variety of "levers" to increase the total money supply by more than 60%. The Fed made the "boom" part of the business cycle extra "boomier", but the result was an even bigger bust. Banks were incentivized to make malinvestments. When they could not cover the malinvestments that failed, a "run on banks" ensued.

Does this sound familiar? The Fed did the exact same thing in this past decade. There was no "run on banks" this time because of FDIC (more on that in a minute). Instead there was a run on other investment instruments, and the result was equivalent: insolvent financial institutions. Only things are worse this time. Why? FDIC.

Smart people back in the 1930s thought that the Depression was caused by the run on banks, not recognizing that as a symptom of the sickness, not the cause. So they tried to prevent bank runs by creating the FDIC. If the government insures your deposit, then you should not freak out and pull your money out of the bank, right? Of course this creates a moral hazard because it removes some of the risk of investment. So what do we do? Regulate.

What happens? The regulation becomes dated as new types of investments are invented that are not subject to the regulation. One could argue that the reason for these new investments was to avoid paying the tax of regulation and thus give a higher return. However, the moral hazard is even worse. Even though these instruments are not insured by the government, the precedent has been set. Investment banks know that the risk will be absorbed by the government. Meanwhile the Fed once again inflates the money supply, as the government needs a big "boom" to help pay for wars, and like clockwork, we get another dramatic bust.

Do we admit that FDIC didn't work? Nope. Instead we think that the problem was that we didn't regulate those pesky new investment instruments! This is coming from a Nobel Prize winner, so it has to be The Truth, right? While the specter of government force looms as the ultimate "fix."

So am I just proposing that we roll over and do nothing? Well ideally this would be an opportunity to do things that are generally good for the economy: reduce taxes, reduce regulation, increase trade. However when most of the experts propose well-meaning solutions that would often do just the opposite, maybe the status quo is all we can hope for?

Does this mean that banks fail, businesses fail, and people lose their jobs? Yes, it does. Everyone wishes there was some magic button to push that would prevent these awful things from happening, but there is not. None of the dramatic (and unconstitutional) actions of The New Deal succeeded in fixing that mess. Maybe they prevented things from getting worse in some cases, but they also drastically prolonged The Depression and laid some of the seeds for today's problems.

Of course FDR was re-elected three times and that is all that matters to politicians. So get ready for a lot of fixes, and get ready for a long depression. Let's just hope that this economic meltdown doesn't end like the last one.

Friday, January 16, 2009

The Champs and 2010

Just a week ago, the Florida Gators once again won the college football national championship. It was a tense game, much more so than any of their three other championship game appearances. Florida was blown out by Nebraska in 1995, but won easily against FSU in 1996 and Ohio State in 2007. A few things really stood out from the game.

Sam Bradford is the next Gino Torretta. I can't take credit for saying that. One of my friends made that claim while we were eating lunch on game day. Now given this guy is from Texas and is a huge Longhorn fan... But I think he has a point. Maybe saying Bradford is the next Matt Leinart would be just as accurate. He has really benefited from having amazing talent around him. He is not used to facing any pressure from a pass rush and he is used to throwing to receivers who dominate defensive backs. He faced pressure against Florida, but most of all, his receivers could not dominate Florida's DBs. He threw two picks, but neither was a bad throw on his part. However his receivers were far from wide open and Florida's DBs made great plays. Torretta and Leinart suffered from the same "problem" and never adjusted in the NFL.

Urban Meyer is a great coach. The Gators were shaky in the first half. Maybe it was nerves, maybe it was rust, who knows. Their gameplan was sound, but it took two improbable defensive stands to keep the game tied going into halftime. Who knows what happens if the game had been 21-7 in favor of OU at the half? Meyer showed his ability to communicate and motivate his players. They didn't come out in the second half and drastically change their gameplan. But they did execute much better.

Florida is really going to miss Percy Harvin. Does Florida win either of its two recent championships without Percy Harvin? I really don't know how things will turn out for him in the NFL. People talk about how Tebow isn't a traditional QB, but Harvin isn't a traditional WR. He was most productive at running the football in the BCS championship. He clearly can't be a RB in the NFL, but his height will be an issue at WR. A lot of people have described him as a "slot receiver" in the NFL, but do you really want him going over the middle and taking big hits? Anyways, he's always been a ticking bomb in his college career. Every time he touches the ball, there is a big play ready to happen. This year he averaged 1 TD for every 6.5 touches of the ball. He did this playing against the best defenses in the country. The only other recent college player who was as electrifying was Reggie Bush. Like Bush, it seems unlikely that Harvin's pro career can be anywhere close to his college career.

Speaking of players going pro... Obviously I was thrilled that Tebow is staying. I did not think he would leave, though. I was thrilled that Brandon Spikes is staying for his senior year. Honestly I don't know why. Everything I had seen indicated that Spikes would be a first round pick. Given his position (LB), it is hard to imagine him improving his pro stock. Whatever. With Spikes back, Florida is returning all of its defense. That's right, a defense that held the highest scoring team ever (OU) to 14 points will return all of its starters next year. Next year is going to be fun.

Going to other teams, I was absolutely shocked that Sam Bradford is returning. A lot of folks had him as the #1 pick overall and everybody had him as a top 10 lock. So why come back? There's no way their offense can be better than it was this year. A lot of things have to go your way to have a historic season. If he throws in a couple of stinkers next year, then he could easily drop in the 2010 draft. And then there's the "I" word, especially given his lean stature. Similarly I can't believe their TE Gresham is going back either. Maybe these guys think they have something to prove...

One player not going back is Mark Sanchez. I was pretty surprised to see Pete Carroll rip into Sanchez for leaving early for the NFL. You have to wonder if Bradford staying influenced this, as Sanchez may be the second QB taken now behind Georgia's Matthew Stafford. I am very happy to see Stafford go, as I think he will be great in the NFL. Plus A.J. Green is an NFL-caliber receiver, and the prospect of Green and Stafford was just scary. Green had a great year last year as a true freshman.

Thursday, January 15, 2009

Syntax Matters: JavaFX Script

Last night I finished writing an article on JavaFX. It was a lot of fun, and it has convinced me that JavaFX has a strong future. A lot of its promise comes from its syntax. It is easy to get carried away with Rich Media! and Animation! and FPS! but then you forget that JavaFX is a new programming language on the JVM. Here are the highlights of its syntax.

The declarative constructors: When I first started looking at JavaFX, it looked like MXML but in a JSON format. Looks are deceiving. Essentially JavaFX supports named parameters in its constructors. The naming convention is foo : bar, where foo is the name of the parameter and bar is the value. You can put multiple parameters on one line by using a comma to separate, like in most other languages, or you can put each on its own line and eschew the commas. This leads to the JSON-ish syntax. It is probably more accurate to describe it as JavaScript-ish. This becomes even more apparent when you start nesting constructed objects. It really looks a lot like object literals in JavaScript. What is great is that this is not just something you only use for UI elements, like MXML. It is part of the language syntax, so you can use this syntax anywhere you want. Imagine being able to write an MXML expression directly inside a block of ActionScript code...
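A small sketch of what I mean. Point here is a made-up class, not part of any JavaFX API:

```javafx
// A hypothetical class, just to show the syntax works for any object, not just UI
class Point {
    var x: Number;
    var y: Number;
}

// Multiple parameters on one line, separated by commas...
var p = Point { x: 1.0, y: 2.0 };

// ...or one per line with no commas, which gives the JSON-ish look
var q = Point {
    x: 3.0
    y: 4.0
};
```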

Speaking of JavaScript: A lot of the syntax of JavaFX looks like JavaScript's more sophisticated cousin, ActionScript 3.0. You use var to declare local variables. You put the type after the variable name, separated with a colon. You use function to define methods. The major difference is using def to denote a constant. In ActionScript, you use const, like you would in C. This is actually kind of bad in my opinion. The def keyword is used in many other languages, like Ruby and Scala, to denote a function. There is no obvious connection between def and a constant. I think they should have used const instead of def. It would have made more sense to do that and maybe use def instead of function.

Functional Programming: A lot of folks are disappointed that closures are not being added to Java, at least not in Java 7. JavaFX does have closures. You can define a var to have a function type and can specify the signature of that function. The syntax is pretty nice. For example, you could do var join:function(String[], String):String to define a variable called join that is a function that takes in an array of strings and a string as input parameters and returns a string. I would like to see this in ActionScript. JavaFX also has support for list comprehensions. You could do var squares = for (i in [1..100]) { i*i } to get an array of the perfect squares up to 10,000. However, JavaFX does not make as much use of closures. You would think that its sequences would have common functional methods like filter, map, fold, etc. For filter and map, there are alternatives. For example, let's say you wanted the squares of all of the odd integers less than 100. In Scala you would do something like val os = (1 until 100).filter(_ % 2 == 1).map(x => x * x). In JavaFX it would be var os = for (x in [1..100][n | n mod 2 == 1]) { x*x }. It's a case of syntax over API. The second set of square brackets is like a select clause. I want to like it because it uses mathematics-inspired symbols.
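For reference, here is the Scala side of that comparison as a complete snippet. Note that the squaring has to be written as x * x, since ^ is bitwise XOR in Scala, not exponentiation:

```scala
object OddSquares {
  // Squares of the odd integers below 100: 1, 9, 25, 49, 81, ...
  val os = (1 until 100).filter(_ % 2 == 1).map(x => x * x)

  def main(args: Array[String]): Unit =
    println(os.take(5))
}
```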

Other: There are a few other JavaFX syntax bits worth mentioning. First is bind and bound. These are for data binding expressions. These can be very powerful. You can bind variables together, so that one changes when the other changes. Better is that you can bind to functions and list comprehensions. The other interesting syntax in JavaFX involves the keywords delete and insert. These give LINQ-ish syntax for sequences. In fact if you combine the mathematical style select syntax with insert/delete and with the declarative constructors, you get expressiveness that is pretty on-par with LINQ in my opinion. When you see everything working together, it kind of makes sense, but it does seem kind of random at first.
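From memory, the sequence syntax looks roughly like this; treat it as a sketch of JavaFX 1.0 rather than gospel:

```javafx
var nums = [1, 2, 3];

// LINQ-ish sequence manipulation
insert 4 into nums;          // nums is now [1, 2, 3, 4]
insert 0 before nums[0];     // nums is now [0, 1, 2, 3, 4]
delete nums[2];              // removes the element at index 2

// bind: count is automatically recomputed whenever nums changes
def count = bind sizeof nums;
```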

Thursday, January 08, 2009

MacWorld n00b

I've been at MacWorld this week. Actually I'm blogging this from the Microsoft lounge at MacWorld. This is my first MacWorld, and it is definitely a lot different than technical conferences. There are the obvious nice things, like the far better male-to-female ratio. There are some strange things, too. Yesterday I noticed these large, yellow Nikon bags near the entrance. I thought to myself "why in the world do people need these bags?" Then I realized what they really were. They were shopping bags. And a lot of people were walking around with their big yellow Nikon shopping bags full of Mac-related goodness. Shopping does not figure in so prominently at JavaOne...

Tuesday, January 06, 2009

JavaFX Performance

Recently I did a comparison of JavaFX performance vs. Scala. I did this mostly for kicks, and because some people thought that Mr. JavaFX was picking on other, non-Java languages that run on the JVM. James Iry duly pointed out that JavaFX should be benchmarked against the likes of Ruby, Groovy, or Scala. It is meant to be a client-side technology, so it should go up against client-side technologies. So I re-did the little performance comparison to match JavaFX against JavaScript, ActionScript (Flash), and C# (Silverlight).

A comparison like this really becomes a comparison of Virtual Machines. For JavaScript there are a lot of choices. I decided to go with some that are supposed to be fast: Google Chrome, Firefox 3.1 (beta 2), and Safari 4 (developer preview). Because I wanted Chrome involved, I had to go Windows. So I ran everything under Windows XP, running under Parallels on my MacBook. Here is the pretty graph:

I was just a little surprised by these results. JavaFX is indeed the fastest, but just barely. I was somewhat shocked to see Chrome's V8 JS engine right behind. In fact the difference is negligible for small iterations (shown above). At larger iterations, JavaFX maintained a 20-40% margin. As you can see from the graph, Flash and Silverlight were neck-and-neck as well, but both were always about 7-10x slower than Chrome/JavaFX. Safari and Firefox were very underwhelming.

Of course this was just a micro-benchmark. The code just does a series of recursive calls. So what we are really measuring is the ability of the VMs to unwind these recursive calls. It is not surprising that HotSpot handles this easily. Actually, the same code in straight Java is much faster than the JavaFX version. It is surprising to see how well V8 handles this.

Now does the ability to unwind recursion translate into performance that a web application user would notice? Maybe. It certainly points to JavaFX's or V8's ability to make optimizations to application code. It is probably a more meaningful test than some raw number crunching.

Saturday, January 03, 2009

Wildcards

Yesterday I was reading Bill Simmons' predictions for the NFL wildcard games. I was familiar with his Playoff Manifesto, so I thought that he might actually pick the Miami Dolphins over the Baltimore Ravens. After all, Miami has a number of the Manifesto's rules in its favor:

Rule #1: Never back a crappy QB on the road. Maybe crappy is too harsh of a term for Joe Flacco, but he is a rookie with limited upside. And he's on the road against an experienced QB.

Rule #2: When in doubt, seek out the popular opinion and go the other way. Check out the "experts' picks": They all pick Baltimore.

Rule #6: Ignore the final records and concentrate on how the teams finished the last five or six games of the season. Miami won their last five games and nine of their last ten. In fairness, Baltimore finished very strong too.

Rule #8: Beware the road favorite. Baltimore is a road favorite, Miami the home underdog.

Those were all rules definitely in favor of Miami. In fact, none of Simmons' rules would lead one to pick Baltimore over Miami. So Simmons picked Miami, right? Nope. He got on the Ravens bandwagon and picked Baltimore to win by 19. Ouch. Hopefully he'll be kicking himself on Monday!

Friday, January 02, 2009

JavaFX vs. Scala

Chris Oliver posted a nice little performance comparison of JavaFX vs. Groovy and JRuby. He concluded that JavaFX was 25x faster than these other two languages running on the JVM. Of course I had to see how this compared to Scala. First here's the direct translation to Scala I did of the code:

object Tak {
  // The Takeuchi function: heavily recursive, so it mostly measures call overhead
  def tak(x: Int, y: Int, z: Int): Int =
    if (y >= x) z
    else tak(tak(x - 1, y, z), tak(y - 1, z, x), tak(z - 1, x, y))

  def main(args: Array[String]) = {
    // Same workload as Chris Oliver's JavaFX version: 1000 runs of tak(24, 16, 8)
    0 until 1000 foreach ((i) => tak(24, 16, 8))
  }
}

Here are the results for JavaFX on my MacBook:

$ time javafx -server -cp . tak

real 0m12.847s
user 0m11.926s
sys 0m0.338s

So my system is a little slower than Chris Oliver's. That's why I had to run his benchmark on my MacBook first, to make a fair comparison with the Scala version. Here are those results.

$ time scala Tak

real 0m9.690s
user 0m9.122s
sys 0m0.261s

Scala not only beat out JavaFX on my system, but was also faster than JavaFX running on Chris Oliver's faster system. The guys at Sun should have never let Odersky go!