Wednesday, December 31, 2008

Music 2008, Part 2: Fresh Choices

Old farts weren’t the only ones making good music in 2008. The best of the bunch was the debut by Vampire Weekend. This album took a little while to grow on me, but once it did it became a staple. I listen to it in the car, while working out, and while programming. Ten years ago I probably would have called it, not Coldplay, the best of 2008. Vampire Weekend was the best of the “young” artists, but hardly the only noteworthy album.

The Very Good
Attack & Release by The Black Keys is a very good album. These guys are both good songwriters and talented musicians. The album runs a little flat at times, but is thoroughly enjoyable. Definitely best listened to in the car, which for me means blaring loud.
For a more relaxing listen, try For Emma, Forever Ago by Bon Iver. The music is clever and enjoyable. The vocals can take some getting used to, but do not get in the way even if you are not a fan of the folk and falsetto.
Death Cab for Cutie has had a lot of success in recent years, and they only got better in 2008. Narrow Stairs is their best album to date. In fact “Cath...” was one of my favorite songs of 2008, but the album is full of beautiful songs.
The Seldom Seen Kid by Elbow received a lot of hype in 2008, and it deserved it. These guys really know how to create space in their music. “Grounds for Divorce” and “Weather to Fly” were personal favorites, but the whole album received a lot of playtime for me. It is a better headphones album, good for programming.
Back to Black by Amy Winehouse was a 2007 release that was discussed ad nauseam at the time. Personally I did not really get into the album until this year. She really does have a remarkable voice and the album is full of memorable music.

Merely Good
Metro Station’s self-titled album was technically released in 2007. I did not listen to it until 2008. It spawned a number of pop hits, but you should not hold that against them. It is a quality release. These guys definitely know a good beat and a good hook. They remind me a lot of a less polished Garbage.
Another heavily hyped album from across the pond was Fed by Plush. The songwriting and production are excellent, but a little indulgent at times. It did not live up to the hype for me, but it is a good listen nonetheless. The adjective “lush” is often used to describe it, and with good reason.
If Plush puts you to sleep, then throw on Santogold’s debut album. It is a very fun and exciting album. This is probably not the kind of album that you will find yourself going back to in a year or two, but it is a pleasant treat for now. Of course I once thought the same thing about The Gorillaz...
Finally, a year-end retrospective of music these days is obliged to mention the most popular showcase of music: American Idol. This year’s winner, David Cook, was the first “rocker” to win the event. Former finalist Chris Daughtry has had as much success as any winner with the exception of the new Queen of Country, Carrie Underwood, so it was not surprising at all for Cook to win. His self-titled album is par for the course. It is not as good as Daughtry’s debut, but that probably won’t matter. It definitely reproduces the “emo” sound that was Cook’s signature and has quality songwriting. In other words, if you liked Cook on the show, you will like his album. This may not seem like a notable accomplishment, but there have been very few Idol winners about whom you could say that.

Music 2008, Part 1: The Early Bird Special

It has been a remarkable year for new music. Not just because there has been some good new music released this year, but because of who has been releasing it. A lot of old dudes have rolled out of their pillow top beds and put down some good to great music this year: R.E.M., Nine Inch Nails, Metallica, The Cure, AC/DC, Q-Tip, and Guns n’ Roses. All of these guys released some of their best work more than 20 years ago, and they were all back this year releasing good music once again. You could argue that if any of these guys were new bands then what they released this year would have created huge hype. To some degree they are victims of their own success. None of these guys can ever release music that would be judged as being as good as what they did at their peaks.
R.E.M., AC/DC, and Metallica took a similar approach. They all tried to go back to their roots and resurrect the sound of their glory days. They all succeeded, in my opinion. Each group released an album that was very enjoyable for long-time fans. None of them really tried to reach new fans, but why bother? It was an election year, and playing to the base is a tried and true strategy.
NIN and The Cure both took their music in the direction they wanted. In both cases this was not a huge departure from where they had been in the past, but it was definitely distinct. The 21st century version of NIN has proven to have its own distinct sound from the 20th century NIN. Since 2005, NIN has released four albums: With Teeth, Year Zero, Ghosts, and 2008’s The Slip, plus the remix album Y34RZ3R0R3M1X3D. That’s a lot of music, especially compared to ye olde NIN that tended to take many years between releases. With more product, it has been a little more hit and miss for NIN. That is fine by me. The Slip contained several awesome tracks, and several so-so ones.
It is not uncommon to see rockers past the age of 30 continue to release music, even if it is usually not very good. It is more rare to see a hip-hop artist do this. This year saw an excellent album released by A Tribe Called Quest’s Q-Tip. Tribe and De La Soul once provided an excellent alternative to the misogynistic lyrics of “gangsta” rap. Of course today we have dropped the term gangsta, as it is the norm. Nearly every hip-hop artist is bound to make reference to dealing drugs, brandishing firearms, treating women as property, etc. Q-Tip still does things his own way and The Renaissance sounds as fresh in 2008 as Tribe did back in the late 80’s and 90’s. It was easily the best hip-hop album of 2008, far ahead of good but predictable albums by Kanye West, Lil Wayne, or T.I.
And then there was Guns n’ Roses... The long winding road of Chinese Democracy has been well chronicled. When you listen to Chinese Democracy, it is amusing to speculate about the evolution of the songs. However, I think you are best off to consider Chinese Democracy without its history and build-up. If you do that, it is an average album. It has some good moments, but nothing spectacular. It sounds modern, and that is noteworthy given the age of its creator (even more so if you speculate that many of the songs are 10+ years old, but I said I wouldn’t indulge in such speculation...)
The members of Coldplay are not quite the old-timers that the above artists are. Still, they have been around for a while, and like the other artists, they are always compared to what they have done in the past. Some fans may also think that Coldplay can never do anything as good as what they have done before. I definitely belonged to that camp -- I thought that Coldplay would never come up with anything as good as “Yellow”. I was wrong, because Viva la Vida is the best album of Coldplay’s career. It was one of my two favorite albums of the year, and I would give it the very subjective nod as the “best”.

Monday, December 29, 2008

Reality Check: SF 49ers

Yesterday I was driving home from Southern California, after spending the holidays with family. My wife was giving me updates on the Miami-New York game, and I was thrilled that the Dolphins won. Today I am back in the Bay Area and all anybody is talking about is Mike Singletary and the 49ers. In truth this has been going on for the last month, but I can take no more.

Now I can't really blame them. The 49ers seemed awful before Singletary took over, with a 2-5 record. A ten-loss season looked like it was on the way. They finished 5-4 under Singletary, and now have given him a new contract. Many (including myself) were surprised when Singletary was named interim head coach. We weren't surprised that Mike Nolan was fired -- he should have been fired after last season because he behaved childishly and subsequently ruined the career of Alex Smith. The surprise was Singletary being named instead of Mike Martz. After all, Martz had coached both St. Louis (with great success) and Detroit (not so much.) Now with Singletary being given the reins for next year, it is a foregone conclusion that Martz will be run out of town so a new O-coordinator can be brought in to install a "power running game." Everyone in the Bay Area is giddy about all this -- but they won't be so happy 365 days from now. Let's take a look inside the numbers to see why.

Under Nolan, the 49ers were 2-5. As much as I disliked Nolan, that 2-5 mark was mostly a result of their schedule. The 5 losses were against teams with a combined 0.618 winning percentage. Three of the five teams made the playoffs, and the other two just missed. All five of those teams ranked in the top 7 in the league in offense. Not surprisingly, the 49ers surrendered 339 yards per game. Their offense produced nearly 300 yards per game playing against defenses that were on average ranked 17th in the league -- right in the middle.

Under Singletary, the 49ers were 5-4. Again much of this was the product of their schedule. Their opponents had a combined winning percentage of 0.430. None of the teams that they beat made the playoffs. Actually they only played two playoff teams, and they lost to both. They played average offenses (19th ranked on average) and average defenses (18th ranked on average.) They produced 320 yards per game and gave up 315 yards per game. That is about a 20 yard swing on both offense and defense, but this is easily explained by their opponents. They were -6 turnovers under Nolan and -9 under Singletary.

In summary, it is hard to argue that Singletary made much of a difference. If Nolan had stayed on as head coach, the 49ers would have probably had similar "success". This is not a good reason to get excited about Singletary and offer him a multi-year contract. Worse, by making Singletary the permanent head coach, you guarantee that Mike Martz will be gone. Why is that a big deal? Keep reading.

The 49ers had a terrible passing offense last year, but this year they were ranked 13th in the NFL. That's not great, but it's a big improvement. Now maybe that is because of the opponents they played, but most of the really bad teams they played were division foes who were also very bad last year. So maybe some of that improvement is because of Mike Martz. Indeed, Martz's game-planning ability was even more on display late in the season. He arguably exposed huge weaknesses in several defenses: Arizona, Dallas, and especially the New York Jets.

After beating the 49ers in week 10, the Arizona Cardinals went 3-4 the rest of the way. Before that game, they were giving up 23 points per game. Not great, but good enough when you have a top five offense. After that game, they gave up 31 points per game. Martz's offense put up 275 yards passing against the Jets. After that game, the Jets faced the likes of JP Losman, Seneca Wallace, and of course Chad Pennington. The Jets defense was great against Losman, but Wallace and Pennington both posted 100+ QB ratings against the Jets. In the previous 13 games, the Jets allowed a 100+ QB rating only three times (Philip Rivers, Matt Cassel, and Tyler Thigpen.)

You can argue that Singletary deserves credit for benching JT O'Sullivan and starting Shaun Hill. However, most of the credit for the 49ers "turnaround" goes to an improved offense engineered by Mike Martz. Next year, Martz will be gone. Instead we will see an offense that is more similar to what Nolan preferred last year, and we all saw how that turned out.

How 'Bout Them 'Fins!

This is the first of two NFL posts. This is the happy one. My childhood team, the Miami Dolphins, amazingly made the playoffs yesterday by beating the New York Jets. They went from 1-15 last year to 11-5 this year, thus providing hope for Detroit Lions fans everywhere... Admittedly I did not get to watch too much Dolphin football this year (or last year for that matter) because I am in the Bay Area. Based on what I saw and what I have read over the last two years, there are a couple of major reasons for the turnaround.

1.) Health. Last year Miami was plagued with injuries. Ronnie Brown started the year off on fire, making huge plays in both the running and passing game, before a knee injury ended his season. I always joked that Brown would rack up 100 yards on the ground in the first half, while the game was close, and then another 100 yards through the air in the second half when Miami was trying to come back. It's not just Brown. The D-line was healthy all year. The O-line had only one missed game combined. The secondary is obviously the weak link of the defense, but at least there were a lot fewer games missed this year than last.

2.) Parcells and Sparano. The new brain trust certainly re-built the team beautifully. That healthy D-line? All new. The O-line returned several starters from last year, but was certainly improved by drafting Jake Long. Certainly the addition of Chad Pennington helped -- a lot. But don't forget about other subtle improvements like trading for Anthony Fasano.

3.) Brett Favre. I'm not putting him here because of the three picks he threw in the season finale against Miami, but because of his offseason drama. If Favre had never retired, he would have stayed in Green Bay this year. I don't even want to try to speculate how that would have turned out -- Green Bay was bad this year because their defense could not stop the run -- but I know how it would have affected Miami. Favre in Green Bay means no Favre in New York, and that means no Chad Pennington in Miami. Pennington is no messiah, but he was good. He posted a 97.4 QB rating! That was second in the NFL! He was obviously a huge improvement over the mess that Miami had last year. Here are a few more stats for you: Miami was 5th in offense in the AFC, 6th in run offense, 5th in pass offense, and had the fewest turnovers in the AFC. Thanks, Favre!

With all of that being said, I should expect Miami to roll to the Super Bowl, right? Well, maybe not. I do think Miami has a good chance against Baltimore this weekend. Miami was 5th against the run, but only 12th against the pass in the AFC. Baltimore was only 13th in passing, and with Miami at home ... they have a chance! The early line has Baltimore as a 4.5 point favorite, so obviously some smart folks think Miami's fantastic run is about to end. Either way, it's been a great season for Miami.

Monday, December 22, 2008

Fun Things in January

Here are a few things I am attending/involved in that are coming up in January.
  • Macworld 2009, January 5-9, Moscone Center (San Francisco) -- Steve Jobs won't be there, but I will. Coincidence? I'll mostly be bugging iPhone developers. If that's you, then look out.
  • BCS Championship Game, January 8 -- Oh, I wish I could say that I was going to be in Miami for this game. Instead I will just be at my house in San Jose, watching the Gators triumph.
  • Java Community Process 10th Birthday Party, January 13, Computer History Museum (Mountain View) -- Like other Silicon Valley JUG members, I got sent an invite for this little party. Democracy and Java, two killer ingredients for a party...
  • Introduction to Scala for Java Developers, January 20, San Francisco JUG -- I get to talk about Scala along with Lift creator David Pollak and the venerable Bill Venners. Bring your Scala book for Bill to sign it.

Thursday, December 18, 2008

Scala Golf

Recently I pointed out how Scala has Golf built into it, in the form of the _. So of course it is fun to solve a Golf problem using Scala:

val m=scala.collection.mutable.Map.empty[Char,String]
args(1).split('|').map(_.split(',')).foreach((s)=>m+=s(0)(0)->s(1))
println(args(0).map((s)=>m.getOrElse(s,s)).mkString)

Another Golf-friendly feature at work here is the apply method in Scala. It is defined on RichString as an alias for Java's String#charAt method. Given my Golf comment, it is unfortunate that the _ can't be used more in the above problem. Two of the closures needed to refer to the bound variable twice. If you use two _ then the compiler assumes the closure takes two input parameters. It seems like you could infer the number of input parameters and allow more flexible use of _ ... which could make Scala even more expressive and even more confusing!
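
To make that concrete, here is the shape of the issue using the map m from the golf code above (just a sketch):

// Each _ stands for a *separate* parameter, so this declares a
// two-argument closure and the compiler rejects it:
// args(0).map(m.getOrElse(_, _)).mkString
// You have to name the parameter to use it twice:
args(0).map((s) => m.getOrElse(s, s)).mkString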

Tuesday, December 16, 2008

Is Scala too hard or are you just stupid?

In case you missed it, Python's BDFL, a.k.a. Guido van Rossum, filed some major complaints about Scala. After you read that, read a good rebuttal and especially observe the ensuing debate. I especially liked Tony Morris's closing salvo. He brings some validity to the ad hominem response: maybe you don't like Scala because you are too dumb to understand it. Please allow me to commit the most common fallacy of human thought and attempt to answer this question by only examining my own experience. In other words, if Scala seems too hard to me, then it is too hard. Period. If it seems simple enough to me, then anybody who finds it hard is just stupid.

Since I am going to make a statement of fact based entirely on my own experiences, maybe a little background would be relevant. I started programming 27 years ago when I was 7 years old. I programmed in BASIC, first on an Apple II at school and then on my very own Commodore 64. My first "real programming" was in Pascal when taking Advanced Placement Computer Science in high school. I probably still think in Pascal and translate everything into it first. I also learned FORTRAN and C while in high school.

In college, I learned a lot more C. For a while I double-majored in math and computer science. Then I took CS 10, the weed-out course for CS majors. For all of our programs in that class, we started with a predicate calculus definition of the problem, and then we had to logically derive the solution. We had a tool (written, I think, by the prof or somebody else in the department) that would transform the problem statement into a computer program. But not just any program, a Lisp program. I hated everything about this, including Lisp. That class successfully weeded me out, and I stuck to math.

In college I also learned Perl (for an economics class) and C++ (for a summer math research project.) After college, I programmed mostly in Java, with some C++, Perl, Ruby, and C# sprinkled in along the way. So in summary, I've programmed in a lot of languages, but most of them are from the imperative-cum-OOP tree with C-based syntax. I had one major experience with a functional language, and hated it so much that I changed my major. So now on to my observations and emotions about Scala.

1.) Type inference sucks, but you (Java developers) gotta love it. I love having a statically typed language. I am definitely in that camp. However, I love not having to declare types very often. How often do you get to have your cake and eat it too? But this is, in my experience, the major source of pain in Scala. You are at the mercy of the cleverness of the compiler. In Java you are often at the mercy of the cleverness of the JVM, but it takes some esoteric Java (the kind that you only see in job interview problems or in books by Josh Bloch and Neal Gafter) to produce code that looks like it should compile but will not. In Scala this is all too common. Here is an example.


// this compiles
object ItemMetaData extends Item with KeyedMetaMapper[Long, Item] {
  override def dbTableName = "items"
  // here is the problem line
  override def fieldOrder = List(name, description, reserve, expiration)
}

class Item extends KeyedMapper[Long, Item] {
  def getSingleton = ItemMetaData
  def primaryKeyField = id

  object id extends MappedLongIndex(this)
  object reserve extends MappedInt(this)
  object name extends MappedString(this, 100)
  object description extends MappedText(this)
  object expiration extends MappedDateTime(this)
}

// but this does not
object ItemMetaData extends Item with KeyedMetaMapper[Long, Item] {
  override def dbTableName = "items"
  // here is the problem line
  override def fieldOrder = name :: description :: reserve :: expiration :: Nil
}

class Item extends KeyedMapper[Long, Item] {
  def getSingleton = ItemMetaData
  def primaryKeyField = id

  object id extends MappedLongIndex(this)
  object reserve extends MappedInt(this)
  object name extends MappedString(this, 100)
  object description extends MappedText(this)
  object expiration extends MappedDateTime(this)
}

The two list expressions are equivalent, and yet one compiles and the other does not. Why? Is it a flaw in the compiler? Is it bad syntax? Is it a flaw in the language implementation (i.e. the List code and the :: code)?

2.) Having no operators means that operators are everywhere! You can make the case that Scala's syntax is simpler than that of Java, C++, or Ruby. Why? Because it has no operators. There is no operator overloading, because there are no operators. The flip side of this is that it is easy to simulate operators, and everybody can do it. This is great for the wanna-be language designers in all of us, but it can really suck. Why? It's like you are constantly encountering new operators. It can make Scala's syntax feel infinite in size, even though it is actually quite small and simple.
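
For instance, here is a minimal sketch (Money is made up for illustration) of how any method with a symbolic name acts like an operator:

// Money is a made-up class, purely for illustration.
class Money(val cents: Int) {
  // + is just an ordinary method name; m1 + m2 is sugar for m1.+(m2)
  def +(other: Money) = new Money(cents + other.cents)
}

val total = new Money(150) + new Money(250)
println(total.cents) // 400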

3.) Golf is built in and is known as _. Scala uses the underscore all over the place to allow for shorthand. Once you have become reasonably competent at Scala, you begin to like this. Until then, you hate it. In other words, it steepens the learning curve. It is a feature for power users, but it is prevalent. Thus you either become a power user, or you quit (insert snide comment about Guido or Cedric here.)
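
A quick sketch of a few of the shorthands in question:

val nums = List(1, 2, 3)
nums.map(_ * 2)        // shorthand for (x) => x * 2
nums.filter(_ > 1)     // shorthand for (x) => x > 1
nums.reduceLeft(_ + _) // two _ means two parameters: (a, b) => a + b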

4.) Expressive is as expressive does. A lot of folks have thrown the term "readability" around. There are two aspects of readability. When you read code, do you know what it does? When you read code, do you know how it does it? The more expressive a language is, the easier it is for it to satisfy the first question. However, it may be harder for it to satisfy the second question. One could argue that if a language is better at both of these things than an alternative language, then you would probably rather program in that language. It is certainly easier to tell what idiomatic Scala will do than Java, for example. However, it is often harder to understand how it does it. You can substitute Python for Scala in the above two sentences and it is still true. But could you substitute Python for Java in those sentences? Probably not.

5.) Scala is not for creationists. Ok, maybe that's a bad joke by an atheistic evolutionist. What I mean is that object creation can be confusing in Scala. Constructors are not obvious at first. Case classes add confusion, and then you throw in companion objects and you might start screaming "Make it stop!" Oh, do I complain too much? Well, in most of the other "mainstream" languages you need to learn exactly one thing to figure out object construction. Sure there may be other patterns (factories, singletons, builders, etc.) for augmenting this, but there is only one "natural" way to instantiate a class. In Scala you see standard stuff like val foo = new Foo("2ez") but also val foo = Foo("wtf"). The latter could be a case class, or it could be a plain class with a companion object factory -- you cannot tell from the call site which mechanism is in play.
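
Here is a minimal sketch of the ambiguity (Foo and Bar are made up):

// A case class: the compiler generates a factory, so new is optional.
case class Foo(s: String)
val f1 = Foo("2ez")     // calls the generated factory
val f2 = new Foo("2ez") // also compiles

// A plain class with a hand-written companion object:
class Bar(val s: String)
object Bar { def apply(s: String) = new Bar(s) }
val b1 = Bar("wtf")     // calls the hand-written Bar.apply
val b2 = new Bar("wtf") // also compiles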

So what is the conclusion? Well, for all of my struggles, I have managed to find my way around Scala. So there, it must not be too hard. I am no smarter than any other programmer out there, so if I can learn it, they can too. Plus, my job did not even depend on it. Usually when a programmer has to learn a language, their job depends on it. A motivated programmer can definitely learn Scala in no time!

Saturday, December 06, 2008

Gameday

There are two games from this season that provide insight into today's Florida vs. Alabama game. First is Florida vs. Georgia. The Bulldogs have an offense with some significant similarities to Alabama's offense. They have a great, power running back (Knowshon Moreno) and a big play wide receiver (A.J. Green). Florida's strength is its pass defense, and they are good at both man-to-man and zone. They played a lot of man-to-man against Georgia to limit Moreno (17 carries, 65 yards.) Georgia was productive through the air (292 yards) but also produced three interceptions. Florida will try this same formula against Alabama.
Alabama’s Glen Coffee might actually be a more explosive runner than Moreno, and their stellar freshman receiver, Julio Jones, will definitely be playing on Sundays. However, I don’t think Alabama has the confidence in John Parker Wilson to let him attack Florida’s man-to-man play on the WRs. If they can make some big plays in the air early in the game, then they can win. If not, then it is hard to see Bama putting up enough points or controlling the clock enough.

The other important game is Alabama vs. LSU. LSU's running backs, Keiland Williams and Charles Scott, combined for 180 yards and 2 TDs on the ground against the vaunted Alabama defense. Alabama must do better against Florida's running backs. Look for Florida to use Rainey and/or Demps out of the ever-popular Wildcat formation. They have been experimenting with Percy Harvin out of that formation, but Harvin is out today. Everything has been pretty vanilla out of it, but you gotta figure Urban Meyer would save the more creative Wildcat plays for a game like this. Florida's rush attack is very similar to LSU's, so Alabama has to do better against it. They can't count on turnovers like they got against LSU. The thing that is going to be tough for them is that this game is on artificial turf, giving a boost to Florida's speedsters.

Friday, December 05, 2008

JavaFX: Some Frustrations...

Yesterday was the much ballyhooed release of JavaFX. I was excited about this for a few reasons. First, I started following JavaFX back when it was called F3. I remember last year planning on attending Chris Oliver's session at JavaOne on F3, when I read on his blog that F3 was now being called JavaFX. Second, a big part of my job is staying on top of RIA technologies, and JavaFX certainly falls in that category. Lastly, it's a new programming language that runs on the JVM! How could I not want to learn it?

So I started playing around with JavaFX. Instead of going after lots of graphical goodness, I went for more mathematical stuff. I did a few problems with JavaFX, and then came across a problem that required finding the prime factors of a large integer. JavaFX has an Integer type that maps to java.lang.Integer. There is nothing in JavaFX specifically for large integers, but part of the beauty of JavaFX is that you can use classes from Java in it. In particular you can use java.math.BigInteger. So far so good.

In my solution, I wrote a prime sieve and then checked the modulus of the primes against the large integer. Now for normal 32-bit Integers, JavaFX has some nice syntax:

var remainder = n mod p;

It looks like math! Of course I love this. However, this does not work for BigIntegers. No problem, BigInteger has its own method for this:

var remainder = n.mod(p);

But this does not compile! Why? As you can infer from the first example, mod is a reserved word in JavaFX, so you can't use it as an identifier. Thus you can't use it as a method name. Of course you could surely use reflection to get around this, but who wants to do that?

Sunday, November 30, 2008

Gator Talk

Yesterday I went to the Florida vs. FSU football game. I haven't had much recent, personal college football history...

The last college football game I had been to was ... a Bakersfield College game earlier this year. That's not exactly Division 1. The last D1 game I had gone to was Tennessee vs. UCLA in 1997.

The last Florida football game that I attended was 1995, vs. Tennessee.

The last Florida vs. FSU game that I attended was ... 1986!

Needless to say, Saturday's game was a sweet return. I went with my brother, and we managed to stay relatively dry despite the dreary weather. Some folks thought that the Gators played a little sloppy, but I thought that the game, especially the first half, was about as good as could be ... if it wasn't for the kickoff team. FSU scored 9 points without any offense, courtesy of field position. Two FGs were set up by long kickoff returns, and another by a fumble (only one fumble, given the conditions, is not too bad.)

Anyways, all attention shifts to the SEC Championship Game against Alabama. The game is a de facto playoff game: whoever wins will be playing for the BCS Championship. Injuries are playing a big part in this game. Percy Harvin hurt his ankle last night against FSU, and that is a huge blow. The Gators have a lot of dangerous players, but Harvin is the most dangerous. You have got to assume that either he will not go, or (worse) will play, but play ineffectively. Advantage Alabama.

The Gators still have their three blazing fast running backs, Rainey, Demps, and Moody. They will have to get more out of these guys. However, this game will be on artificial turf in the Georgia Dome. The fast track definitely favors Florida.

Florida will also get Lawrence Marsh back. That should help, as they looked a little vulnerable up the middle at times against FSU. If the Gators are stout up the middle (as they have been all year), it will be very tough for Alabama to score much. Oh, what I wouldn't give to go to that game!

Monday, November 24, 2008

Delta Airlines Entertainment

A few days ago I flew from San Jose back to my hometown, Panama City, Florida. To be accurate, I first flew from San Jose to Atlanta, and then from Atlanta to Panama City. On the flight from San Jose, the 757 was equipped with an onboard entertainment system. There was a free part and a for-pay part. The main component of the free part was satellite TV courtesy of Dish Network. The other part was a decent selection of free music to listen to. The for-pay part consisted of movies, HBO programming, and a few games.

The Dish Network TV was a major disappointment. It did not work at first. Then it started to work, but would consistently freeze up. This would cause a reboot of the system. The only amusing part of this was that it was a Linux system and I got to repeatedly watch the Red Hat reboot sequence. I was never so annoyed to see that freakin' penguin... I've flown JetBlue on several occasions and always enjoyed DirecTV on it with no problems. Actually the first time I flew JetBlue was the night that the Iraq War began (Shock and Awe! Go America!!) Talk about a surreal experience: watching live coverage of the beginning of the war while on a jet (flying to Vegas for the beginning of March Madness...) I digress.

The free music was not so bad actually. I was lending my MacBook to Michael, Jr. so he could watch some DVDs, which meant I would be listening to my Nano. I wanted to listen to Coldplay's latest, so I was distressed when I realized that I had loaded up all of Coldplay's music except Viva la Vida. However, I noticed that Delta had that album available, so I listened to it. The sound quality was even worse than the Nano's average, but I still appreciated it.

I didn't try out the for-pay entertainment. Michael, Jr. finished watching the Leap Frog DVDs that he wanted to see, and wanted to draw. So I got the MacBook back and reviewed some chapters from a couple of books-in-progress. I did notice that Delta will begin offering Wi-Fi on flights for $10-13 per flight, depending on the length of the flight.

Wednesday, November 19, 2008

Gumbo and Data

This week I was at Adobe MAX. While there I picked up a DVD with a preview of Flex Builder 4 and Flash Catalyst (Thermo.) One thing of particular interest to me was the new data-centric abilities of Flex Builder. Let's take a look at them.

Recently I finished an article for IBM on using Grails and Flex together. In the article I created a Digg-clone and used things like the Flex DataGrid and Forms to browse and submit stories. I decided to see how well the new Flex Builder tools could create the same code for me. So I took the existing service and imported it into Flex Builder.

The screenshots give you an idea of the process. As you can see from the final screenshot, there is a lot of code generated! A service proxy and return types are both generated. Actually, for each there is a base class with all of the guts and an empty class built for you to customize. There is a modeling class that contains metadata about the return type. The base service class uses a new Flex class, HTTPMultiService. It is a lot of code, a lot more than I wrote. Most of it seems reasonable though.

You can also generate a data grid for displaying the data. This worked perfectly for me. Gumbo also promises that in the future you will be able to generate other components, in particular a Form.

Friday, November 14, 2008

Facebook Puzzler in Scala

An associate of mine pointed out some fun programming puzzles on Facebook. This one was interesting to me, because I think it is poorly worded. Well, at the very least it confused me on first read. When specifying the cost/weight, if they had just used the word 'each' it would have been far less confusing... Anyways, I decided to solve the problem in Scala, since I've been doing a lot more Scala programming lately. Here is my Solver:

object Solver {
  // val (not def) so the map is created once and actually memoizes;
  // with def, a fresh empty map would be built on every access.
  val cache = new scala.collection.mutable.HashMap[(List[Int], Int), List[List[Int]]]()

  /**
   * Generates a list of solutions to the Diophantine inequality
   * w1*x1 + w2*x2 + ... wN*xN >= max
   * where weights = (w1, w2, ... wN)
   * Each solution is a minimal solution.
   * This means that if (x1, x2, ... xN) is a solution
   * then (x1, x2, ... , -1 + xM, ... xN) is NOT a solution
   */
  def solve(weights: List[Int], max: Int): List[List[Int]] = {
    if (cache.contains((weights, max))) {
      return cache((weights, max))
    }
    if (weights.length == 1) {
      return List(List(max / weights(0) + 1))
    }
    var all: List[List[Int]] = Nil
    var a = 0
    while (a * weights(0) < max) {
      all = all ++ solve(weights.drop(1), max - a * weights(0)).map(a :: _)
      a += 1
    }
    val solution = (a :: weights.drop(1).map(_ * 0)) :: all
    cache.put((weights, max), solution)
    solution
  }

  /**
   * For a given set of weights (w1, w2, ... wN) and costs (c1, c2, ... cN)
   * this finds the solution (x1, x2, ... xN) to the inequality
   * w1*x1 + w2*x2 + ... wN*xN >= max that minimizes the total cost
   * c1*x1 + c2*x2 + ... cN*xN
   * It returns the solution as a Tuple where the first element is
   * the solution (x1, x2, ... xN) and the second is the minimal total cost
   */
  def optimizer(costs: List[Int], weights: List[Int], max: Int): (List[Int], Int) = {
    val solutions = solve(weights, max)
    var answer: List[Int] = Nil
    var best = (answer, Integer.MAX_VALUE)
    solutions.foreach((solution) => {
      val cost = solution.zip(costs).foldLeft(0){ (a, b) => a + b._1 * b._2 }
      if (cost < best._2) {
        best = (solution, cost)
      }
    })
    best
  }
}
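
For illustration, a hypothetical usage sketch (these weights, costs, and max are made up, not the puzzle's sample input):

// Made-up example: items weigh 3 and 5 units, cost 4 and 8 each,
// and we need to cover at least 13 units of weight.
val (counts, totalCost) = Solver.optimizer(List(4, 8), List(3, 5), 13)
println(counts + " at a total cost of " + totalCost)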

I put in the cache for memoization purposes, but it doesn't always help. For example, with their sample input/output, the cache is useless. Anyways, I showed the problem to other smarter folks who immediately pointed out that this was a variant of the unbounded knapsack problem and that my solution uses dynamic programming.

Here's a dirty little secret about yours truly. I studied mathematics in college, not computer science. So it's always amusing for me to come across a problem like this and have people start talking in CS terms. Personally I looked at it as a Diophantine equation (inequality, to be more accurate.) Of course maybe if I were a CS person, then I would have written a nicer solution.

Tuesday, November 11, 2008

Scala Constructors

Tonight at BASE, I had a rant about Scala constructors. So I'll just continue the rant here. Constructors seem great in Scala. At first. They give you some great syntactic sugar that creates accessors/mutators all in one shot:

class Stock(val name:String, val symbol:String, var price:Double, var change:Double){
}

This lets you do nice things like:

val stock = new Stock("Apple Computers", "AAPL", 94.77, -1.11)
println(stock.symbol) // works great
stock.price = 95 // works, price is a var
stock.symbol = "APPL" // won't compile, symbol is a val

Yay, no getter/setter garbage. But what about overloaded constructors? You can kind of do that...

class Stock(val name: String, val symbol: String, var price: Double, var change: Double) {
  def this(name: String, symbol: String) = this(name, symbol, 0.0, 0.0)
}

So in Scala you can implement the telescoping constructor anti-pattern. Nice. But what if you get your stock data as a CSV from Yahoo's web service? You need to do some parsing. You might think this will work:

class Stock(name: String, symbol: String, var price: Double, var change: Double) {
  def this(name: String, symbol: String) = this(name, symbol, 0.0, 0.0)
  def this(csv: String) = {
    val params = csv.split(",")
    name = params(0)
    symbol = params(1)
    price = java.lang.Double.parseDouble(params(2))
    change = java.lang.Double.parseDouble(params(3))
  }
}

Nope, this won't work. You can only have a single statement in a 'this' constructor, and it must be a call to either the main constructor or another 'this' constructor. No extra code. Bill Venners pointed out that this often leads to code like the following:

case class Stock(val name: String, val symbol: String, var price: Double, var change: Double) {
  def this(name: String, symbol: String) = this(name, symbol, 0.0, 0.0)
  def this(ser: String) = this(parseName(ser), parseSymbol(ser), parsePrice(ser), parseChange(ser))

  def parseName(ser: String) = ser.split(",")(0)
  def parseSymbol(ser: String) = ser.split(",")(1)
  def parsePrice(ser: String) = java.lang.Double.parseDouble(ser.split(",")(2))
  def parseChange(ser: String) = java.lang.Double.parseDouble(ser.split(",")(3))
}

Oy. I think even the most enthusiastic Scala programmer would agree that is some very smelly code (and inefficient to boot.) A more common pattern is to use a factory object:

object Stock {
  def apply(ser: String): Stock = {
    val params = ser.split(",")
    new Stock(params(0), params(1), java.lang.Double.parseDouble(params(2)), java.lang.Double.parseDouble(params(3)))
  }
}
class Stock(val name: String, val symbol: String, var price: Double, var change: Double) {
  def this(name: String, symbol: String) = this(name, symbol, 0.0, 0.0)
}

Having a singleton object and a class by the same name is a construct introduced in Scala 2.7. Now usage looks like this:

val apple = new Stock("Apple Computers", "AAPL", 94.77, -1.11)
val microsoft = Stock("Microsoft,MSFT,21.20,-0.10")

Kind of inconsistent, no? In one place you use the new, but to get the benefit of the factory, you can't use new. So usually people change the class to a case class:

object Stock {
  def apply(ser: String): Stock = {
    val params = ser.split(",")
    new Stock(params(0), params(1), java.lang.Double.parseDouble(params(2)), java.lang.Double.parseDouble(params(3)))
  }
}
case class Stock(name: String, symbol: String, var price: Double, var change: Double) {
  def this(name: String, symbol: String) = this(name, symbol, 0.0, 0.0)
}

Now usage is more uniform:

val apple = Stock("Apple Computers", "AAPL", 94.77, -1.11)
val microsoft = Stock("Microsoft,MSFT,21.20,-0.10")
val test = Stock("Test Stock", "TEST")

I guess that is ok. Because Stock is now a case class, you don't have to declare name and symbol as public vals. I kind of like using 'new', and I really don't like having to create both an object and a class just to get overloaded constructors. I think it is still a code smell.

Update: In the comments it was pointed out that the companion object pattern (object and class of the same name) has been around for a relatively long time. What was introduced in Scala 2.7 was allowing case classes to have companion objects. So the last version of the code will not compile on anything but Scala 2.7+, but if you make the Stock class a normal class then it will. Of course then you are back to the problem of having two different syntaxes for the constructor, one that needs the 'new' keyword and one that does not (and cannot.)

Friday, November 07, 2008

Pastrami Dog?

I was driving home tonight and noticed a local Wienerschnitzel advertising a pastrami ... thing ... on a (hot dog) bun. This reminded me of exactly one thing: being a kid, in high school or college, and scrounging through the kitchen looking for something to eat. All you find is some sandwich meat and some hot dog buns. There is no bread to make a proper sandwich, so what do you do? You use the hot dog buns. Now Wienerschnitzel has taken this masterpiece and turned it into a product.

Wednesday, November 05, 2008

You Want XUL, You Got XUL

I've written a lot of articles for IBM over the last couple of years. I've covered a lot of topics. It was interesting for me to see which articles have been viewed the most. Last year I did a tutorial on XUL for beginners. To be honest, I really thought XUL was a very niche topic. If you are going to write a Firefox extension, you have to learn XUL. There are lots of resources out there about that. I did not want to write about that kind of XUL development. So instead I wrote about creating a XUL desktop application that used a lot of web development skills. Hey if Adobe can market AIR as a way for web developers to create desktop apps, then why shouldn't Mozilla do the same thing with XUL? It was a fun article to write, but I didn't expect it to be especially popular. Boy was I wrong!

That tutorial has been one of the most popular things I have written for IBM. So it made sense to update it this year. The tutorial used Firefox 3 as a XUL runtime. When I wrote it, Firefox 3 was in alpha stage. Obviously it has been released since then and is in wide use. So the opportunity for web developers to use XUL to create desktop applications is greater than ever. This week IBM published the updated tutorial, so go check it out.

Tuesday, November 04, 2008

Slides from AjaxWorld

Posted by somebody else! Thanks!
[SlideShare embed: "Netapp" by Michael Galpin, tagged ajaxworld-2008]

Saturday, November 01, 2008

Vote by Mail

This year I voted by mail. It was very convenient. The state of California even sent me a nice little "I voted" sticker. So how did I vote? Most things will probably not be a surprise if you read my blog on a regular basis. There were a lot of things to vote on, so here is a quick summary.

President -- Barack Obama. I am no fan of his proposed economic policies, but the war remains the most important issue to me. Besides, most of McCain's proposed policies are so similar to Obama's. It is very sad.

Congress -- I voted against my congressman, Mike Honda. I wrote him about the bank bailout, but he still voted for it. Gotta vote against him for that.

Propositions -- These are the fun ones here in California. The most debated is Proposition 8. This was a no-brainer to me: I voted against it. Terry put together a nice post about Prop 8. I was surprised to find myself voting for Prop 2...

Wednesday, October 29, 2008

EclipseWorld

This week I have been speaking at EclipseWorld in Reston, VA. I have been talking about Ruby on Rails mostly, plus a little session on iPhone (web) development. Most of the developers here are Java developers, so they look at Rails as a way to make their jobs easier. It's a grim economy, and they are being given the classic task of doing more with less. They look at Ruby on Rails as a way to do just that. More features. Less code. Happier developers!

EclipseWorld has been a great conference to speak at. What has been very cool for me is interacting with developers outside of Silicon Valley. Now don't get me wrong, if you are a developer, especially a web developer, then Silicon Valley is the place you want to be. I would compare it to working on Wall Street if you are in finance, or working in Hollywood if you are in show business. It's not for everybody, but it presents the chance to prove that you've got what it takes, and, if you are lucky, a chance to make a lot of money.

However, in the Valley it is easy to forget that most web development is not about creating The Next Big Thing. In the Valley, @tychay will rip you up for using Rails because it fails at web scale. On the east coast, I have met a lot of developers creating internal web applications, or maybe customer service applications. These are applications that can run just fine on a single multi-core box, even with the database running on the same machine. They aren't stressing out over page weight, database partitioning, or terabytes of cache. They are creating sophisticated applications, and always have way more feature requests than they have time.

These are the people who are most empowered by tools, especially Eclipse. They don't have some huge team with specialists for doing the CSS or tuning the database. They do it all themselves, and Eclipse makes that a whole lot easier. I've written a lot about things you can do with Eclipse, but this experience has really put things into better perspective.

Wednesday, October 22, 2008

AjaxWorld 2008

AjaxWorld was this week, and it was interesting. I think the down economy is having an effect on everyone, but there were still a lot of interesting things to learn about. On Monday, I did a talk on a favorite topic of mine, networked applications. The talk was a lot of fun; hopefully the audience would agree with that assessment. Overall though, I would say there were a couple of major themes at AjaxWorld this year.

1.) Comet. There were a lot of talks about some form of data push from the server to the browser. Kevin Nilson did a nice job of differentiating Ajax (infinite loop of XHR polls) vs. Comet (long poll.) The folks at ICEFaces have built some nice abstractions on top of Comet. There was also a lot of interest around WebSockets, especially the work by the folks at Kaazing. A duplexed socket connection to the server sounds great on paper. I think there will be some very interesting technologies that grow around that.
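
The distinction is easy to sketch (fetch and handle are hypothetical stand-ins here, not any particular library's API):

// fetch is a hypothetical blocking HTTP GET; handle processes a response.
def fetch(url: String): String = "update" // imagine a real request here
def handle(data: String) = println(data)

// Ajax-style polling: ask on a timer, whether or not anything changed.
for (i <- 1 to 3) { handle(fetch("/updates")); Thread.sleep(5000) }

// Comet long poll: the server holds each request open until it has data,
// so the client simply re-issues the request as soon as one returns.
for (i <- 1 to 3) { handle(fetch("/updates?longpoll=true")) }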

2.) Don't make me learn JavaScript! There seemed to be a lot of folks advocating the "only know one language" approach to the web. In most cases that language was NOT JavaScript (even though the Jaxer guys say it can be.) Vendors like Oracle and ICEFaces preached abstractions that shielded the developer from JavaScript. Of course the GWT folks say do everything in Java. Microsoft says use Silverlight so you can do everything in C#. Of course one of the best sessions was by Doug Crockford, who told everyone to man up and learn JavaScript. I tend to agree with Crockford, even though I prefer ActionScript...

Wednesday, October 15, 2008

Recession and The Valley

Are you a programmer who recently moved to Silicon Valley? Are you nervous about what things will be like now that the greater economy has gone pear-shaped? Then listen to this old man tell you all about his experiences in the last recession in the Valley.

I moved to the Bay Area in 2000, right as the last great recession was getting started. Like this recession, it happened in an election year and it really hurt the incumbent party. I started off working at a very small start-up in San Francisco called InternetElements. They had a cool idea: let any bank or other large organization provide its members with the tools to buy and sell stocks. Remember that back then the stock market was really hot, much hotter than it has been in the last few years. New companies sprung up all the time and went IPO. Everybody bought into the IPO and made crazy money. It was great.

Anyways, once the stock market started tanking and IPOs disappeared, nobody had much stomach for InternetElements' idea. I was employee #4 there. Our CEO basically told us that we had cash in the bank to make payroll up until a certain date. This was about a month before that date. There were talks going on to get us either some more money or do a merger with another company that had more money, but neither of those worked out.

I started looking for a new job. This was fall of 2000, so things weren't too bad just yet. I didn't have much experience, but I had a degree from a (among the tech world) well known school. So that opened doors for me. I found a new job two weeks before the old one was set to die, and started at the new one on the Monday after InternetElements stopped promising to make payroll. The two founders of the company continued on trying to salvage their company, but there were no hard feelings at all.

My next company was called RMX. I would work there for the next two years. It was also a start-up, but was sort of a spin-off from Chevron. I was there throughout the recession. When I first started, we were expanding pretty quickly. They had hired a lot of consultants to build the initial site and needed to replace them with full-time employees. There was also a grand vision of a big company with a large and intricate org chart.

That vision died pretty quickly. Soon the hiring stopped, and then some layoffs started. Tech companies everywhere were struggling. We knew there would be no additional rounds of funding. So we had to become profitable to sustain ourselves. Our management was great: they explained everything in detail at all times. We met as a company every Friday morning. Management talked about how much money we had in the bank, what our burn rate was, and what kind of sales prospects we had.

We worked really hard at RMX. We closed deals with new customers. We drastically cut costs by replacing licensed software with either open-source or in-house built software. By the spring of 2002 we were profitable, with about $4M still in the bank. Everybody felt pretty good about themselves. We had survived the recession, even when it got amplified by the aftermath of 9/11. Or so we thought...

In May of 2002, our board of directors voted to shut us down. The reasoning? Even though we were profitable, the recession had changed the landscape, they thought. Our ceiling was much lower, even though our risk was also now very low. We weren't a worthwhile investment, so they decided to liquidate us.

Half of the company was laid off within a week of that. The other half was kept around. We had customers and contracts with those customers that required us to help them transition to not using our service. Basically each customer got a copy of our code so they could run our service for themselves, on their own hardware. So all of the engineering folks were needed for the transition, but obviously sales and marketing folks were not.

Everybody laid off got one month's pay as severance. Everybody who was not laid off and stayed until the end got one month's pay as severance, plus a bonus for staying. I was one of those folks. It was a pretty good deal in some ways. I had a job, while it was understood that I would look for a new job. However it was pretty depressing. Half the folks in the company were gone, including a lot of friends. Everybody still working knew their own end was in sight. The closer it got, the more stress people felt.

I wound up staying until the very end. We had a big party on the last day. It was nice, but it was pretty upsetting for me. After two years with the start-up, I had a lot of emotional investment. I was too shaken up to even say proper good-byes to everyone.

The next week I went on unemployment! I had COBRA papers ready to file when my health insurance ran out. And I did a lot of job hunting. I felt a lot of desperation to find a new job and took the first offer that came my way. In hindsight, that was a mistake. I wound up going only one week without a job.

The new job was a contract position writing C#. I was intrigued about learning a new language, as I had only done Java, Perl, and a little C++ previously. I was a total gun-for-hire at this job, and I was not used to that. I had been an integral part of a start-up for the three years prior, and I did not adjust well. Luckily, after four months I found a job at Yet Another Startup: KeepMedia, now MyWire.

The worst of the recession was over in 2003, but things were not peachy. I started working at KeepMedia in February and we launched that summer. It was a great company, and I was back in the kind of role I liked. Things did not take off like we wanted. I think that had more to do with the business plan than the economy, but who knows. We never had any layoffs or anything like that there. But we did everything on the cheap, and I do mean cheap. Our biggest expense was an Oracle database. We were scared to put people's credit card numbers in a MySQL database.

Anyways, that was an interesting experience too. It was a start-up that started in a recession. What was a little different about KeepMedia is that we were funded by a single person, Louis Borders. We did not have a certain amount of money in the bank and there were no plans to seek VC funding. That would have been tough to get anyways at that time. But there was still a huge emphasis on saving money at all costs. We were very creative at doing that. I learned a lot of valuable lessons by having such constraints placed on the systems I built.

I wasn't at KeepMedia very long. That's a long story in itself, and I hated to leave. However, by the time I left, the recession was officially over in The Valley. Let me summarize some lessons that I learned back then:

1.) Start-ups are still start-ups. They are no better or worse just because there is a recession going on.
2.) However, if you are at a start-up, it becomes even more important to know what the heck is going on.
3.) You should still be picky about your job. Don't let the recession force you into a job you hate. Now the recession can force you into that situation, i.e. you are running out of money, etc. But don't put yourself into that situation artificially.
4.) If you do find yourself in a bad position, don't be afraid to make a change.

That covers the professional side of things for me. When it comes to personal things, honestly the last recession did not affect me negatively. Rent prices dropped a lot during that recession, mostly because they were way too high before it. I never took a pay cut and I never had any money in the stock market other than my 401K. So in many ways my buying power actually increased during the recession. Now if I had been unemployed for a long stretch... well obviously that would have been a lot different.

Will this recession be even worse? I actually don't think it will be worse for The Valley, just because the last one was so bad. This one looks like it will be worse for the country at large, and maybe it will last longer. The last recession was four years solid in The Valley, though maybe less elsewhere.

Monday, October 13, 2008

ActionScript Vector Performance

Flash Player 10 is coming out this month. This weekend I went to Adobe's FlashCamp. It was a lot of fun by the way, and a big thank you must go to Dom and the folks at Adobe for a great event. Adobe really treats developers well. Anyways, there are a lot of great new features in Flash Player 10. Other folks will talk about 3-d and text engines and Pixel Bender, etc. but me? I'm excited about Vectors.

If you are like me, then the first time you heard about the new Vector class in ActionScript, you thought it might have something to do with graphics and whatnot. However, I was overjoyed to learn that it is like the Vector from C++ and Java, i.e. a list-style collection. Like the Vector from the STL and in Java 1.5+, it is parameterized. Even better, you can specify a fixed length for it. In other words, you can say that it is a collection that only allows one type of object and holds a fixed number of objects. This makes excellent performance optimizations possible, and indeed Adobe has taken advantage of it. Of course I had to test this out for myself.

One of the other nice things about FlashCamp was that they gave us a new build of Flex Builder. This one has support for the new language features in Flash Player 10. To enable them, you just change the Flash Player version that you target, and voila! I took some benchmark code that I had used for Flash/JS comparisons. Here is the original code:

private function testArray():Number {
    var startTime:Date = new Date();
    var arrStr:String = null;
    var arr:Array = new Array();
    var i:int = 0;
    for (i = 0; i <= 94; i++) {
        arr.push(i);
    }
    for (i = 0; i <= arr.length; i++) {
        arr.push(arr.pop());
        arr.sort().reverse();
        arr.push(arr.splice(0,1));
    }
    arrStr = arr.join();
    return (new Date()).getTime() - startTime.getTime();
}

And here is the new version that uses a Vector instead of an Array.

private function testVector():Number {
    var startTime:Date = new Date();
    var v:Vector.<int> = new Vector.<int>();
    var arrStr:String = null;
    var i:int = 0;
    for (i = 0; i <= 94; i++) {
        v.push(i);
    }
    for (i = 0; i <= v.length; i++) {
        v.push(v.pop());
        v.sort(comp).reverse();
        v.push(v.splice(0, 1));
    }
    arrStr = v.join();
    return (new Date()).getTime() - startTime.getTime();
}

// Unlike Array.sort(), Vector.sort() needs a compare function:
private function comp(a:int, b:int):Number {
    return a - b;
}

Do you like the Vector syntax?

Anyways, back to the results. The Array code averaged around 90 ms on my MacBook; the Vector code averaged around 20 ms. 4.5x? Very nice.

One thing I immediately wondered about Vector was whether the Flash compiler erased the type. At first glance, there is no reason to do this. It is a different type, so there is no backwards-compatibility issue. It does not extend Array, but it is a dynamic class. The documentation states that the push() method does not do type checking at compile time, but does do it at runtime. This seemed weird, but it would imply that type information is not erased, since it can be checked against at runtime. However, in my testing I could use push() to push any object into any Vector, with no compile-time or runtime errors.
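
For the curious, here is roughly the kind of probe I used (a minimal sketch, not my exact test code). Because push() takes untyped rest arguments, the mismatch gets past the compiler, and in my testing it got past the player too:

var v:Vector.<int> = new Vector.<int>();
v.push("not an int"); // no compile-time check, per the documentation
trace(v.length);      // and no runtime error either, despite what the docs imply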

Keynes vs. Hayek, The Final Round

This is it. Much of the last century has been a showcase of two divergent schools of economic thought: Keynes and Hayek. Keynes ruled up until the late 70's. The Hayek school found believers in Thatcher and Reagan, but both were compromised. At best a middle ground was struck, with attempts at "supply side" economics that were close to Hayek's Austrian school, alongside more Keynesian monetary policy.

Now we have the kind of financial implosion that Austrians have all said was an inevitable consequence of Keynesian monetary policy conducted by central banks. Governments have responded with extreme measures -- extreme Keynesian measures. Austrians aren't willing to say that this won't work in the short term, but they do say it is only delaying an even worse fate.

The Austrians are smart folks, but they don't like to be measured and tested. They denounce any kind of objective, scientific measurement of their ideas. But they cannot avoid this one. This is it. If you are a follower of Hayek, then you must agree that we will see economic hardship on a grand scale within the next ten years or so. How grand? Again the Austrians will never give you numbers, but you gotta figure we're talking Great Depression kind of scale. That would be 25% unemployment, western governments collapsing, democracy giving way to totalitarianism. If we don't have something like that in the next decade, just a run-of-the-mill recession, then the Keynesians (and most of civilization) win.

Wednesday, October 08, 2008

MLB Final Four

What can you say about the Cubs ... but wow. Statistically they were favorites against the Dodgers. Most people would have said that they were "heavy" favorites, since they had the best record in the NL. As an Atlanta Braves fan, I can tell you how little that matters. Statistically there was a 48% chance of the Dodgers winning, vs. 52% for the Cubs. That being said, there was only about a 10% chance of a sweep...

Of course those numbers are based on season statistics, and many would point out that the Dodgers were a much better team with Manny Ramirez on the team. Is this true? Their record was 29-24 with Manny vs. 55-54 without him. They outscored their opponents 249-214 with Manny, which would translate to a ridiculous 40-13 expected record. Even with 53 games, you see the craziness of small sample sizes... The Dodgers actually gave up slightly more runs per game with Manny than without him, 4.04 vs. 3.98. So the improvement really was in the offense. They scored 4.7 runs per game with Manny, vs. 4.14 without him.

So, Viva la Manny? The small sample size skews things, but they sure look like good picks for the NLCS. The Phillies were a better team in the regular season, but nobody is as good as the Manny Dodgers. You're not going to find me picking the Dodgers. The Braves were in the NL West for a long time, so I learned to hate the Dodgers many years ago. Of course that's only gotten worse since I moved to the Bay Area nine years ago.

So what about the ALCS? Boston is statistically a better team than Tampa Bay. What is unusual is that both of these teams had strong home/road splits: both were much better at home than on the road. Tampa Bay won the AL East, so they have home-field advantage. Could the home team win every game in this series? Even with these teams it is statistically unlikely, but the home-field bias suggests that this series will be very close.

By the way, it should be no surprise that the ALCS is between two AL East teams. Six of the top AL hitters in terms of runs created were from the AL East. Ten of the top twenty hitters in terms of runs created per 27 outs were also from the AL East. Eight of the top fifteen AL pitchers in terms of ERA were also from the AL East. And it's not just Boston, Tampa Bay, and New York. Baltimore and Toronto also had very good hitters (Nick Markakis, Aubrey Huff, Alex Rios) and pitchers (Roy Halladay, Jeremy Guthrie).

Sunday, October 05, 2008

October Talks

October is going to be a busy month for me. Next weekend I will be at Adobe's FlashCamp. I will be there Friday night and Saturday, and I may do a short session on TwitterScript, the ActionScript API that I maintain. In particular I want to talk about some of the authentication wrinkles present in TwitterScript and its forked brothers.

On October 20, I am speaking at AjaxWorld. I am going to be talking about a subject near and dear to me, Networked Applications. I'll be talking about why you shouldn't waste the power of your servers building HTML strings, and why you should instead use things like jQuery, GWT, or Flex to cash in on the power of your users' computers.

The week after that, I will be on the east coast speaking at EclipseWorld. On Day One, I am doing a day long, introductory workshop on Ruby on Rails. Of course I'll also talk about how Eclipse can help you out. On Day Two, I am doing two talks. One ties in to the previous day's workshop and is about RadRails. The other session is on ... iPhone development. Kind of a strange topic for me. Chris Williams from Aptana was supposed to do both sessions, but couldn't make it. So Aptana asked me to fill in for him. Hopefully they won't wind up regretting that decision!

Friday, October 03, 2008

Feed of Shame

What do you do when you are on Facebook and notice this in your feed?
Palin? Really? I can understand people supporting McCain. If you are pro-war, then you should be pro-McCain. If you are in a very high tax bracket, then it is in your best interest to vote for McCain. There are other rational reasons as well, and of course there is the old standby "he's not as bad as the alternative."

But what would make you support Palin? Are there women she appeals to, despite her extreme anti-abortion stance? Maybe you like her views; that is reasonable. But then to have her as a spokesperson for your views is ... embarrassing to say the least:


Tuesday, September 30, 2008

WSDL in ActionScript

One of the advertised features of Adobe's Flex Builder is that it works with web services. Indeed, in any project you can import a WSDL and Flex Builder will generate lots of code for you. The resulting generated code states that the code generator is based on Apache Axis2, and it looks like it. This is mostly a good thing.

This is OK for a single developer or even a small team. Once you get to larger-scale development, you usually want to keep generated artifacts separate from the source code that generated them. Often you never want to check in generated code. Why? Because then you have two sources of truth: the artifact (the WSDL in this case) and the generated code. You don't want to have to keep these things in sync manually; you want your build process to do it. So you don't check in the generated code, and your build system generates it instead.

So ideally the code generation in Flex Builder could be invoked outside of Flex Builder. This may be possible, but so far I have had no luck with it. It is certainly not a documented part of the SDK.

I looked for an alternative and found wsdl2as. This looked promising, but did not work out. First, it expects you to send in XML literals when sending SOAP messages. Sure, it generates the boilerplate around the core message, but if I wanted to work directly in XML, I would not have bothered with a code generator. It has an option that seems designed to deal with this, but it did not. Even worse, it does not handle anything except the simplest WSDLs. The first WSDL I tried defined complex types for the input parameter and return type of the web service. This caused wsdl2as to choke, as it expected all type information to be inlined. Sigh.

Monday, September 29, 2008

At Season's End

The regular season of Major League Baseball is at an end. That is always a bummer to me. One of the reasons that I like baseball so much is that it is played every day. Every day something interesting happens. Of course the playoffs are here, but there is not much joy in those for me this year. No Braves. No A's. No Giants. At least there are no Yankees or Mets, though...

It is always fun to look back at the season, and of course, to speculate on the future. Who should win the awards? And, who should win in the postseason? Being a numbers man, the awards are the most fun to examine.

AL MVP
This is a close race because there are no outstanding candidates. In fact, the top AL hitters were significantly weaker than the top NL hitters this year. If Lance Berkman or Chipper Jones were in the AL, you could make a very strong case for him as MVP... Let's look at a couple of relevant stats. First, runs created:

1.) Grady Sizemore, 128
2.) Josh Hamilton, 122.8
3.) Dustin Pedroia, 120.2
4.) Nick Markakis, 118.4
5.) Aubrey Huff, 116.5

That is a nice advantage for Grady Sizemore. One reason for his edge over the other players is that he played a lot and led off, leading to a lot of plate appearances. Still, he had a very good season. Who would guess that a lead-off hitter would have 33 home runs and 98 walks? Perhaps he should not be hitting lead-off... A more weighted number is runs created per 27 outs. Here is that top five.

1.) Milton Bradley, 8.97
2.) Alex Rodriguez, 7.89
3.) Kevin Youkilis, 7.8
4.) Carlos Quentin, 7.67
5.) Nick Markakis, 7.42

Only one hold-over from the previous top five, and that is the very underrated Markakis. Perhaps he is the MVP? Perhaps. The other leaders in total runs created are all in the top eleven in runs created per 27 outs. For a final measure, let's look at the top 5 in VORP.

1.) Alex Rodriguez, 65.6
2.) Grady Sizemore, 62.7
3.) Dustin Pedroia, 62.3
4.) Aubrey Huff, 58.4
5.) Josh Hamilton, 57.1

Another very different top five! Even missing some games, A-Rod provided the most "value" for his team. Don't tell Yankee fans this, as I am sure they are working on a way to blame their postseason absence on A-Rod. I can just imagine "Ah, Moose got us 20 wins, if only A-Rod could have hit some!"

From a pure statistical consideration, Milton Bradley was the most "potent" hitter, but only played 126 games. Throw him out, and it sure looks like you would have to go with A-Rod as MVP, once again. If I had a vote, that is who I would go with.

That is not going to happen, and everybody knows it. People like to vote for players who are on "winners." You have to be clearly the best (and often even that is not good enough) to get an MVP trophy while playing for a team that is not in October. So the people they list are folks like Boston's Pedroia and Youkilis, as well as Justin Morneau and Joe Mauer from the Twins. If Carlos Quentin had not broken his hand during a temper tantrum, he would surely be a front-runner. The other name I've heard is Francisco Rodriguez, from the Angels.

Given that, it would seem that Pedroia has the advantage over the other "candidates."

NL MVP
This one is a little easier. Albert Pujols led the league in all of the stats mentioned previously. He was clearly the best hitter in the league, and nobody is really arguing this one. Ryan Howard's .251 average pretty much guaranteed that he is not in the mix. He is the only guy whose "traditional" stats (HRs/RBIs) beat Pujols's, and he plays for a division winner. He also finished very strong, just as his team did, coming from behind to pass the Mets in the last month. But there's no chance of this argument working! Let us hope not, at least...

AL Cy Young
This is viewed as a two-horse race between Cliff Lee and Roy Halladay. That is fine, because that is how it should be. They were far and away the two best pitchers in the AL. Nobody was even remotely close. Most people think that Lee will win because, well, because he is a winner. His 22 wins jump out. He also led the league in ERA. It is rare for a pitcher to lead in both of those stats and not win the Cy Young. For what it's worth, he led the league in VORP as well, edging out Halladay. You can make nice arguments about how Lee pitched against weaker competition, but it's hard to imagine too many people buying that. Cliff Lee should win and will win.

NL Cy Young
Now this is more interesting. Once again a lot of people think it should be a two-horse race. Once again they are right, but they've got the wrong horses. Most people think it is between Brandon Webb and Tim Lincecum. Those may indeed be the two "finalists" for the award, but it should not be that way. Webb was nowhere near as good as Lincecum. He just has a lot more wins, and people get carried away over wins. So Lincecum should be Cy Young, right?
I won't argue against it, especially since I root for the Giants against most teams. However, there is a guy who has been just as good, and maybe even a little better than Lincecum: Johan Santana. He edged Lincecum in ERA and in VORP (73.4 to 72.5). Statistically, over the course of the season, he was worth about one extra run (total) more than Lincecum. By comparison, Cliff Lee edged Halladay by about 3.5 runs in VORP.
If you start making the "they played for a winner" argument, then clearly Santana has the edge over Lincecum. You can take that one step further. The Mets were battling the Phillies for the NL East crown this weekend. On Saturday they sent Santana out on short rest, and he delivered better than you could have hoped for, throwing a complete-game shutout while striking out nine. I think "clutch" is an illusion, but most people believe in it, and I am sure they would say that Santana was as clutch as it comes. He definitely did everything he could to get his team into the playoffs.
So if people were talking about Lincecum vs. Santana, I would guess they would pick Santana. But they are not. They are only mentioning Lincecum vs. Webb. Lincecum is the clear choice there. Personally, if I had a vote ... I would vote for Santana. He has been a little better, and the NL East is much better (in terms of hitters) than the NL West.

Thursday, September 25, 2008

The Great Bailout

"OMG! The _____ is in trouble! What are we going to do!!!?!"

When government people say things like this, it is always a precursor to the government proposing itself as the solution to the problem. The problem is so dire, that only the government can solve it. Of course they will need more money and more power to solve the problem. Oh, and if you don't think this is all true, then you are too dumb to understand the problem or you are just un-American because you don't care about all of the Americans who could be hurt by this grave danger.

Mr. Dave Winer makes the point that the current administration has used this argument before. Only then it was Colin Powell making the case for war in Iraq; now it is Henry Paulson doing the same thing with regards to the banking meltdown. Dave is right about all of this. He then goes out of his mind by suggesting that Bush/Cheney should resign, Nancy Pelosi be made President, and Paulson's plan move right ahead. The problem is not just Bush/Cheney, and Pelosi is definitely not the solution. The problem is Paulson's request for power and money. It's like saying it would have been OK to listen to Colin Powell and attack Iraq, but only if Al Gore had been president. It didn't matter who was President; attacking Iraq was wrong in every possible way.

Of course Ron Paul has some interesting things to say about the bailout. His opinions are largely grounded in the Austrian economic theory that the government makes business cycles more extreme (bigger booms and bigger busts) by causing malinvestments, like buying subprime mortgages for example. Like all things in Austrian economics, it is a matter of "belief," as these are statements that are purposely impossible to verify scientifically. However, it is hard to dispute that the U.S. government has encouraged high-risk loans for the purpose of buying real estate, and that the very financial institutions that did this the most are now the ones going bankrupt.

The point is that our government does not have a good track record here. Maybe it has been the main source of the problem, as Paul suggests, or maybe not, but it certainly has been part of the problem. Now it wants unprecedented (in this country at least) power and money to solve the problem that it has been at least complicit in. Given that, how can we support this idea?

Oh, but what is the alternative? I don't know, and I don't think the government knows either. Yes, there will be banks that go under. Does that mean that we'll all be out of money? No, of course not. Everyone's savings are already guaranteed by the FDIC. Not to mention that even in the case of bankruptcy, creditors (that would be the people the bank borrowed money from, i.e. depositors) have first priority. Nobody is going to lose their savings.

But surely there will be other disasters, right? If so many banks go out of business, how will we get loans for houses, cars, or new businesses? Well, perhaps not all of the banks will go out of business. Certainly there are those that have been buying up these insolvent banks. Or maybe other companies will take the opportunity to expand into the banking vacuum created by the insolvent banks. I'm not sure, but I'm not willing to let FUD from the government convince me to give it the kind of virtually unlimited power that it is asking for.

Tuesday, September 23, 2008

No SharedObjects Allowed

Client-side storage in the Flash player (SharedObjects) has several advantages over traditional client-side storage, a.k.a. HTTP cookies. From a security standpoint, it is better because the data is never sent over the wire. However, the main advantage to most people is that it is bigger, and when it comes to managing data on the client, size definitely matters.

By default you get 100 KB instead of the 4 KB you get with cookies. If your application tries to store 101 KB, it won't fail. Instead the user will be prompted to increase the allocated space by a factor of 10, i.e. from 100 KB to 1 MB. Of course you probably don't want the user to ever see this screen. One of the other advantages of SharedObjects is that people don't delete them. People blow away their cookies all too often, but most people would have no idea how to do the same with SharedObjects. The only way you would find out is if you saw the Flash player settings screen, i.e. the interface that appears when a Flash application tries to go over the 100 KB default limit.
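
As an aside, flush() takes an optional minDiskSpace parameter, so you can ask for extra room up front instead of waiting to blow through the limit mid-save. A minimal sketch (the 500 KB figure is just an example):

var so:SharedObject = SharedObject.getLocal("test", "/", false);
// Prompt the user for 500 KB now, rather than when the data outgrows 100 KB
var status:String = so.flush(500 * 1024);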

So stick to under 100 KB and all is good, right? Not so fast. The settings interface requires that your Flash app be at least 136x213 pixels. If it is smaller than that, then what happens? First let's explain what happens when it is big enough to show the settings interface. When you flush data to local storage, a string is returned with a status. Here is typical code for this.


var testSo:SharedObject = SharedObject.getLocal("test", "/", false);
testSo.data.testValue = "test";
var soStatus:String = testSo.flush();
if (soStatus != null) {
    switch (soStatus) {
        case SharedObjectFlushStatus.PENDING:
            testSo.addEventListener(NetStatusEvent.NET_STATUS, someHandler);
            break;
        case SharedObjectFlushStatus.FLUSHED:
            break;
    }
}

There are two possible return values, either "pending" or "flushed." There is no fail. So if you were flushing 101 KB, then you would get a pending return value. Now all you can do is wait for an event, or more precisely a NetStatusEvent. This will tell you whether the user allowed you to increase the size or not. If not, then the NetStatusEvent will come back with a failure code.
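
For completeness, here is a minimal sketch of what someHandler might look like; the status codes are the standard SharedObject ones, but the handler body is just illustrative:

private function someHandler(event:NetStatusEvent):void {
    if (event.info.code == "SharedObject.Flush.Success") {
        // the user granted the storage increase
    } else if (event.info.code == "SharedObject.Flush.Failed") {
        // the user denied the request
    }
}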

If there is not enough space to display the settings interface, then you would think that you would just get an automatic failure, but you don't. Instead you get a "pending" from the return of flush(). It's not really pending, since the user can't actually choose to allow it to succeed. It can only fail. But the player pretends this is not the case and that the user denied your request. So you still need to listen for the NetStatusEvent. If you don't catch that event, then the Flash player will throw an error to the user, and of course you do not want that. Here is a picture of that.


Monday, September 15, 2008

Death Magnet



Last week, Metallica released Death Magnetic. Your opinion of it was probably determined approximately 17 years ago. That is when Metallica released their self-titled, so-called "Black Album." For some people, this was Metallica's sell-out album. They went from being a cult favorite to being mainstream. Never mind that they already had multiple gold and platinum records prior to the Black Album; no one can argue with the success of the Black Album. It has always been hip to criticize that album and everything after it, and to praise everything before it. If you are hip like that, then obviously you won't like Death Magnetic. On the other hand, if you thought the Black Album was a big improvement for Metallica, then you will love Death Magnetic.
Personally, I like the Black Album and I like Death Magnetic. It is definitely in the vein of other recently successful rockers of the 80s/90s, like U2, R.E.M., and the Red Hot Chili Peppers, in that it "channels" a lot of their classic material while still sounding modern. The guitar playing is impressive, and in many ways the whole thing feels like it was inspired by the Guitar Hero video game (which I also love to play). In fact, Death Magnetic can be downloaded and played on the Xbox 360 and PlayStation 3, but unfortunately for me, not the Wii...

Monday, September 08, 2008

Scala ArrayStack

I had not done any Project Euler problems in a while, so I decided to solve one yesterday. I was also planning on attending the next BASE meeting, so I wanted to brush up on my Scala. Thus it was time to solve Problem #47 in Scala.

The solution got me a little more familiar with some of the data structures available in scala.collection.mutable. In particular I needed a structure to hold a list of factors. I decided that ArrayStack was the best choice. Here is my solution:


package probs

import scala.collection.mutable.ArrayStack

object Euler47 {
  def main(args: Array[String]): Unit = {
    val start = System.nanoTime
    solve(4)
    val duration = System.nanoTime - start
    println("duration=" + duration / 1000000.0)
  }

  // Find the first n consecutive integers that each have n distinct prime factors
  def solve(n: Int): Unit = {
    var i = 2
    while (i > 0) {
      var j = i
      while (j < i + n && numFactors(j) == n) {
        j += 1
      }
      if (j - i == n) {
        val msg = (i until j).foldLeft("") { (x, y) => x + y + " " }
        println(msg)
        return
      }
      i += 1
    }
  }

  // Count the distinct prime factors of n, using an ArrayStack to collect them
  def numFactors(n: Int): Int = {
    var factors = new ArrayStack[Int]
    var i = 2
    var m = n
    while (i <= m / i) {
      while (m % i == 0) {
        if (factors.size == 0 || i != factors.peek) {
          factors += i
        }
        m /= i
      }
      i += 1
    }
    if (m != 1) {
      factors += m
    }
    factors.size
  }
}

I was very pleased with the performance: it solved the problem in about 0.4 seconds on my MacBook. I saw a similar, but not as good, Java solution on the message boards that ran in 1.5 seconds. That solution added all of the factors repeated times and then had to loop through them again to get rid of duplicates. I ran it on my MacBook and it ran in 1.1 seconds. Even when I "fixed" it, it still took about one second. I am sure I could have done a lot of work on it and gotten it as fast as the Scala, but why bother.

Thursday, September 04, 2008

JavaScript Faster than Flash

This is the last benchmark for a while. Well, at least for today. I converted the JS benchmarks to ActionScript and tested them. The results were surprising, as JavaScript in Safari 4 and Firefox 3.x edged out Flash:


A few notes. I could not convert all of the tests, as two of them (the DOM and Ajax tests) were predicated on browser-specific code. I could have done "equivalent" functionality in ActionScript, but it did not seem appropriate for comparison. Otherwise the code was translated as-is ... for the most part. I did add static type information where possible. There were also a few APIs (on Date and Array) that had to be tweaked slightly. I tested similar changes to the JavaScript. The only test where there was any effect was the Date test. The JavaScript used Date.parse, which does not exist in ActionScript. The Date constructor does the same thing. If I switched to using the Date constructor in JavaScript, it was just slightly slower.
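
To give a flavor of the translation, here is a hypothetical example (not one of the actual benchmark functions) of the kind of static type information I mean:

// JavaScript original: function sum(n) { var t = 0; for (var i = 0; i < n; i++) t += i; return t; }
private function sum(n:int):int {
    var t:int = 0; // typed locals let the AVM2 JIT use integer arithmetic directly
    for (var i:int = 0; i < n; i++) {
        t += i;
    }
    return t;
}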

It certainly seems that much of the performance advantage Flash enjoyed upon the arrival of Flash Player 9 has been erased. Flash still had a strong advantage in more mathematical calculations (dates, integer and floating-point arithmetic) as well as string manipulation. It did very poorly with arrays and regular expressions. I would guess that as the JITs for JavaScript get better, the string advantages will disappear. Flash will probably maintain an advantage in more mathematical computations, especially given its vector graphics features. Hopefully advances in JavaScript will spur Flash VM progress.

Notes
1.) Tested on both Flash 9 and 10 RC2 on both OSX and Windows. Negligible performance differences in any of the permutations.
2.) Also tested with Silverlight, but only on Windows. It was slower than everything except IE7. However, that was because it was terribly slow at regular expressions and error handling. It clearly had the best JIT as it was able to reduce some of the tests to 0 after a couple of executions.

Distractions

Distractions are everywhere. Some people say that Ron Paul is a distraction. Is Sarah Palin a distraction? Or maybe it was Hurricane Gustav. I say that the economy is a distraction.

The focus of the election has become the economy. The economy is important, right? For two years in college, I actually double-majored in economics. If I hadn't been so lazy during my senior year, I would have a degree in it. However, it is not the most important issue in this election year, at least not to me. That distinction still belongs to the war.

Sometimes other libertarian-leaning people question me for voting for Democrats. I always say that I would rather have my economic freedoms violated than my personal freedoms. In one case I am broke; in the other I am in jail. I don't want to be broke, but I really don't want to go to jail. And there are worse things than jail, namely death. U.S. foreign policy has been dealing out death in a big way over the last eight years. War is worse than any economic or personal freedom violation. Of course, war actually causes those violations as well.

Look at the Patriot Act. Clearly a war-time measure that is one of the most egregious violations of personal freedom in the checkered history of the United States. Look at our budget deficit and how much money we are spending on wars. Go beyond that and look at the weakness of the dollar and the problems that is causing.

If you keep looking, you'll soon notice the price you pay for gasoline. How much did gasoline cost before we started waging war in Iraq? I know better than most that correlation does not imply causality, but what do you think the price of gasoline would be today if the United States never invaded Iraq?

If gasoline were in the $2/gallon range, the deficit a fraction of what it is currently, and the dollar stronger, do you think the economy would be much of an issue at all?

There is a price to pay for war. We have tried to push all of that cost to our children in the form of budget deficits, but it has not worked. We are paying it at the pump. We are paying it at the grocery store. We are paying it when we buy "cheap" goods at Wal-Mart.

War is the most important issue. The only hope for less war is to vote for Obama. I wish Obama would pull all of our troops out of Iraq and not even leave behind any bases. I am frightened that he will expand military activities in Afghanistan and maybe Pakistan. He is far from a perfect choice. But in the interest of Country First, he is the only responsible choice that I can make.

JavaScript Benchmarks, now with Chrome

As promised yesterday, I ran the JS benchmarks again on a Windows machine so I could include Google Chrome. I tried to be pretty inclusive, adding in IE7, IE8 beta 2, Firefox 3.0.1 (current release), Firefox 3.1 with and without JIT, Safari 3.1 (current release), Safari 4 beta, Opera 9.5, and Chrome. This was all run on my workstation, a 4-core, 3.2 GHz box with 8 GB of RAM. Any add-ons or extensions were disabled. Here is the pretty picture.


Once again Safari is the king. Safari 3.1 beats everything except for Safari 4 beta, which crushes even Safari 3.1. Opera was a little slower than Safari. Chrome was generally comparable to the various Firefox browsers, but overall slightly slower. Like Firefox 3.1+JIT, it was very slow on error handling! Of course IE was the slowest by far, but at least IE8 is faster than IE7. Maybe IE8 is shipping with debug symbols included (as Microsoft has often done in the past) and the release candidates will be much faster than the betas. Or not.

Anyways, Chrome, and its V8 engine, does well, but does not seem to be ahead of Firefox and is certainly behind Safari and Opera. Maybe they can do better on the Mac!