Kagglers versus experts

Over on the Kaggle discussion board, Jeff made a few really interesting posts. In one, he compared how the roughly 250 Kaggle teams fared, in terms of pick percentage, relative to a set of more than 60 national experts.

(1) How accurate were the winner’s predictions?

The simplest way to measure the accuracy of basketball game predictions is to calculate the percentage of winners correctly predicted. In general, anything over 70% is pretty good, especially this year, when there were many upsets. To get a baseline, I looked at the pre-tournament rankings from 60+ different “experts” – organizations or individuals who publish weekly top-to-bottom rankings for all 351 Division-I teams – which I found via Kenneth Massey’s “College Basketball Ranking Composite” page. By seeing which team each expert had ranked better in each of the 63 tournament games, I could determine how many games those rankings would have correctly projected.
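To make that scoring procedure concrete, here is a minimal sketch in Python. The `pick_percentage` function and its inputs are illustrative stand-ins, not Jeff's actual code; it assumes rankings come as a dict from team name to rank and the games as (team_a, team_b, winner) tuples.

```python
def pick_percentage(rankings, games):
    """Score a set of rankings against actual results.

    rankings: dict mapping team name -> rank (1 = best).
    games: list of (team_a, team_b, winner) tuples, one per game.
    Returns the fraction of games the rankings "call" correctly.
    """
    correct = 0
    for team_a, team_b, winner in games:
        # The rankings pick whichever team is ranked better (lower number).
        predicted = team_a if rankings[team_a] < rankings[team_b] else team_b
        if predicted == winner:
            correct += 1
    return correct / len(games)

# Under this scheme, an expert who calls 45 of the 63 tournament
# games scores 45 / 63 ≈ 0.714, i.e. a 71% rate.
```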

Out of the 60+ experts, only two correctly predicted 70% or more of the tournament games: the Dunkel Index and UPS Team Performance each predicted 45 of the 63 games correctly, a 71% rate. By contrast, 26 Kaggle teams predicted 70% or more of the games correctly, led by Siddharth Chandrakant with 48 correct picks (a 76% rate), followed by three teams (KUL_Pandabär, jostheim, and Jason_ATX) with 47 (a 75% rate) and four more (One shining MGF, mm2012mm, Fomalhaut, and jitans) with 46 each (a 73% rate) – all of them better than any of the 60+ experts.

Looks like the Kagglers were onto something.
