Having said that, let me go back one more step to my heady days as a college first-year atop the hill at Hamilton College. I sat there in one of my first courses, Intro to Macroeconomics, knowing full well that I had officially taken the first step in my major and therefore the most likely path my adult life would follow. Granted, a few years later I would mic-drop that plan and decide on a different path for graduate school, but that's a story for another time. I still find economics fascinating - which is why I love listening to the aforementioned Planet Money - and sitting there in that macroeconomics class, I couldn't have been more excited.
So we started doing some of those basic supply and demand graphs, and I remember going over the list of assumptions that needed to be in place in order for the basic graphs to work. Assumptions like buyers and sellers having perfect information, or supply and demand being independent. At its most basic level, a supply and demand curve can't capture nuanced information; more advanced economic models take care of that. At the time, as I read down the list of assumptions that just didn't hold up in the real world, I smiled and thought to myself, "this is a little bit of bullshit." And I dove in.
Economics is not a precise science, or even a hard science, but it doesn't do itself any favors when you hear stories like the one Planet Money recently covered in their 452nd episode, released on April 19, 2013. The story covers a very influential economic study released a few years ago. From the article that accompanied the podcast:
Three years ago, Carmen Reinhart and Ken Rogoff published a study that quickly became one of the most famous, most talked about economics papers since the financial crisis. It got so much attention because it answered a basic question everybody was asking: How much debt is too much?
Reinhart and Rogoff looked at what had happened in many different countries over many years. And they found what looked like a clear debt threshold: 90 percent. Average growth was much, much slower in countries with debt-to-GDP ratios over 90 percent.
The paper got a lot of coverage in the press. Politicians cited it in the U.S. and Europe.
Then, this week, a 28-year-old grad student and his professors published a startling finding: Reinhart and Rogoff had made a simple Excel error in one part of their study. The authors of the new critique also questioned other elements of the study and argued that, in fact, there is no debt threshold.
All that European austerity preventing our neighbors across the pond from getting out of this global recession? Reinhart and Rogoff's study frequently came up in policy discussions as support for the path Europe eventually chose. Our own Rep. Paul Ryan, who keeps trying to give himself a nickname about being a numbers guy even though his numbers never add up, cited this same study as evidence that the rich and corporations need more back rubs while the poor need to be sat on harder. Here's an article from PolicyMic that goes a little deeper into the flaws of the Reinhart/Rogoff study that Ryan loves so much:
A number of oddities occurred in the Reinhart/Rogoff study. First is the Excel error mentioned in the NPR quote above, the kind I make all the time in my own adventures in spreadsheeting. Granted, when I blow my spreadsheet, the only thing affected is my catalogue of my Marvel Comics and Magic: The Gathering collections. This study may have erroneously led governments down paths that have actually prevented economic recovery. But as the PolicyMic article explains, there also seems to be some selective exclusion of data, and at the end of the day, the hard 90% conclusion the study arrived at doesn't appear to be such a hard conclusion at all. From the PolicyMic article:
The first error of Reinhart and Rogoff is that they excluded data from their dataset with no explanation given. The excluded years came from three countries: Australia (1946-1950), New Zealand (1946-1949), and Canada (1946-1950), which were countries with high debt and solid growth. For example, the data for New Zealand in the paper's high debt data set changes dramatically, from a dreadful -7.6% to a respectable 2.6%. This is a 10-point error.
So there have been countries with high debt that still managed economic growth that the Reinhart/Rogoff study just left out. Which kind of raises the question initially posed by the Planet Money podcast... how much should we trust economics? After listening to the podcast and reading up on what happened, my thoughts on economics remain unchanged. It's not a hard science, it may be better suited to exploring the past like a detective might than to creating predictive models, and like any research, it needs to be handled honestly.
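To get a feel for how little it takes for a spreadsheet slip to move a headline number, here's a minimal sketch in Python. The growth figures and country names are made up for illustration - this is not the actual Reinhart/Rogoff data - but the mechanics are the same: an averaging formula whose range stops short of the last rows quietly drops those rows from the average, and the reported number shifts.

```python
# Hypothetical growth rates (percent) for countries in a high-debt bucket.
# Illustrative numbers only, NOT the actual Reinhart/Rogoff dataset.
growth = {
    "Country A": -7.6,
    "Country B": -1.9,
    "Country C": 0.5,
    "Country D": 2.2,
    "Country E": 2.6,
}

def average(values):
    """Plain arithmetic mean, like Excel's AVERAGE over a cell range."""
    return sum(values) / len(values)

rows = list(growth.values())

# Intended calculation: average over every row in the table.
full_avg = average(rows)

# An Excel-style range slip: the formula's range ends one row early,
# silently dropping the last (high-growth) country from the average.
clipped_avg = average(rows[:-1])

print(f"all rows included: {full_avg:+.2f}%")
print(f"range stops short: {clipped_avg:+.2f}%")
```

With these made-up numbers, the full average is about -0.84% while the clipped average is -1.70%: the missing row makes high-debt growth look nearly a full point worse, and nothing in the spreadsheet flags that anything was skipped.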
In this case, there's the simple Excel error - fine, things like that are going to happen. That's one of many reasons why most journals in academia require peer review for potential articles. The Reinhart/Rogoff study was never submitted to peer review before publication. There is a process in place to help people receive the best in current research; it just wasn't followed in this case. It wouldn't surprise me if Reinhart and Rogoff felt like they were on to something pretty significant and topically relevant, then got a little sloppy and rushed in their enthusiasm to get their insights out to the world. In that rush, Reinhart and Rogoff ignored the process that keeps our information honest. The Excel error is just forehead-slapping incompetence; the omitted data feels like something different.
To answer Planet Money's question, I don't think economics needs to be trusted any more or less; we just need to be more vigilant concerning the human element in research we base policy decisions on. Policy decisions cannot be taken lightly at any step in the process.
I'll let Stephen Colbert take us out. Both he and Planet Money present all of this in a much more entertaining fashion than I ever could:

