Marginal dollar values per run and first-order win, Pt. 1


Yesterday I talked about payroll being significant in baseball. I know, that should be obvious. But as I put together the quickie chart in Word and wrestled with the dimensions in WordPress's HTML editor, I thought about how wildly inaccurate it probably was to take a win-loss record and measure marginal wins over some replacement-level team. I knew, of course, that run differential, the difference between the runs a team scores and the runs it allows, is a much better indicator of a team's quality than its actual win-loss record, as bullpen quality and pure luck can affect wins and losses in a big way.

Having said that, I did a quick search on FanGraphs, feeling certain that someone had already written on the topic I was about to embark on. Sure enough, the always excellent Dave Cameron did a piece on this using win-loss record and payroll for the 2008 season. Cameron's work was just a quick study done near the end of 2008, and more has been done to try to break down revenue based on hitters', pitchers', and fielders' performance. The Baseball Economist by J.C. Bradbury goes into far more detail than I'd ever be able to or have the time for, so if you're really interested in this, take a look. Also, check out the explanations, issues, and other discussion involved (it's a lot of reading, and I haven't gone through it all, but it looks very good and I promise to do so myself).

That being said, I was focusing on Cameron's quick piece. I thought that using run differential for this type of analysis would make for more accurate results, and, honestly, I was intrigued by what the numbers would look like for high-payroll teams like the Yankees and Red Sox as opposed to lower-payroll teams that had enjoyed some success, like the A's and Twins.

So I embarked on the path to something resembling economic analysis. The study would look at the last ten seasons, 1999-2008, and compare the marginal dollar values per run of each team. My methodology would be based on using the Pythagorean expectation formula to determine a theoretical replacement team's run differential. The version of the formula I used was the Pythagenpat method, commonly recognized as the most accurate of the Pythagorean expectation methods. To calculate the theoretical replacement team's run differential, I took the replacement-level win percentage calculated by Tom Tango (see the link to Mr. Tango's calculations in the previous post above), which comes out to .292, and inserted it into Pythagenpat. Each team's run environment was taken into account by using its total runs scored and allowed in Pythagenpat to determine the replacement team's runs scored and allowed, so that the real team and its replacement counterpart share the same total. Run totals were taken from Baseball-Reference. Marginal runs were then calculated as the difference between a real team's run differential and that of its replacement team in the same run environment.
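For anyone who wants to follow along at home, here's a minimal Python sketch of the replacement-differential calculation as described above. It's my own illustration, not the exact spreadsheet formulas: the 0.287 exponent is one common form of Pythagenpat, and the function names and the example team are hypothetical.

```python
def pythagenpat_exponent(runs_scored, runs_allowed, games):
    """Pythagenpat exponent: x = (runs per game) ** 0.287."""
    return ((runs_scored + runs_allowed) / games) ** 0.287

def expected_win_pct(runs_scored, runs_allowed, games):
    """Pythagenpat expected (first-order) winning percentage."""
    x = pythagenpat_exponent(runs_scored, runs_allowed, games)
    return runs_scored ** x / (runs_scored ** x + runs_allowed ** x)

def replacement_run_split(runs_scored, runs_allowed, games, repl_pct=0.292):
    """Split a real team's total runs into the RS/RA a .292 replacement
    team would post in the same run environment.

    Solving RS^x / (RS^x + RA^x) = p with RS + RA held fixed gives
    RS / RA = (p / (1 - p)) ** (1 / x).
    """
    total = runs_scored + runs_allowed
    x = pythagenpat_exponent(runs_scored, runs_allowed, games)
    ratio = (repl_pct / (1 - repl_pct)) ** (1 / x)
    repl_rs = total * ratio / (1 + ratio)
    return repl_rs, total - repl_rs

# Hypothetical team: 800 runs scored, 700 allowed over 162 games.
rs, ra, g = 800, 700, 162
repl_rs, repl_ra = replacement_run_split(rs, ra, g)
marginal_runs = (rs - ra) - (repl_rs - repl_ra)
print(f"Replacement split: {repl_rs:.0f} scored, {repl_ra:.0f} allowed")
print(f"Marginal runs: {marginal_runs:.0f}")
```

Note that because the replacement team shares the real team's total runs, both teams end up with the same Pythagenpat exponent, which keeps the comparison apples-to-apples.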

For team salaries, USA Today's Salary Database was used. To compare more accurately across the years within the ten-year period, I corrected all payrolls from the first nine years to 2008 dollars using published CPI figures. The baseline payroll of $12 million (30 players times the league minimum of $400,000 each) was also corrected for each year.
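The dollar side of the calculation is simple enough that a short sketch should make it concrete. The CPI values below are approximate annual CPI-U figures, included for illustration only, and the sketch assumes the $12 million baseline is already expressed in 2008 dollars.

```python
# Annual average CPI-U values (approximate, for illustration only).
CPI = {1999: 166.6, 2008: 215.3}

def to_2008_dollars(payroll, year, cpi=CPI):
    """Inflate a payroll from its season's dollars to 2008 dollars."""
    return payroll * cpi[2008] / cpi[year]

def marginal_dollars_per_run(payroll_2008, marginal_runs,
                             baseline=12_000_000):
    """Dollars spent above the replacement-level baseline, per marginal run."""
    return (payroll_2008 - baseline) / marginal_runs

# A hypothetical 1999 team: $50M payroll, 300 marginal runs.
adjusted = to_2008_dollars(50_000_000, 1999)
print(f"${marginal_dollars_per_run(adjusted, 300):,.0f} per marginal run")
```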

With that introduction to the methodology, here's the work in spreadsheet format. Each tab contains a year from 1999-2008, with the Summary tab containing the averages for each team over that ten-year span and a tally of playoff appearances, division titles, wild card berths, and World Series results. Italicized team names indicate playoff teams for the year, underlined team names indicate World Series runners-up, and bolded team names indicate World Series winners.

Here are some conclusions we can draw just from the summary page. From here on out, I'll use the convention of 10 runs per win to discuss the data. I didn't include first-order win totals in the spreadsheet, so you can apply your own, more accurate conversion at your leisure; a quick sketch of the arithmetic follows.
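Here's that conversion spelled out, again as a rough sketch rather than the spreadsheet's exact formulas; swap in whatever runs-per-win figure you prefer.

```python
def first_order_wins(runs_scored, runs_allowed, games):
    """First-order wins: Pythagenpat expected win percentage times games."""
    x = ((runs_scored + runs_allowed) / games) ** 0.287
    pct = runs_scored ** x / (runs_scored ** x + runs_allowed ** x)
    return pct * games

def dollars_per_win(dollars_per_run, runs_per_win=10):
    """Convert marginal dollars per run to dollars per win."""
    return dollars_per_run * runs_per_win

# $188K per marginal run is $1.88M per win at 10 runs per win.
print(f"${dollars_per_win(188_000):,.0f} per marginal win")
```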

– Only two teams were able to get a marginal win for less than $1M throughout the ten-year span, and it should come as no surprise that those teams were the Oakland Athletics and the Florida Marlins. However, the two did so using different strategies. The A's, run by everyone's favorite/most-hated genius GM Billy Beane, fielded competitive, division-winning teams for half of the observed decade, mostly on what is considered a "mid-market" budget. The A's averaged $54 million a year in payroll (2008 dollars) and turned it into a .552 theoretical win percentage. The club won four AL West titles and one Wild Card berth but never advanced to a World Series, a run punctuated by Beane's well-known quip: "My shit doesn't work in the playoffs."

The Marlins, on the other hand, accomplished their feat in a less successful fashion. Instead of remaining competitive each year on a tight budget, the Marlins often forwent competitiveness altogether on an extremely minimalist budget. The Marlins averaged the lowest payroll in baseball over those ten years, clocking in at about $37.9 million adjusted. They garnered their value without much postseason success, appearing in only one October, though they won the World Series that season. Overall, the Marlins came out with a .479 win percentage based on run differential, which is middling and generally not playoff-worthy. The actual team did compete for Wild Cards in five seasons, but was mostly eliminated by September and overachieved in many of those years anyway. So while the A's showed postseason consistency for half the decade, the Marlins held tightly to an almost non-existent budget and used mostly rookies to play mediocre baseball.

– It also should come as no surprise that the Yankees and Red Sox both topped the win percentage standings and were among the leaders in marginal dollars per run. The Yankees led the way, spending $328K per marginal run, far and away the largest total. But can they really be blamed for spending that sort of money? The Yankees made the playoffs in nine of the ten years and won two World Series, though both came at the start of the sample decade. The Red Sox similarly won two World Series near the middle and end of the decade and were not shy with their money either, spending $238K per run. There is a large gap between the success of these teams and the others; while the Yankees and Red Sox posted win percentages of .575 and .570 respectively, only two other teams, the Braves and the A's, were above .550. So money can indeed buy you wins, and more often than not a lot of them.

However, the correlation is shaky at best. This is where the personnel people at the top of the Yankees and Red Sox organizations shine, as they often find the right people to throw their copious coffers at. The Yankees rank first in marginal dollars per run and the Sox fourth, but the two teams between them, the Mets and Orioles, were not nearly as impressive. Despite spending $2.7M per marginal win, the Mets mustered only a .518 win percentage, good for about three games over .500 each season. The Orioles, in an attempt to "keep up with the Joneses," spent lavishly and poorly, wasting $2.49M per win on a pathetic .452 win percentage. It looks as if the team is finally rebuilding now after years of struggling in the AL East. Certainly, the last ten years are ones this team would best forget.

– The average marginal dollar cost per win over the ten-year span was $1.88M. Dave Cameron had another piece, discussing the final piece of the win values puzzle, in which he pegged the average salary cost of a win in 2008 at $2.31M. Using first-order wins rather than actual wins, I have my 2008 figure at $2.23M. The fact that the ten-year average sits well below the 2008 figure makes it clear that the market has become more and more flooded with salary money as the decade has passed, even after correcting for inflation. In 1999, the average cost of a win was $1.25M; in 2003 it was $1.94M. Payrolls are as top-heavy as ever, with players getting more and more money. Can a small-market strategy like the Marlins', Rays', or Athletics' work with teams paying more marginally per win? In 2008, 16 of 30 Major League teams spent less per marginal run than the league average, which sounds reasonable enough. However, four of those teams were right around $2.0M per win, meaning that the majority of teams are still attempting to eke out their wins for less than $2.0M apiece and thus depending heavily on rookies. Some of those teams found success and some didn't, likely a product of scouting and personnel; the strategy itself seems to matter less than the front office executing it. Teams like Pittsburgh were notorious for spending little and spending wastefully when they did, while the A's have always been effective with their spending.

I think these numbers generally reflect what most people thought of the various teams throughout the period. Still, it's interesting to see the actual figures and compare them. In the next installment of this piece, we'll look at some of the best and worst seasons by marginal dollars per run/first-order win from the first half of the decade.