June 15, 2010
Prospectus Hit and Run
Year of the Pitcher?
In the wake of Ubaldo Jimenez's no-hitter, two perfect games, Armando Galarraga's near-perfecto and Stephen Strasburg's sizzling debut, we've been treated to a spate of "Year of the Pitcher" articles, even from within the BP fold. While none of the pundits have gone so far as to suggest that it's 1968 all over again—Jimenez's Bob Gibson-esque 1.16 ERA notwithstanding—the collective wisdom of such pieces suggests that the pendulum that has favored offense for the better part of the past two decades has finally swung in the other direction.
If you've been reading this column for any significant length of time, you know I'm generally quick with attempts to debunk such early-season claims. I already took my first stabs at tempering the breathlessness regarding the "Year of the Pitcher" via Twitter a couple of weeks back—scoring levels have decreased a couple of ticks since then—and last week, I scratched out some figures showing that this year's wave of no-hitters and perfect games was essentially a random occurrence. Nonetheless, it's time to zoom in for a closer look and see if such claims have any merit.
Like George Washington, I cannot tell a lie: major-league scoring is indeed down this year. Through Sunday it had decreased by 3.1 percent relative to last year's full-season rate. A look at recent history:
Taken together, the two leagues are at the lowest level of scoring since 1992, the year prior to the expansion that brought the Colorado Rockies' high-altitude hitter's haven into the National League. Furthermore, scoring rates are in decline for the fourth straight year, a trend not seen since 1939-43, when scoring declined for five straight years, aided in part by the introduction of the infamous balata ball in 1943 (ask Steven Goldman about that next time you're sitting on his lap for story hour).
Offhand, three potential explanations for scoring sinking below last year's level come to mind, though whether they hold any water is another story: the small early-season sample, the onset of interleague play, and the Twins' move from the Metrodome to Target Field.
Examining scoring trends through a similar number of games over the past decade (cutting off at the date on which the total number of games played is closest to the 949 played through this past Sunday), we find relatively little difference between the segment from Opening Day through mid-June (the "Early" column below) and the segment from mid-June through the end of the year (the "Rest"):
The data isn't exactly clear-cut. Rest-of-season scoring levels have actually decreased relative to early-season levels in six of the past 10 years. They've risen slightly overall, though the magnitude depends upon the arbitrary line one uses to decide what constitutes the recent past; if we exclude 2000, when overall scoring rates reached their post-World War II peak, the margin becomes +0.03 runs. Limit the data to the last five years and the gain is +0.07 runs. In any event, it's hard to support the case that this season's drop is a small-sample illusion caused by not factoring in the warm-weather months, particularly when this year's scoring rates have actually fallen from April (4.55 runs per game) to May (4.44) to June (4.42).
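The bookkeeping behind that early/rest comparison is simple enough to sketch. Assuming game-by-game combined run totals in chronological order (the function name, its arguments, and the sample data below are illustrative, not the actual query behind the table):

```python
def early_vs_rest(combined_runs, cutoff_games):
    """Compare per-team scoring before and after an early-season cutoff.

    combined_runs: runs scored by both teams in each game, in date order.
    cutoff_games: size of the 'Early' sample (e.g. the ~949 games
    played through mid-June).
    Returns (early R/G, rest R/G, rest minus early).
    """
    early = combined_runs[:cutoff_games]
    rest = combined_runs[cutoff_games:]
    # Divide combined runs by 2 to express scoring per team per game.
    early_rpg = sum(early) / len(early) / 2
    rest_rpg = sum(rest) / len(rest) / 2
    return early_rpg, rest_rpg, rest_rpg - early_rpg

# Toy data: four high-scoring early games, four quieter late ones.
early_rpg, rest_rpg, diff = early_vs_rest([10] * 4 + [8] * 4, 4)
print(early_rpg, rest_rpg, diff)  # 5.0 4.0 -1.0
```

Run over each season's actual game logs, the `diff` column is what the table's "Rest" minus "Early" margin reports.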
Turning to the question of interleague play, first let's examine the separate league trends:
For all of the dominance that Jimenez, Strasburg, Roy Halladay, and their Senior Circuit brethren have shown, NL scoring has budged less than 1 percent from last year's mark, a wholly unremarkable result. Meanwhile, AL scoring has dropped like a Bert Blyleven curveball, showing the sharpest decrease either league has seen since 2001, albeit to a rate much more in line with the current NL level. Note that these rates are from the standpoint of the hitters. Measured from the pitchers' point of view, the NL's decrease would read as 1.3 percent, from 4.49 runs per game allowed last year to 4.43 this year, and the AL's decrease would read as 4.6 percent, from 4.75 runs per game allowed to 4.53.
Does the drop owe anything to interleague play? The AL has dominated such contests in recent years, winning at a .566 clip from 2005-09, outscoring the NL by 0.76 runs per game, and scoring 5.06 runs per game, five percent higher than their intraleague rate of 4.82 runs per game. The gap narrowed last year, as the AL won at a .548 clip, outscoring the NL by an average of 0.58 runs per game, albeit at a rate (4.79 runs per game) slightly lower than AL intraleague contests. Thus far this year, the two leagues have split the first 84 contests, one-third of the interleague slate. The AL has exactly matched last year's rate of 4.79 runs per game in interleague play, outscoring the NL by 0.33 runs per game despite the 42-42 split. Perhaps with more interleague play, the AL's rate will rise a bit.
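Tallying interleague results like those is just a matter of walking the line scores. A minimal sketch, assuming per-game (AL runs, NL runs) pairs (the function name and sample games are hypothetical):

```python
def interleague_summary(games):
    """games: list of (al_runs, nl_runs) tuples, one per interleague game.

    Returns the AL winning percentage, AL runs per game, and the AL's
    per-game run margin over the NL.
    """
    al_wins = sum(1 for al, nl in games if al > nl)
    al_rpg = sum(al for al, nl in games) / len(games)
    nl_rpg = sum(nl for al, nl in games) / len(games)
    return al_wins / len(games), al_rpg, al_rpg - nl_rpg

# Toy slate of four games, split 2-2 but with the AL outscoring the NL.
pct, al_rpg, margin = interleague_summary([(5, 3), (2, 4), (6, 1), (0, 3)])
print(pct, al_rpg, margin)  # 0.5 3.25 0.5
```

Run over the full 2005-09 interleague schedule, a tally like this is what produces figures such as the .566 clip and +0.76 margin cited above.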
As for the new park in Minnesota, teams at Target Field are averaging just 4.29 runs per game, 20th in the majors. Last year at the Metrodome, they scored 5.06 per game, fifth in the majors. Control for those changes and the majors' 3.1 percent decrease in scoring is trimmed to 2.6 percent. It's worth noting that six of the seven most recent ballparks to open—Target Field, Citi Field, Nationals Park, Citizens Bank Park, Busch Stadium III, and Petco Park, with Yankee Stadium III the exception—are playing host to below-average scoring this year. Looking at the multi-year Park Factors in our True Average report, only the Phillies (1007), Nationals (1006), and Yankees (1001) ballparks are above average when it comes to scoring; the average of those seven parks is 975, meaning that they decrease scoring by 2.5 percent—something to consider when taking a longer view of scoring than just year-to-year changes.
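For reference, those park factors sit on a scale where 1000 is neutral, so converting one to a percent effect on scoring is simple arithmetic. A minimal sketch, plugging in the values quoted above (the helper name is mine):

```python
def pct_effect(park_factor):
    """Convert a 1000-scale park factor into a percent effect on scoring."""
    return (park_factor - 1000) / 10.0

# The seven newest parks average a 975 factor, i.e. 2.5 percent below average:
print(pct_effect(975))   # -2.5
# Citizens Bank Park, at 1007, plays slightly hitter-friendly:
print(pct_effect(1007))  # 0.7
```

That built-in scoring suppression is why the wave of recent ballparks matters when comparing today's run environment to the early 1990s rather than just to last season.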
So, yes, scoring has fallen: not only from last year, for reasons not adequately explained by the theories I've offered, but down to levels not seen since 1992. Since then, Major League Baseball has undergone some dramatic changes, expanding from 26 to 30 teams, building new ballparks for all but eight of those teams, and splitting into three divisions with wild cards and interleague play, with steroids and perhaps ball doctoring muddying the mix as well. Increased steroid testing is often cited as one reason scoring levels have dropped. Perhaps over time that's true, but we do know via the Mitchell Report and other investigations and suspensions that pitchers were juicing alongside hitters, and in any event, no particular inflection point lines up the drug-policy changes with changes in scoring levels.
If there's one piece of evidence to hint that we have entered some New Golden Age of Pitching, it's overall strikeout rates. Currently, batters are whiffing in an all-time-high 18.1 percent of all plate appearances, a tenth of a percentage point higher than last year, a full percentage point higher than in 2007, and more than two percentage points higher than in 1994. Meanwhile, BABIPs—the rise of which has generally gone hand-in-hand with scoring rates—are down a point from last year, though the current rate fits comfortably within what we've seen in recent years:
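For the record-keeping behind those two rates, a sketch using the standard formulas (the sample figures in the comments are invented, not this season's actual totals):

```python
def strikeout_rate(strikeouts, plate_appearances):
    """Percentage of plate appearances ending in a strikeout."""
    return 100.0 * strikeouts / plate_appearances

def babip(hits, homers, at_bats, strikeouts, sac_flies):
    """Batting average on balls in play: non-homer hits divided by
    at-bats in which the ball was put in play, plus sacrifice flies."""
    return (hits - homers) / (at_bats - strikeouts - homers + sac_flies)

# A batter striking out 181 times in 1,000 PA whiffs at this year's
# league-wide clip:
print(strikeout_rate(181, 1000))  # 18.1
# Invented line: 30 H, 5 HR, 100 AB, 20 K, 5 SF.
print(babip(30, 5, 100, 20, 5))   # 0.3125
```

The point of pairing the two: strikeouts never become balls in play, so a rising strikeout rate can trim scoring even while BABIP holds steady.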
One of the most telltale indicators regarding scoring levels is what paleo-sabermetrician Eric Walker has referred to as Power Factor, the rate of total bases per hit. Since World War II, the correlation between Power Factor and per-game scoring is a hefty .77; since 1969, it's an even heftier .91. This year's figure is the lowest seen since 1995:
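The computation is straightforward to sketch: Power Factor is total bases per hit, and the correlation cited is a plain Pearson correlation between the yearly Power Factor and runs-per-game series. The sample inputs below are invented, not the historical data:

```python
def power_factor(singles, doubles, triples, homers):
    """Eric Walker's Power Factor: total bases per hit."""
    hits = singles + doubles + triples + homers
    total_bases = singles + 2 * doubles + 3 * triples + 4 * homers
    return total_bases / hits

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Invented hit breakdown: 70 1B, 20 2B, 5 3B, 5 HR -> 145 TB on 100 hits.
print(power_factor(70, 20, 5, 5))  # 1.45
```

Feeding `pearson` the season-by-season Power Factor and runs-per-game figures since 1969 is what produces the .91 correlation cited above.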
The ratio is still much closer to the levels we've seen from 1994 onward than it is to those of the early 1990s, the last time scoring levels were below 4.6 runs per game, suggesting that the fundamental conditions under which the major-league game is being played haven't changed significantly. So color me skeptical as to whether this year's numbers represent anything terribly noteworthy beyond a bunch of eye-catching early-season pitching performances, as enjoyable as they might be. Scoring will likely wind up below 2009 levels, but I don't think we're headed for 1968 or even 1992 levels anytime soon. Bob Gibson can breathe easy.