Jumping right in today, reader Benjamin Lauderdale sent this question Will Carroll's way; he was kind enough to forward it to me:
I have never seen an analysis of the effects of consecutive games in baseball, and recent Cubs history makes me wonder whether it might be important. You may recall that the 20+ consecutive games at the end of last season culminated in an ugly finish, and the long stretch just finished ended with losing three of four to the D-Backs. Obviously this is anecdotal; however, it should be easy for someone to compile the records of all the teams in baseball as a function of how many consecutive days the team has played. If there is a steep falloff in record at 10+ or 15+ games, that would have implications for the fairness of season schedules.
One could extend this analysis by looking at the relative rested-ness of the two teams in each game, which might make the effect clearer if it exists.
Thanks for the question, Benjamin; it's a good one. I explored a similar angle back in March, checking to see if teams coming off extra-inning games or doubleheaders showed any notable decline in their next game. The results were inconclusive, mostly due to sample size, but this approach compiles a much larger sample of games, even from a single season, so we can put a great deal more confidence in the data.
First, let's look at those 2004 Cubs. As many of us can attest, memory is a fallible tool. The Cubs fell out of the wild card last year by losing seven of ten to the Mets (71-91), Reds (76-86), and Braves (96-66), not the D-Backs, but those 10 contests came at the end of 26 games in 24 days, the product of three consecutive rainouts in early September that gave the Cubs four days off in a row. The Cubs certainly appear to have been the victims of fatigue in this case: they won 12 of the first 15 games of the stretch before collapsing to the aforementioned 3-7 finish.
Of course, this could have just been a bad stretch of games for the Cubbies. To check, let's see how the entire league did in 2004 when broken down by how many games a team had played since its last off day.
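Before looking at the results, it's worth pinning down the bookkeeping. Here's a minimal sketch in Python of how one might derive that "games since last off day" count from a team's game log; the column names and the tiny sample are invented for illustration, not the actual data behind this article:

```python
import pandas as pd

# Hypothetical game log for one team: one row per game, in date order.
# Column names (date, won) are assumptions for this sketch.
games = pd.DataFrame({
    "date": pd.to_datetime(["2004-09-24", "2004-09-25", "2004-09-26",
                            "2004-09-28", "2004-09-29"]),
    "won": [1, 0, 0, 1, 0],
})

def games_since_off_day(dates):
    """0 = first game after an off day; doubleheaders count as consecutive."""
    streaks, run, prev = [], 0, None
    for d in sorted(dates):
        # A gap of more than one calendar day means at least one day off.
        run = run + 1 if prev is not None and (d - prev).days <= 1 else 0
        streaks.append(run)
        prev = d
    return streaks

games = games.sort_values("date")
games["since_off"] = games_since_off_day(games["date"])

# League-wide, you'd stack every team's log and bucket by since_off.
print(games.groupby("since_off")["won"].agg(wpct="mean", n="size"))
```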
In 2004, the minimum was obviously zero (the day immediately after an off day), while the longest stretch saw a team play 29 games in a row without a break: the Marlins to finish the season, the victims of those aforementioned rainouts against the Cubs in early September. Here's how they did:
As we'd expect with a shrinking sample, the range of results fans out dramatically as the data reaches the outliers. The trendline is therefore skewed, because equal weight is placed on single games (toward the right of the graph) and on the thousands of games at the left. Let's look at things again after removing every point backed by fewer than 50 games.
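That cutoff is a one-liner once the buckets are tabulated; a sketch, again with invented numbers standing in for the real 2004 buckets:

```python
import pandas as pd

# Hypothetical per-bucket summary: winning percentage and game count for
# each "games since last off day" value. The figures are placeholders.
by_streak = pd.DataFrame(
    {"wpct": [0.501, 0.498, 0.503, 0.444, 1.000],
     "n":    [2404, 2330, 2251, 44, 2]},
    index=pd.Index([0, 1, 2, 24, 28], name="since_off"),
)

# Drop thinly populated buckets so a handful of marathon stretches
# can't swing the trendline.
trimmed = by_streak[by_streak["n"] >= 50]
print(trimmed)
```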
Suddenly that slight downward trend is gone, and we're left with almost exactly a .500 winning percentage across the graph. Expanding the data back through 1995 shows very similar trends; teams don't diverge from .500 by more than .025 points until 21 games from their last off day (a sample of only 40 games). Even if we pool every game beyond the 21st in a row into a single data point to ease the sample-size problem, those teams have gone 104-105 since 1995, a .498 winning percentage. From this perspective, it appears that teams don't fare any worse as their distance from their last rest day increases.
However, we have not considered the last point made by reader Benjamin: the "relative rested-ness" of the teams involved in any game. As anyone flipping around the television dial on a Monday night knows, certain days of the week are light on baseball as teams travel the country. This past Monday, for example, there were only four of a possible fifteen games. Because off days tend to fall on the same dates league-wide, the two teams in a given game have usually played the same number of games since their last rest day, and two equally rested teams will, by definition, combine for a .500 record. That alone would keep the raw results pinned near .500.
To check this out, we can subtract a team's games since its last rest day from its opponent's. Call it "Rest Advantage" for short; the higher the number, the more recently the team in question has had a day off relative to its opponent. If teams see an edge over opponents who've had to play a longer string of games leading into the current contest, we should see winning percentage climb as Rest Advantage increases.
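In code, the bookkeeping is one subtraction per game; the rows below are invented for illustration:

```python
import pandas as pd

# One row per team-game: each side's games since its last off day, plus
# the outcome. All values here are hypothetical.
rows = pd.DataFrame({
    "team_since_off":     [0, 3, 7, 12],
    "opponent_since_off": [5, 3, 2, 12],
    "won":                [1, 0, 1, 0],
})

# Rest Advantage: opponent's streak minus ours. Positive means the
# opponent has gone longer without a day off than we have.
rows["rest_advantage"] = rows["opponent_since_off"] - rows["team_since_off"]

summary = rows.groupby("rest_advantage")["won"].agg(wpct="mean", n="size")
print(summary)  # the article keeps only values backed by 100+ games
```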
Let's see how things shake out, limiting the data to points that include at least 100 games:
Not unlike the raw data earlier, this set indicates that there is no inherent advantage to having had the more recent rest day. While the trendline does show a small rise as Rest Advantage increases, the r-squared (the coefficient of determination) is so small that we cannot place much faith in the trend at all. Furthermore, even if the trend were real, the difference between the two far extremes of the trendline is a mere .028 points of winning percentage, an edge much smaller than typical home-field advantage.
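For reference, here's how such a trendline and its r-squared could be computed; the winning percentages below are placeholders, not the actual values behind the chart:

```python
import numpy as np

# Winning percentage at each Rest Advantage value (invented placeholders).
rest_adv = np.arange(-4, 5, dtype=float)
wpct = np.array([0.492, 0.505, 0.497, 0.501, 0.500,
                 0.499, 0.503, 0.498, 0.506])

# Least-squares line, then r-squared from residual and total variance.
slope, intercept = np.polyfit(rest_adv, wpct, 1)
pred = slope * rest_adv + intercept
r_squared = 1 - np.sum((wpct - pred) ** 2) / np.sum((wpct - wpct.mean()) ** 2)

# An r-squared near zero means Rest Advantage explains almost none of
# the variation in winning percentage.
print(f"slope={slope:.4f}, r^2={r_squared:.3f}")
```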
Getting back to those Cubbies last year, as much as it would be nice to
blame the unforgiving schedule or Hurricane Frances in Florida, there
just isn’t any evidence that teams fare worse as they play more and more
games without a break. Considering that MLB schedulers are prevented
from scheduling more than 20 consecutive games without an off day
anyway, there’s no reason to think that some teams get lucky or receive
preferential treatment when it comes to the schedule. Instead, the
conspiracy theorists will have to stick to the travel schedule.