The people have spoken. When I ended my last article with the conclusion that the gap between high-school and college players in the draft has almost completely evaporated, I expected to spend the next few days figuring out for myself the reasons for the change. Fortunately, you did that for me.
Reader after reader responded with their own theory as to what could cause teams to do a significantly better job of drafting high-school talent, even as they drafted more high-school players. And each response looked frighteningly like the last: it's the signing bonuses, stupid.
In 1988, #1 overall pick Andy Benes received the highest signing bonus for an amateur player in baseball history, breaking Rick Reichardt's 24-year-old record. Benes signed for $235,000.
By 1991, the #1 overall pick–the ill-fated Brien Taylor–signed for $1.55 million.
That, in a nutshell, is what changed the equation of the draft. Signing bonuses exploded in the late '80s and early '90s. The average first-round signing bonus nearly tripled between 1989 and 1992, and increased more than tenfold between 1989 and 1999, from $176,000 to $1.81 million.
Why bonuses went up so much, so fast is not something I can adequately answer in a paragraph, but a variety of factors were in play. The general increase in revenues left baseball owners flush with cash and more willing to spend money in order to make money. Aided by the emergence of super-agents, drafted players became more aware of their leverage and more than willing to use it. Once Taylor and Todd Van Poppel had shown that teams were willing to shell out seven-figure bonuses, the dam broke.
Teams were willing to pay for the best talent, because they began to realize that even at inflated prices, signing amateur players was still far more cost-effective than throwing money at free agents. Whereas ten years earlier teams had been perfectly willing to let high-school players walk away over a small disagreement about compensation, now they were willing to shell out the dough. After all, many of the players who headlined the historic college crops of the mid-'80s–Barry Larkin, Will Clark, Barry Bonds, etc.–were very highly regarded high-school players, but chose to go to college because the money wasn't there. Bonds, famously, was a second-round pick of his hometown Giants in 1982, but didn't sign over a difference of six thousand dollars.
Of course, many top high-school players didn't choose college because of a spat over money–many of them went to college because they wanted a college education. When they were leaving a five-figure sum on the table–a dollar figure barely worth more than their college scholarship–it was an easy choice. When the '90s dawned and top players were being offered million-dollar bonuses to sign…suddenly the resolve of even the most college-committed player was tested.
So it might not be that major league organizations started doing a better job of identifying high-school talent. They were simply doing a better job of convincing those players to sign. Not only did this increase the pool of high-school players in the professional ranks, it also meant that college programs had less talent to draw on, thinning the pool of collegiate talent that would be draftable three years later.
That seems to be, by far, the most compelling and significant reason for the swing of the pendulum. Just to be sure, though, I spoke with Jim Callis of Baseball America for more insight; frankly, after my discovery that Callis had been right all along, I pulled an Anakin and got down on bended knee to swear my allegiance and beg for his knowledge of the dark side. (Yes, Anakin turns to the dark side. Palpatine turns out to be the Sith Lord, Padme gives birth to twins, and all the Jedi except Yoda and Obi-Wan get offed. Hope I didn't ruin the movie for anyone.)
Here are a few other factors that are much less significant than the signing-bonus issue, but, as Callis said, were all pulling in the same direction:
- MLB instituted the college scholarship plan in the 1980s, under which players signed out of high school were guaranteed to have their college tuition paid should their pro careers not pan out and they choose to attend college later. In reality, the plan is more bark than bite–it's a lot harder for a 26-year-old with a wife and kid to go back to college than it is for an unattached 18-year-old, and of course it's a lot harder to get admitted to a quality school when you can no longer offer your services on the diamond. But it may have helped tip the scales for a few high-school picks.
- The NCAA made drastic reforms around the same time, cutting scholarships by 10% (from 13 to 11.7 per school) and preventing teams from scheduling more than 56 games a year (some schools had played 70 or more games before the reforms). The NCAA also limited the hours spent coaching, barring college coaches from working with their players in the summer and eliminating the formal fall schedule. These reforms gave pro teams an opening, making it possible to convince a prospect that he wouldn't get nearly as much time to refine his skills in a college setting as he would in the pros.
- The emergence of regional showcases for high-school talent–events like the Area Code Games–has given high-school players an opportunity to compete against the best players in the country…and given scouts the opportunity to see how those players fare against top-line talent. This has been particularly beneficial in helping scouts get a read on players from cold-weather areas like the upper Midwest and New England.
These factors explain perfectly the existence of our paradox–the fact that high-school players yielded more value even as teams drafted more of them. It's precisely because teams were drafting more high-school players that their value went up. Teams were drafting the very best players available, not simply the best ones who would sign.
(Note that there's still room for improvement here. Even today, some high draft picks out of high school don't sign–remember, Mark Prior was a supplemental first-round pick of the Yankees–and others announce their intentions to go to college in advance and are all but ignored in the draft. Of course, money still talks: Grady Sizemore promised he wouldn't sign out of high school, then got an enormous bonus from the Expos (!) that changed his mind.)
The only problem with the theory, frankly, is that I know of no way to test its validity one way or the other. This is one of those theories that we’ll just have to take on faith.
Having settled the reasons behind the change in the draft, I want to tweak my final conclusion from Part Three a little. I ended the last article by stating that the difference in value between college and high school players had dropped to a mere 8%. The exact figure, however, depends on the assumptions you make.
In particular, the assumption I made that we should measure a draft pick's value over 15 years has come under some fire from readers, who (correctly) wonder why a team should care what its draft pick does 15 years from now, since that player will almost certainly have reached free agency by then. It's a fair question. Remember, the 8% figure was derived by extrapolating from data 10 years out (which is all we have at this point); if we use the 10-year data alone, college players have a much more comfortable 21% cushion.
What should the cutoff be? Theoretically, a team can hold the rights to a high-school player through Y12: he doesn't have to be added to the 40-man roster until after Y3, he can then be optioned to the minors for three more years (Y4 through Y6), and the six-year service-time clock doesn't start ticking until Y7. College-signed players have to be added to the 40-man roster after Y2, which puts their theoretical cap at Y11.
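To make the arithmetic explicit, here's a minimal sketch of the control-window math, assuming the rules as stated above (three option years after the 40-man deadline, then six years of service time):

```python
# A sketch of the control-window arithmetic described above. The rule
# parameters reflect the article's description: three option years
# after the 40-man deadline, then six years of service time.

def theoretical_control_cap(roster_year: int) -> int:
    """Last year a team can theoretically hold a draftee who must be
    added to the 40-man roster after `roster_year`."""
    option_years = 3   # years the player can be optioned to the minors
    service_years = 6  # service time required before free agency
    return roster_year + option_years + service_years

print(theoretical_control_cap(3))  # high-school signee: Y12
print(theoretical_control_cap(2))  # college signee: Y11
```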
Realistically, Y10 is a pretty fair estimate of how long we should count a player's value towards the team that originally drafted him. If we use Y10 as our cutoff, college players are still 21% more valuable than high-school players on average.
But…high-school players are younger, and take longer to develop, than college players. Is it fair to make the cutoff the same for both groups? College players reach the majors 18-24 months sooner than high-school players, so it stands to reason that they would reach free agency 18-24 months sooner as well. Suppose, then, we use two different cutoffs, two years apart–Y8 for college players, Y10 for high-school players:
Average Y8 value (college players): 5.03
Average Y10 value (high school players): 5.22
By that measure, high-school players come out about 4% ahead, and we can make the argument that they are more valuable to the teams that drafted them than college picks are.
But…there's also the issue of discounting. "Discounting" is an economic principle reflecting the fact that a dollar today is worth more than a dollar ten years from now: you can invest today's dollar and have more than a dollar in ten years, or you can spend it now and derive value from it in a way you can't with a dollar you must wait ten years to receive. Similarly, a draft pick that has value immediately is more valuable than a draft pick who will develop in five years. The Al Leiter/Steve Avery comparison holds here…Leiter has had significantly more career value than Avery, but it took him so long to realize that value that the Yankees got rid of him for pennies on the dollar, while Avery's peak occurred early in his career, when he was able to do the most good for the team that drafted him.
The exact "discount rate" that applies to baseball players is impossible to state definitively, but given the rapid attrition of ballplayers and the likelihood that they'll be traded or released if they don't develop quickly, we can say it's rather high, on the order of 10-15%. Which is to say, at a 15% rate, a player worth 1 WARP today is as valuable as a player worth 1.15 WARP next year.
By this token, a college player will have more value than a high-school player, other things being equal, simply because he develops faster. For instance, a team gets more present value out of a college player's Y3 season (when he creates 0.52 WARP) than it will out of a high-school player's Y10 season (when he creates 0.87 WARP): at a 10% rate, the former discounts to about 0.39 Y0-equivalent WARP and the latter to about 0.34, because the discount rate knocks more value off the more distant season.
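The mechanics are simple enough to sketch in a few lines. This is just the standard present-value formula, warp / (1 + rate) ** year, applied to the two seasons quoted above:

```python
# Discounting future WARP back to Y0 equivalents, using the standard
# present-value formula and the single-season figures quoted above.

def present_value(warp: float, year: int, rate: float = 0.10) -> float:
    """Value of `warp` earned `year` seasons from now, in Y0 WARP."""
    return warp / (1 + rate) ** year

college_y3 = present_value(0.52, 3)   # ~0.39 Y0-equivalent WARP
hs_y10 = present_value(0.87, 10)      # ~0.34 Y0-equivalent WARP
print(f"College Y3: {college_y3:.2f}, High-school Y10: {hs_y10:.2f}")
```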
If we use the data we arrived at in the last segment, and discount future years’ WARP data into Y0 WARP equivalents, we find the following. Using a 10% discount rate:
Average discounted Y8 value (college players): 2.84
Average discounted Y10 value (high school players): 2.47
Using a 15% discount rate:
Average discounted Y8 value (college players): 2.13
Average discounted Y10 value (high school players): 1.68
And suddenly, the college players once again enjoy an advantage of 15% in the first example, 27% in the second.
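For the skeptical, those advantage figures fall straight out of the discounted totals above:

```python
# Deriving the college advantage from the discounted totals above.
totals = {"10% rate": (2.84, 2.47), "15% rate": (2.13, 1.68)}
for label, (college, high_school) in totals.items():
    print(f"{label}: college advantage = {college / high_school - 1:.0%}")
# 10% rate: college advantage = 15%
# 15% rate: college advantage = 27%
```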
There are a lot of different ways to massage the data. I'm not smart enough to know which methods are the most valid, but then I'm not all that interested in whether the advantage enjoyed by college players is 12% or 13%. In terms of the general brushstrokes, I'm comfortable saying that college players are worth, on average, somewhere between 10% and 20% more than an equivalent high-school draftee. It's enough of a disparity that you need to be aware of it, but not so much that you need to obsess over it.
Whether a player is drafted out of high school or out of college is a factor that must be considered, but it’s only one factor of many, and unlike 20 or even 10 years ago, it’s no longer the dominant factor.
Next time, we’ll break down the data into pitchers and hitters, and see whether some of our other time-honored principles–like Thou Shalt Not Draft a High School Pitcher in the First Round–still hold true.