
There are countless factors that come into play when you find yourself in a position to negotiate, and in a perfect world, coming to the table with an ironclad argument steeped in logic and reason would be enough to make whoever you’re dealing with see things from your point of view and consequently agree to all of the terms that you’ve proposed.
Sadly, the world is far from perfect, and as a result, you’re usually going to end up with jackshit unless you’re also armed with some leverage.
The NCAA has traditionally been thrilled to be able to use this reality to its advantage on the countless occasions people have argued the organization should allow student-athletes to profit off of their likeness, as it can simply respond “Well, they can always go and play somewhere else” before pausing to add “Oh, that’s right…they can’t!” while laughing maniacally and running away to dive into a swimming pool full of gold coins.
However, the winds have recently begun to shift, as multiple states have passed legislation designed to allow players to leverage their talents into cold, hard cash, and the G League has restructured its approach to contracts to give the nation’s best basketball prospects some serious incentive to not bother with the “one-and-done” nonsense.
As a result, the NCAA has begun to explore possible compensation strategies, but if history is any indication, it’ll take its sweet damn time doing so. That still leaves football players in a bit of an awkward situation, as they don’t really have any viable alternative after the XFL shit the bed once again.
One of the biggest unanswered questions concerning this issue is this: if players should be allowed to get paid (which they should), how do you determine how much they should make? Well, I’m sure glad you asked, because there’s a new study that may be able to provide us with a little bit of insight.
That’s essentially the question Ohio State University economics professor Trevon Logan set out to answer when he sat down to analyze every high school football recruit evaluated by Rivals from 2002 to 2012 to determine the tangible value they provided based on their contributions to the program they played for.
Using math I won’t even attempt to pretend I totally understand, Logan was able to determine the number of “wins per season” players were responsible for and attach a dollar figure to that number. He concluded 5-star recruits generated around 0.437 WPS worth an average of $650,000, making them far more of an asset than the typical 4-star guy who “only” made his school $350,000.
3-star players also contributed a solid $150,000, but things get interesting once you drop down to the next tier, as Logan estimates teams are shooting themselves in the foot to the tune of $13,000 per season for every 2-star athlete taking up a spot on the roster.
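If you want a rough feel for the arithmetic at play, here’s a quick back-of-the-envelope sketch in Python. It simply takes the dollars-per-win rate implied by the 5-star figures above (roughly $650,000 divided by 0.437 WPS), assumes it applies to every tier and that value scales linearly, and works backward from the dollar amounts. That’s my simplification for illustration, not Logan’s actual model.

```python
# Back-of-the-envelope sketch of the wins-per-season (WPS) arithmetic.
# The dollars-per-win rate is inferred from the 5-star figures above;
# the linear scaling across tiers is an assumption, not the study's model.

DOLLARS_PER_WIN = 650_000 / 0.437  # implied ~ $1.49M per additional win (assumption)

recruits = {
    "5-star": 0.437,                        # WPS figure cited in the article
    "4-star": 350_000 / DOLLARS_PER_WIN,    # implied WPS if value scales linearly
    "3-star": 150_000 / DOLLARS_PER_WIN,
    "2-star": -13_000 / DOLLARS_PER_WIN,    # negative: costs the team wins
}

for tier, wps in recruits.items():
    print(f"{tier}: {wps:+.3f} WPS ≈ ${wps * DOLLARS_PER_WIN:,.0f} per season")
```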
Of course, I doubt the NCAA will pay much attention to any of this, and if it does, I can only assume it will lead to the organization telling high school players with the same rating as a rest-stop McDonald’s that they now have to pay if they want to play football.