A few weeks ago I had a Twitter conversation with the former host of QDT’s Modern Manners Guy, Trent Armstrong, in which Trent said something along the lines of: “I've always wondered how to tell how fast a football player is going based upon how quickly they run from, say, the 50 to the 40 yard line.” And I thought: “That’s a great question!”—especially since the American football season is just getting started. So, that’s exactly what we’re going to talk about today: How can you gauge how fast a football player is running by watching them pass the lines?
How to Calculate Speed
First off, let’s talk about what speed is in the first place. Speed is just a measure of how fast something or someone is moving. We usually measure it in miles-per-hour or kilometers-per-hour, and those units pretty much tell you everything you need to know about speed. Namely, if you are traveling at a given speed—say 55 miles-per-hour—and you travel for 1 hour, then after that hour you will have traveled 55 miles: 55 miles-per-hour x 1 hour = 55 miles.
So how is speed actually calculated? It’s pretty easy. In fact, the units you measure speed in—miles-per-hour—almost give away the answer. All you have to do is divide the total distance traveled (the “miles” part) by the total amount of time it takes to do the traveling (the “hours” part), and the number you get is your average speed over the course of the trip.
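If you’d like to see that division spelled out, here’s a quick Python sketch using the 55 miles-per-hour example from above (the variable names are just for illustration):

```python
# Average speed is total distance divided by total time.
distance_miles = 55.0  # total distance traveled
time_hours = 1.0       # total time spent traveling

speed_mph = distance_miles / time_hours
print(speed_mph)  # 55.0 miles-per-hour
```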
How to Estimate How Fast Someone is Running
So, how can we estimate how fast someone is running? And in particular, how can we estimate the speed of an American football player running down the field? Well, as I described earlier, speed is calculated by dividing the distance someone or something travels by the amount of time it takes them to do the traveling—that’s miles-per-hour. And, as it turns out, a football field is fortuitously well suited for this sort of measurement because it has gigantic, unmistakable lines drawn across it every 10 yards. All you have to do, therefore, is time how long it takes a player to run the 10 yards from one line to another—say from the 30 to the 40 yard line—and then divide that 10 yards by the measured time in seconds.
But how should you measure that time? Well, you could be really accurate and use a stopwatch. Or, if an estimate is good enough for you, you could just estimate the number of seconds—or fractions thereof—in your head. If you want even more accuracy, and the player is running far enough, you could measure the time it takes him to run 20, 30, or more yards—so that the length of time you measure will be longer than one second—and then divide the total distance the player ran by the total time you estimated. What will this tell you? Well, it’ll tell you the player’s average speed (that is, the total distance divided by the total time) over the stretch you were timing him, and this speed will be in units of yards-per-second because that’s what we’re measuring—how many seconds it took to run 10, 20, 30 or more yards. But those probably aren’t terribly meaningful units to you since we usually measure speeds in miles-per-hour. Can’t we turn this into something a bit more sensible?
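Here’s what that longer timing looks like in a short Python sketch. The 30 yards and 3.3 seconds are made-up numbers, just to show the division:

```python
# Hypothetical timing: a player covers 30 yards in 3.3 seconds.
distance_yards = 30.0
time_seconds = 3.3

# Average speed in yards-per-second = total distance / total time.
speed_yards_per_second = distance_yards / time_seconds
print(round(speed_yards_per_second, 2))  # 9.09 yards-per-second
```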
How to Convert Speed Into Units You’re Familiar With
Of course we can—we just need to use a little unit conversion magic. So, our problem is to convert 1 yard-per-second into some number of miles-per-hour. Which means that we need to convert the distance units in the numerator from yards into miles, and we also need to convert the time units in the denominator from seconds into hours. What do we know about the connection between these units? For distances we know (or we asked Google to tell us) that there are 3 feet-per-yard and 5,280 feet-per-mile. And for time we know that there are 3,600 seconds-per-hour (that’s 60 seconds-per-minute x 60 minutes-per-hour). Combining these together, we find that 1 yard-per-second x 3 feet-per-yard ÷ 5,280 feet-per-mile x 3,600 seconds-per-hour ≈ 2 miles-per-hour.
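That chain of conversions is easy to check with a few lines of Python:

```python
# Conversion factors from the text.
FEET_PER_YARD = 3
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600  # 60 seconds-per-minute x 60 minutes-per-hour

# 1 yard-per-second expressed in miles-per-hour:
# yards -> feet -> miles in the numerator, seconds -> hours in the denominator.
mph_per_yard_per_second = FEET_PER_YARD / FEET_PER_MILE * SECONDS_PER_HOUR
print(round(mph_per_yard_per_second, 3))  # 2.045, i.e. about 2 miles-per-hour
```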
Okay, but 1 yard-per-second is more like a walking pace—it’s definitely slower than players run. A more reasonable case might be that you watch a player run 10 yards—from the 30 to the 40 yard line—in one second. So how fast would that be? Well, all we have to do is multiply 2 miles-per-hour by 10 (since the player is now moving 10 times farther in that same 1 second time period).
So, if you time a player running 10 yards in one second, that means that they’re running at about 10 x 2 = 20 miles-per-hour—which is actually pretty close to the fastest speeds that football players run. In other words, when a player is running flat-out fast, they should cover about 10 yards each second. Now, what if the player is jogging and you time that they run those 10 yards in 2 seconds instead of 1? That just means that they’re running half as fast (since they went the same distance in twice as much time)—in other words, they must be running at 20 / 2 = 10 miles-per-hour. That’s all there is to it!
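Putting the whole recipe together—time a run, divide, convert—here’s a small Python function you could use for any timing you measure (the function name is just something I made up):

```python
# About 2.045 miles-per-hour for every yard-per-second,
# from 3 feet-per-yard, 5,280 feet-per-mile, and 3,600 seconds-per-hour.
YARDS_PER_SEC_TO_MPH = 3 / 5280 * 3600

def mph(yards, seconds):
    """Convert a timed run (distance in yards, time in seconds) to miles-per-hour."""
    return yards / seconds * YARDS_PER_SEC_TO_MPH

print(round(mph(10, 1), 1))  # a flat-out sprint: about 20.5 miles-per-hour
print(round(mph(10, 2), 1))  # a jog, half as fast: about 10.2 miles-per-hour
```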
Wrap Up
And that’s all the math we have time for today. Now, the next time you’re watching a football game, you’ll have a whole new perspective on just how fast those players are running.
Please email your math questions and comments to..............You can get updates about the Math Dude podcast, the “Video Extra!” episodes on YouTube, and all my other musings about math, science, and life in general by following me on Twitter. And don’t forget to join our great community of social networking math fans by becoming a fan of the Math Dude on Facebook.
Until next time, this is Jason Marshall with The Math Dude’s Quick and Dirty Tips to Make Math Easier. Thanks for reading, math fans!