So I happened to flip to Fox Sports Net for a few minutes this afternoon, and they were airing their "Sports Science" show, which analyzes movements in sports. According to FSN, the fuzz on a tennis ball and wind resistance make the ball lose up to half of its initial speed on a serve by the time it reaches the opposite baseline. As best I can recall, the show explicitly stated that Andy Roddick's 155 mph serve would be traveling around 78-80 mph by the time it reached the other player.

I can't do the physics myself, but if this is true, then the difference between a 120 mph serve and a 140 mph serve would only be 10 mph by the time the ball reaches the returner: a significant difference, but much more manageable than an actual 20 mph gap. By the same logic, the difference between a 120 mph serve and a 125 mph serve would be almost negligible.

It seems like 50% speed loss is a pretty broad stat, especially when things such as spin and placement aren't accounted for. What are your thoughts?
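For what it's worth, the arithmetic above can be sanity-checked with a few lines of Python. This is only a sketch under the show's stated assumption of a flat 50% speed loss for every serve; in reality drag depends on speed, spin, and the ball itself, so the loss fraction would not actually be constant.

```python
def speed_at_baseline(serve_mph, loss_fraction=0.5):
    """Speed at the opposite baseline, assuming a fixed fractional loss.

    The 0.5 default is the figure quoted on the show, not a physical
    constant; real aerodynamic drag varies with speed and spin.
    """
    return serve_mph * (1 - loss_fraction)

for serve in (155, 140, 125, 120):
    print(f"{serve} mph serve -> {speed_at_baseline(serve):.1f} mph at the baseline")

# Gap between a 140 and a 120 mph serve once both reach the returner:
print(speed_at_baseline(140) - speed_at_baseline(120))  # 10.0
```

Under that flat-loss assumption, Roddick's 155 mph serve arrives at 77.5 mph (matching the 78-80 mph figure), and the 20 mph gap between a 120 and a 140 mph serve shrinks to 10 mph.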