06-17-2012, 11:16 AM   #1
Mdubb23
Hall Of Fame
 
Join Date: Aug 2008
Location: Completing the point with a shoulder-high punch into the open court.
Posts: 1,694
Overthinking Simple Math

Purely out of my own curiosity, I was running through some math in my head this morning.

If someone has a 40% success rate at some activity and then attempts two more trials, the first a failure and the second a success, what is flawed in the following logic:

When the failure occurred, there were fewer total trials than when the success occurred, meaning the failure constituted a larger percentage of the total trials than the success did, so the overall average should drop by more than the success brings it back up?

I realize this is inherently false; the total average will always increase. If you are earning a 40% in a class and then score a 50% on a two-question quiz, your grade will go up regardless of the order in which the right and wrong answers came.
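
To put actual numbers on it: the 40% has to come from some record, so purely for illustration say it's 4 successes in 10 trials (pick any other record below 50% and the same pattern holds):

Code:
# Assumed starting record (purely illustrative): 4 successes in 10 trials.
successes, trials = 4, 10
print(successes / trials)    # 0.400 -- starting average

# Failure first: the trial count grows but the success count doesn't.
trials += 1
print(successes / trials)    # ~0.364 -- average dips below 40%

# Then the success.
successes += 1
trials += 1
print(successes / trials)    # ~0.417 -- average ends up above 40%

The dip is real, but the rebound more than cancels it.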

The most apt explanation I've been able to come up with is not to look at the 1/2 as a 50%, but instead to look at the 0/1 and the 1/1 separately, as a 0% followed by a 100%. Because the 0% is closer to the average (40%) than the 100% is, the 100% has a greater effect on the average. In a set of data, an outlier will always affect the average more than a value that falls close to it, no matter the order in which the two trials were conducted.
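
One way to sanity-check that: if the record is S successes in N trials, then one failure plus one success lands you at (S + 1)/(N + 2) no matter which came first, and that beats S/N whenever S/N is below 50%. A quick sketch, again with the made-up 4-of-10 record and a throwaway helper function just for illustration:

Code:
# Illustrative helper: apply a sequence of outcomes (1 = success, 0 = failure)
# to a starting record and return the new average.
def after_trials(successes, trials, outcomes):
    for outcome in outcomes:
        successes += outcome
        trials += 1
    return successes / trials

print(after_trials(4, 10, [0, 1]))   # failure then success -> ~0.417
print(after_trials(4, 10, [1, 0]))   # success then failure -> ~0.417

Both orders land on 5/12, so whatever explanation we settle on can't depend on which trial came first.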

But what makes that logic more valid than my earlier 'logic'?