Originally Posted by TW Staff
Please understand the overall score is not the average of the other scores, it is the overall impression of the shoe. Each tester provides their overall score (which can take into account fit, feel, cosmetics and other intangibles not scored in the other sections) and the average of those scores is posted as the overall.
As for the scores, they are as much opinion as the comments. The testers are not always the same from review to review, people's opinions change over time, and I have seen testers go from scoring tough to being much more generous as the years go by. When a number is assigned to something, people tend to treat it as more factual, but in the end it is simply a reflection of the findings of a group of testers at the time.
We do our best to be fair and accurate. When our reviews are glowing, we get accused of being positive just to boost sales. When we ding something that others like, we get accused of being biased, clueless or what have you. That said, the scoring is as accurate as possible, and the overall score stands alone as the average of the testers' overall-impression scores.
Hope that helps.
Thank you, Chris, for being so responsive, articulate, and helpful! TW is lucky to have you. That confirms what I suspected about the reviews: the final score means more than the individual scores and isn't the mathematical average of the scores for the individual characteristics.
I'd recommend the following as helpful to consumers and all concerned: simply have your product testers rank the products with each review. For example, when a new product comes out, at the end of the review each tester would rank it 1st, 2nd, 3rd, etc., against other named products (current offerings as well as past ones) and state why. Sometimes the reviewers do just that, and it's very helpful. Product rankings (and updated rankings) would arguably be more useful than raw scores, which, as you point out, vary from reviewer to reviewer: tastes change over time, testers can be more or less generous with scores, and the testers themselves may change from one review to the next.
As an aside, it's also odd to me that some products are not reviewed at all. I play USTA league tennis in Metro NYC. The most popular tennis shoe among league players, hands down, is the Prince T22 because it is so good in virtually every category, including ventilation, weight, durability, stability, and price. It's a great shoe, made even better by its great value. Yet this shoe has not even been reviewed by TW. I might not have given it a try were it not for the fact that SO many league players wear it.