Originally Posted by jaydubbs15
Thank you Chris for being so incredibly responsive, articulate, and helpful! TW is lucky to have you. I thought that was what was going on with reviews, i.e., that the final score meant more than the individual scores and wasn't the mathematical average of all of the scores of individual characteristics.
I recommend the following to be helpful to consumers and all concerned: simply have your product testers rank the products with each review. For example, if a new product comes out, at the end of the review each of the testers would say that he/she would rank this product 1st, 2nd, 3rd, etc., behind other named products (current offerings as well as past offerings) and state why. Sometimes the reviewers do just that and it's very helpful. So perhaps product review rankings (and updated rankings) would be more helpful than raw scores, which, as you point out, vary from reviewer to reviewer; a reviewer's tastes change over time, he/she can be whimsically more or less generous with scores, and the reviewers themselves may change with each product reviewed.
As an aside, it's also odd to me that some products are not reviewed at all. I play USTA league tennis in Metro NYC. The most popular tennis shoe among league players, hands down, is the Prince T22 because it is so good in virtually every category, including ventilation, weight, durability, stability, and price. It's a great shoe, and even better because of its great value. Yet this shoe has not even been reviewed by TW. I may not have given this shoe a try were it not for the fact that SO many league players wear them.