TW Reviews-- questionable?

Discussion in 'Racquets' started by nguyenthuc, May 14, 2005.

  1. nguyenthuc

    nguyenthuc Rookie

    Dec 10, 2004
    Something just doesn't look right... By pure chance, I happened to compare the averaged scores of the TW reviews for the Babolat AeroPro Drive and the new Prince O3 Tour, and they are EXACTLY IDENTICAL in every category. Now what are the odds of that happening with a team of half a dozen playtesters or so, each with a different game and style, comparing completely different racquets from different brands? Coincidence? I'm not buying it....
  2. andirez

    andirez Rookie

    Apr 14, 2004
    Indeed, the chances of this happening are very low, which leads me to believe that they made an error in processing the data and that one of the graphs is wrong.

    A possible scenario:
    I can see something like this happening: you open the file for the AeroPro Drive and want to modify it for the O3 Tour. You save it as O3tour right away but haven't adjusted the data yet. Somebody calls you away to do something else, and when you come back you just use the graph, which is still the same as the original (AeroPro Drive).
  3. AndrewD

    AndrewD Legend

    Dec 11, 2004
    Actually, although it would seem improbable, it is very likely when you have the same group of people conducting the same test over and over again. Each tester has a particular number that they assign to a given impression (in this case: return, serve, spin, slice, etc.) and will do so, unconsciously, more often than they realize. Example: A thinks the racquet has good but not great stability, so he awards a 75. B thinks the same, but his number for the same experience is 74. Etc., etc.

    So, if the racquets perform in a relatively similar fashion, they'll end up with similar ratings and, surprising though it may be, will on occasion end up with exactly the same scores. That each person reviews a racquet with exactly the same string and tension also encourages the numbers to be consistent.

    If you think that sounds far-fetched, just have a look at the various movie review shows that have more than one reviewer for the same film. Each reviewer will invariably have one rating number to describe their experience and will continually repeat that figure. If they loved it, they'll give it a 4.5; if they merely liked it, they'll give it a 3.5. The actual variance in their scores is minimal because there isn't much variance in their experience of the film.

    All that tells you is that there is a sameness about the films or, in this case, the racquets. A truly horrendous or magnificent racquet will rate in a way totally inconsistent with the testers' normal pattern of voting.
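    AndrewD's point can be sketched numerically. A minimal, hypothetical illustration (all tester scores, categories, and numbers below are invented, not actual TW data) of how per-category averages for two different racquets can collide exactly when the raw scores cluster in a narrow band:

```python
def category_average(scores):
    """Average a list of tester scores and round to the nearest whole number."""
    return round(sum(scores) / len(scores))

# Two "different" racquets scored by the same six testers. Each tester
# drifts only a point or two between racquets, in either direction.
racquet_a = {
    "power":     [76, 74, 75, 77, 73, 75],
    "stability": [78, 77, 79, 76, 78, 78],
}
racquet_b = {
    "power":     [75, 75, 76, 76, 74, 74],  # different raw scores...
    "stability": [77, 78, 78, 77, 79, 77],
}

for cat in racquet_a:
    a = category_average(racquet_a[cat])
    b = category_average(racquet_b[cat])
    # ...yet the published (averaged, rounded) numbers come out identical
    print(cat, a, b, "identical" if a == b else "different")
```

    With six testers all scoring inside a 70-80 band, the individual drifts cancel out in the average, so identical published numbers across every category are less astonishing than they first appear.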
  4. nguyenthuc

    nguyenthuc Rookie

    Dec 10, 2004
    How many playtesters? How many different testing categories? Two completely different racquets. IDENTICAL SCORES in EVERY category. Doesn't that strike anyone as odd? Intentional or unintentional erroneous data processing, as Andirez suggested, is more plausible, but still inexcusable.
  5. A Defenseless Creature

    A Defenseless Creature Semi-Pro

    Nov 27, 2004
    The reviews are enjoyable reading, nothing more. It is always interesting to see how the playtesters rate their experiences with the racquets, but the results are almost always predictable. Seriously, you need to keep in mind that the reviews are primarily a marketing tool for TW, not a completely objective rating of racquets. Not that there's anything wrong with that. Do you think you are ever going to see TW pan a racquet that they are trying to sell? I doubt it, nor should they. Just read the reviews as you would a fluffy magazine. They are meant for enjoyment. For slightly more critical/objective reviews, read the comments on the boards, although discussions tend to draw out the extreme experiences; the truth is somewhere in the middle. It's always best to demo for yourself if you have the opportunity.
  6. louis netman

    louis netman Hall of Fame

    Feb 20, 2004
    I don't think there's a real methodology involved. It's a great selling tool, though. If you notice, all the reviews end up covering all the bases: "This is a great racket for a baseliner, all-courter, or serve-and-volleyer." If you examine the various racket test figures, they cannot be accurately compared in a mega-analysis. I think it would be better if they did comparative reviews of similar sticks. Granted, each reviewer has his/her own subjective perceptions of power, spin, maneuverability, etc. With a comparative review, the reader would have a more valid interpretation of the results....
  7. finchy

    finchy Professional

    Apr 11, 2004
    Do note that the given numbers are the averages across all of the playtesters. Perhaps this will settle your squabble with TW?
  8. Morpheus

    Morpheus Professional

    Feb 19, 2004
    The value is in the commentary, not in the numerical rating system. Yes, this is a marketing tool, and I have no doubt that the manufacturers pay to have their rackets reviewed. It may not be an explicit payment, but part of the total financial package with TW (i.e., "in exchange for $100 grand, TW agrees to ... including at least two racket reviews"). That's how all distributor/retailer relationships work.

    I like the reviews and pay close attention to how a couple of playtesters describe the experience. Over time you learn the code. I put no stock in the numbers, however. They all end up being between 70 and 80 anyway (recently more like 75 to 80).
  9. Richie Rich

    Richie Rich Legend

    Feb 24, 2004
    Defenseless Creature,

    Check out the review of the nCode Tour 90. Not exactly glowing!! Your point, though, is well taken: TW reviews should be taken with a grain of salt and shouldn't be used as a substitute for demoing a racquet before buying!!
