Why are the test results the way they are?
From time to time we get emails questioning our testing protocol and asking how it is possible for some driver they have never heard of to have a score of 8.0 while their favorite driver from one of the top golf companies has only an 8.5 rating. That works out to the unknown driver being 94% as good as their $400 driver.

We would like to respond by explaining that when 60 golfers test the clubs and we average all the results, there is usually only about a 10% range between the top-rated clubs and the lowest-rated clubs. It works out like that every time we conduct these tests.

How can that be, they ask? Well, take distance. Driver A from one of the big names averages 225 yards during the test. Driver B from “We Never Heard of You” golf company averages only 221 yards. Now what is the percentage difference? About 2%. But wait a minute: the percentage difference in the ratings was 6%. So it would appear that Driver A was 6% better than Driver B, when in reality it was only about 2% better in distance.

How does one account for the difference? Is it possible that the testers are biased? We ask each tester to be fair and unbiased when testing and rating each club, and we hope they are. That is why we have 60 testers test each club. From that group we pick the 50 testers who we felt most fairly and equitably evaluated each club according to our directions. By doing this we have found we are able to level the playing field for all the clubs participating in each test.

We are not saying that the system is perfect or without flaws, and we are constantly looking to improve it. We do know for a fact that TaylorMade, Callaway, Nike, Cleveland, Titleist, Cobra, Ping and a few others tend to get higher ratings even when they do not perform at the higher levels in actual testing. We do our best to take that into account so that all the companies participating are fairly evaluated.
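The percentage arithmetic above can be checked in a few lines of Python. The figures are the hypothetical Driver A and Driver B numbers from the example, not data from any actual test:

```python
def pct_difference(higher, lower):
    """Percentage by which `lower` trails `higher`, relative to `higher`."""
    return (higher - lower) / higher * 100

# Hypothetical example figures from the text above.
distance_gap = pct_difference(225, 221)   # Driver A vs. Driver B, yards
rating_gap = pct_difference(8.5, 8.0)     # Driver A vs. Driver B, rating

print(round(distance_gap, 1))  # 1.8 -- the "about 2%" distance gap
print(round(rating_gap, 1))    # 5.9 -- the roughly 6% rating gap
```

This is the whole point of the example: a half-point rating gap (about 6%) looks three times larger than the underlying 4-yard distance gap (about 2%).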
Our objective is to make sure golfers understand that all we are attempting to do is provide what we feel is valid and needed information from which they can make informed decisions about which golf equipment might be best for them.
We have had golfers contact us who are upset and obstinate that we had the unmitigated gall to show test results indicating that Driver B is almost as good as their beloved Driver A. In reality, most golf clubs are pretty good, and there is only a small difference in the quality and performance of the majority of clubs out there today, especially considering that only companies that make really good clubs send us their equipment to be included in our testing.
We have looked at how we might incorporate the street price of equipment into our rating system, but prices are so dynamic and change so quickly that we have not yet developed a way to do it.
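Purely as an illustration of one possible approach, and not anything we currently use, a rating could be scaled by how a club's street price compares to a reference price, so cheaper clubs get a modest boost and pricier ones a modest penalty. The function name, the reference price, and the exponent below are all made-up assumptions:

```python
def value_adjusted(rating, street_price, reference_price=400.0):
    """Hypothetical sketch: scale a 0-10 rating by a gentle function of
    (reference_price / street_price).  The fourth root keeps the price
    adjustment modest relative to the tested performance rating."""
    return rating * (reference_price / street_price) ** 0.25

# A club at the reference price keeps its rating unchanged; a cheaper
# club with a slightly lower rating gets a modest value boost.
print(value_adjusted(8.5, 400))       # unchanged at the reference price
print(value_adjusted(8.0, 250) > 8.0) # cheaper club is boosted
```

Of course, this does nothing to solve the real problem noted above, which is that street prices change too quickly to keep such a reference current.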
Any suggestions would be appreciated.
If you have any comments please feel free to Contact Us.
-The Test Administrator Staff at GolfTest USA