I love wine. But I am influenced by expert ratings and gold stars on the bottle – I can’t taste every wine before I drink it! Over time, I’ve felt that the number of wines receiving high ratings and gold stars has increased significantly – when was the last time you saw a wine rated below 90? A bit of numerical analysis supports this: either wines are highly over-rated…or Australia has the best wines in the world.
Analysing a Sales Catalogue
I buy a lot of wine. I seek variety and I want to buy quickly. Branded Australian wines vary from year to year, because our use of single varietals means the same vineyards produce different tastes each vintage. (The European blending approach achieves a similar taste each year, but the grapes come from different places and in different varietal percentages.) Analysing one monthly sales catalogue showed the following:
| Rating (out of 100) | No. | % | Cumulative No. | Cumulative % |
|---|---|---|---|---|
| 97-99 | 25 | 14.5 | 25 | 14.5 |
| 96 | 37 | 21.4 | 62 | 35.9 |
| 95 | 58 | 33.5 | 120 | 69.4 |
| 94 | 33 | 19.1 | 153 | 88.5 |
| 93 | 15 | 8.7 | 168 | 97.2 |
| <92 (actually 89-91) | 5 | 2.9 | 173 | 100.0 |
The table shows that, for wines priced between $5 and $490 (yes, really), ratings varied only from 89-99. Imagine if all children scored 89-99 on a test – how would you feel about the value of the test?
Worse, 74% of wines were rated 94-96 and 97% were rated 93 and over. This seems unreasonably high, given the wide range of wines covered, including a significant number of cleanskins!
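For anyone who wants to check the arithmetic, here is a minimal Python sketch of my own (it is not part of the catalogue – it simply re-runs the cumulative sums and the 74%/97% figures from the counts in the table above):

```python
# Counts taken from the sales-catalogue table above (rating band -> number of wines).
counts = {"97-99": 25, "96": 37, "95": 58, "94": 33, "93": 15, "89-91": 5}

total = sum(counts.values())   # 173 wines in the catalogue
running = 0
for band, n in counts.items():
    running += n
    print(f"{band:>6}: {n:3d} wines ({100 * n / total:4.1f}%), "
          f"cumulative {running:3d} ({100 * running / total:5.1f}%)")

# The two headline figures quoted in the text:
rated_94_96 = counts["94"] + counts["95"] + counts["96"]
rated_93_up = rated_94_96 + counts["97-99"] + counts["93"]
print(f"Rated 94-96: {100 * rated_94_96 / total:.0f}%")   # ~74%
print(f"Rated 93+:   {100 * rated_93_up / total:.0f}%")   # ~97%
```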
Analysing a Wineries Book
I had on hand a 2012 book of Australian wineries, all of which were rated at least 3 stars (out of 5). Interestingly, the author said 46% of all wineries were rated below this or not rated, which suggests a much wider spread of wine quality than is represented in the book. A quick survey of the wineries listed under A and B (181 wineries) showed:
| Rating (out of 5 stars) | No. | % | Cumulative No. | Cumulative % |
|---|---|---|---|---|
| 5 | 53 | 29.3 | 53 | 29.3 |
| 4.5 | 31 | 17.1 | 84 | 46.4 |
| 4 | 43 | 23.8 | 127 | 70.2 |
| 3.5 | 22 | 12.2 | 149 | 82.4 |
| 3 | 32 | 17.7 | 181 | 100.0 |
This table shows that over 70% of the rated wineries scored 4-5 stars. A bell curve would not fit this data – you would expect wineries rated 2-3 to be the largest group and those rated 5 to be the smallest. Since this book is six years old, an up-to-date analysis would likely find an even greater percentage at the top end, consistent with the results in the first table.
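To make the skew concrete, here is a small Python sketch of my own, using the counts from the table above; the bell-curve comparison is just the informal benchmark described in the text, not anything taken from the book itself:

```python
# Counts from the 2012 wineries book (A and B entries only): star rating -> wineries.
stars = [(3.0, 32), (3.5, 22), (4.0, 43), (4.5, 31), (5.0, 53)]

total = sum(n for _, n in stars)                              # 181 wineries surveyed
four_plus = sum(n for s, n in stars if s >= 4.0) / total
print(f"Rated 4 stars or better: {100 * four_plus:.1f}%")     # ~70%, as quoted above

# If quality followed a bell curve over the whole 0-5 scale, peaking around 2-3
# as suggested in the text, counts in this 3-5 star tail should shrink as the
# star rating rises. Instead, the 5-star band is the single largest group:
for s, n in stars:
    print(f"{s:.1f} stars: {n} wineries")
```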
Chasing Ratings Increases All Ratings…and Lowers Their Usefulness
Although not all wines are rated by the same person or company, most are rated by a small group of well-known ‘experts’. There are over 4,000 wineries (now that’s competition!), and every one of them wants a ‘97’ or a ‘gold star’ on the bottle to boost its chances of a sale – so the pressure on raters keeps increasing.
It’s easy to put ratings up. No one objects to receiving a higher rating than is justified. It’s very difficult to put them down. (When I taught at a top US university, all students expected to get an A or, at worst, a B+…i.e. 87-100%.)
So it’s not surprising to see grade inflation in wine…but the extent of the inflation seems out of control. Are virtually all Australian wines very high quality? Can I really distinguish a ‘97’ from a ‘96’? When will ‘100’ become a common – and acceptable – rating?
What to Do?
While I very rarely get a ‘bad’ wine these days and the quality seems uniformly high…surely it’s just not that high. Can a $5-10 wine really be rated, say, 95?
It seems clear that the ratings system is broken, but this is not yet publicly recognised. The trend will continue…until wines are routinely rated 100, at which point the scale will clearly be useless! The best you can do is:
– be aware of the ‘average’ rating of good wines
– look for the credibility of the rater
– seek a variety of raters
– remember what you like
– seek the opinions of others whose tastes are like yours.
Enjoy!