Wednesday, December 30, 2009

Rating the Raters (Cont.)

Wine Press Northwest 92/12 Wine Press Northwest, as its name indicates, deals with wines from Washington, Oregon, British Columbia, and Idaho. Since Bonair Winery is a Washington winery, this is important to me.

Wine Press Northwest uses a double-blind methodology to ensure complete anonymity of the wines being judged. They use a simple scoring system: Outstanding, Excellent, Recommended, and, if none of the above, a simple description of any wine that at least rates publication. A 'Best Buy' is awarded based on price and quality. They even allow winemakers to sit in on the tastings as blind judges, a good idea for letting winemakers know their strengths and weaknesses. Only in a blind tasting would Washington Pinot Noirs stand up to Oregon Pinot Noirs. When raters see the bottle, a Pinot Noir loses ten points for being from Washington. When raters see the bottle, a wine gains five extra points for coming in a three-pound bottle. Obviously, only a great wine would be put in an ecologically insane chunk of glass. (Waste makes me grumpy, and rewarding waste makes me grumpier.)

Their judging panel is listed on the website. In addition to Dan Berger (already rated), the panel includes Bob Woehler. I have judged with Bob at the Washington Wine Competition, and in my opinion he is not the most perceptive judge; on the bright side, he is very positive toward Washington wines and rates things generally higher than I do. When I was on his panel, more gold medals were awarded than on any other panel I have served on. Great news for wineries! Bob is like a cheerleader and probably adds to the positive tone of the reviews.

Andy Perdue, the editor, gets a little testy when challenged on the facts, so tread lightly. For example, he made the following assertion: "The Rattlesnake Hills can be somewhat cooler than other areas of the Yakima Valley," which just ain't true. The Rattlesnake Hills is one of the warmer AVAs in Washington and definitely a warmer area of the Yakima Valley. He defended his statement by saying it is cooler than Red Mountain. Well, Red Mountain is cooler than Badger Canyon (not yet an AVA, but it sure could be), so does that make Red Mountain a "somewhat cooler" area of the Yakima Valley, too? The year in question was 2006, when the Rattlesnake Hills AVA racked up 2,799 degree days, 228 more than the Yakima Valley as a whole. The new Snipes Mountain AVA is cooler than the Rattlesnake Hills AVA. There is a continuing misconception that the Yakima Valley gets cooler as you go west. If this were true, Red Willow Vineyard, in the shadow of Mt. Adams, would be ice cold. It is not; it is considered a warmer site. The cooler areas of the Yakima Valley are found around Prosser, Grandview, and Sunnyside. (Misconceptions make me grumpy.)

So, I give Wine Press Northwest an 'Outstanding' rating and a 'Best Buy' if you are interested in Northwest wines.

Saturday, December 26, 2009

Rating the Raters (cont.)

Dan Berger Wine Experiences
98/18 Dan's personal experience and qualifications can be summed up by the link to his website, so I won't go into them here. I give Dan an 18 out of 20 - a pretty high score.

I had the privilege of judging wine with Dan at the Washington Wine Competition. This is a unique competition where the wines are all judged anonymously (double blind), privately, and individually; the judges then post their scores for all to see and discuss the wines. It is sometimes intimidating to post your scores for all the professional judges to see, but usually there is not a lot of variation. Judges may be cajoled into changing their scores to help a wine medal or not. Flaws are pointed out and the wines re-examined during discussion. Judging is based on a simple yet very effective scale: Gold, Silver, Bronze, and no award. A double gold medal is awarded when all the judges agree that a wine is gold-medal quality. If four out of five judges give it a gold, the dissenting judge is allowed to raise his or her score to make it a double gold. This happens often if the dissenting judge gave the wine a silver.

So, I have judged wine side by side with Dan Berger, and I know exactly how good he is! Basically, he is as good as I am, and that is damn good. We only disagree on certain late-harvest wines, and I tend to deviate quite a bit from all judges in this category. This costs him two of his twenty points, but no one is perfect, just as no wine is perfect in every sense.

Dan does not like big alcoholic wines, but he and I are both guilty of scoring them highly when sniffing and spitting are de rigueur. These wines show well out of the chute, but get really boring after one glass. Give me some wine with varietal character and some acid to stiffen it up so it goes well with food. If I want a cocktail, give me a gin and tonic – real gin, not Bombay Sapphire!

In person, Dan is affable and easy to talk to. He even published a humorous piece I wrote anonymously in his newsletter. The piece made fun of none other than E. Robert Parker.

Disclaimer: To my knowledge, Dan has never knowingly rated a Bonair wine and has never published a rating of our wines.

Tuesday, December 15, 2009

Scientifically Speaking, the 100-point Scale Isn't

Before I get into rating the raters, I need to address two fundamental concepts in testing: validity and reliability. I have a master's degree in school administration, so I have had a lot of classes in statistical analysis, and trust me: wine raters do not score high on either count, validity or reliability.

Validity refers to whether a test measures what it purports to measure. Wine quality is such an individual value that no rater can validly measure whether you will like a wine. I recall an experience in the Bonair Winery tasting room where a man walked in and asked to try "our best wine." I proudly poured him our Morrison Vineyard Cabernet Sauvignon. His face scrunched up in pain, and he announced, "Yuck, that is really sour. Don't you have anything sweeter?" Ah yes, a nice sweet late harvest Riesling. So, to judge the validity of a rater, you have to know what that rater likes. For example, Parker likes boatloads of new French oak, high alcohol, and micro-oxygenation to the point of burning the varietal character out of the wine. If this is your palate, Parker might be considered valid for you, but not for real wine drinkers who drink wine daily with meals. If you like food-friendly wine, you should probably find a more 'valid' rater.

Reliability refers to the ability to give the same score to the same wine again and again. This is easy for most raters, because they are looking at the bottle and their notes. "Hmm, this tastes like dog piss. What did I give this wine last time? Oops, I gave it a 94. (Note to self: don't smoke too much marijuana before rating wines.) Well, to be reliable, I'll give it a 92 and hope no one calls me on it." Or, "This wine is from Walla Walla, so I have to give it at least 90 points; otherwise people would think I don't know what I am doing. (Note to self: What am I doing?)"

Psychology, which is an inexact science just like wine rating, uses test-retest reliability to verify that a test can be repeated and give the same result. Using a complex formula, the test-retest reliability must come out to 95% or better for the test to be considered reliable. If there were a true 100-point scale, this might be easier: a score of 90 would be statistically the same as a 94 or an 86. (This is probably truer than you think.) But we are really dealing with a 20-point scale, so test-retest reliability is harder to achieve. It works out that a rater must nail the score within one point each time to be reliable; i.e., if the first blind test gives a 90, the second test cannot be more than 91 or less than 89. Furthermore, for a score to be truly reliable, not only must Gregutt nail the score blindfolded in this manner, the Wine Advocate and the Wine Advisor must also come within the same 89-91 range.
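For the statistically inclined, the ±1-point criterion above is easy to check in a few lines of Python. This is just an illustrative sketch; the scores and the helper name `within_one_point` are made up for the example, not from any actual competition data:

```python
# Sketch of the test-retest check described above: every retest score
# must land within 1 point of the original blind score to call the
# rater "reliable" on a 20-point (really 80-100) scale.

def within_one_point(first_scores, second_scores):
    """Return True if every retest score is within 1 point of the original."""
    return all(abs(a - b) <= 1 for a, b in zip(first_scores, second_scores))

# Hypothetical blind and re-blind scores for five wines (100-point scale)
first = [90, 87, 92, 85, 94]
second = [91, 89, 92, 84, 90]

print(within_one_point(first, second))  # False: two wines drifted by 2+ points
```

On made-up numbers like these, a couple of two-point swings are enough to fail the criterion, which is exactly the point: very few raters could pass it blind.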

So, you can see that wine ratings are highly unreliable and invalid, statistically speaking, both internally and externally. If you are unsure what wine to buy, buy only wine your friends will think is good. They will be impressed, and you should be able to choke the stuff down while everyone smiles and comments on how good hog piss tastes.

What is the difference between a connoisseur and a city sewer? Not much, both are full of crap.

Tuesday, December 1, 2009

Rating the Raters on the 100-Point Scale

Most winemakers shudder in fear of wine writers - and now wine bloggers. I think it is time to rate the writers on their own phony 100-point scale. But first, I will offer my corrected version of the 100-point scale.

Actually, it is a 20-point scale at best, and it can be understood better by subtracting 80 from all scores. Want to know why people don't seek out wines rated 87? Well, with my corrected score, they rate only 7 out of 20. That is pretty sad. So my ratings will look like this:

Brett Weinspitter/Wine Expectorator 86/6 (6/20 being his real score). Brett loves alcohol - so much that he refuses to rate any wine containing less than 14.5%. He also loves the sweetness that goes along with this high alcohol and the boatloads of new French oak that held the wine. In fact, he loves oak juice so much that he rated one wine at 99 points because it had 200% new French oak. Brett never drinks wine with food, since none of his selections go well with food. Brett has also been known to weigh the empty bottles and add points for extra glass. Brett is known to be bad for wine, but in small amounts it is called terroir.

So, I will rate each wine writer/magazine/newsletter on the 100-point scale followed by my corrected score. Stay tuned to see how the raters score.