This is not the answer on reporting polls, but there is one

In a piece posted today, the respected Columbia Journalism Review argued, per its headline: "Why political journalists shouldn’t report on internal polling."

This is obviously wrong. But the piece, written by freelancer Steve Friess, makes some excellent points without exactly reaching the conclusion contained in the headline.

With renewed discussion of polling methods after the release of an NBC News online poll on the presidential race (I am really not sure about that methodology yet), it seems a good time to talk (again) about how poorly the media report on surveys generally, not just internal polls, which can be very useful.

The Friess piece doesn't exactly say that journalists should ignore leaks of internal campaign polls because they are tendentious and obviously intended to help a candidate. That is, more often than not, the motivation for releasing them. But sometimes, maybe even often, it helps illuminate what is going on in a campaign.

Why is the campaign releasing the poll? What does it say about the state of the race? Is there a motivation that reveals much about the candidate and/or the campaign?

These are questions that should/must be asked. And, as some in the CJR piece argue, the real imperative is for the reporter to elicit as much information about the poll as possible, even asking to see the entire instrument off the record to ensure its validity. (I do this all the time.)

The general point that the CJR piece makes -- use caution when reporting internal polls and don't do so just to fill space -- is a good one. But if you can get enough information about them, they may be worth reporting, especially if put in context with other available data.

As pollster Fritz Wenzel told Friess: 

Journalists ought to demand to see the entire poll, not “just one or two random questions pulled out,” and refuse to write about it if they can’t.

“Even if I couldn’t publish all of it, I would want to see it off the record,” Wenzel said. “And then you say that in the story: ‘We were able to talk directly to the pollster who conducted the survey and agreed with the campaign not to publish certain aspects of their data. However, we can ascertain that the ballot test was conducted in such a way so as to not bias the respondents.’ Readers have the right to know what the context of these polls is. If an internal poll’s all you got, that’s fine.”

Finally, Friess mentions me in the piece:

Many internal polls, of course, are accurate – and most are never released to the public. Campaigns spend a lot of money on them to provide intelligence to plot strategy, and they can assist journalists’ understanding of a race when there are strong reporter-source relationships. Such was the case in the 2010 Nevada race between Senate Majority Leader Harry Reid and his GOP challenger, Sharron Angle. Then-Las Vegas Sun columnist Jon Ralston was the only prominent journalist to predict Angle’s defeat; he later explained that the Reid campaign had shared its internal numbers with him, and he concluded the campaign was sampling the Nevada electorate more accurately than public polling. Ralston used that information to inform his reporting, but didn’t publish specific stories about it.

This is accurate but not complete. Yes, the Reid internals, when I could get them, informed my reporting and my ultimate call of the race. But I didn't rely solely on those internal polls by the estimable Mark Mellman.

I did what I learned to do after early years of not understanding polls and simply regurgitating them: I analyzed every poll that was released, including those from major news organizations (CNN, Fox) and well-known pollsters. More specifically, I pored over the internals of those polls to show how they favored Angle because, for instance, they undersampled Hispanics.

This is what all reporters should do and why Wenzel is so right: It's not that internal polls should be ignored; it's that the internals of all polls cannot be ignored.