NewmanPR

Zagat’s Cruise Line Survey Is Flawed

Public relations people love surveys. If your client is rated in the top five of anything positive, that automatically merits a press release. The problem is that PR folks tend not to look too closely at a survey when it flatters a client. Why question the methodology or the sample pool? Just write the release and get it out there. But sometimes it pays to question a survey, especially one that is obviously flawed.

Zagat, the company famous for its restaurant surveys, recently released its first cruise line survey to widespread coverage in the cruise-related press. Most of the coverage was not analytical, merely regurgitating the Zagat press release. WISTV of Columbia, S.C., simply picked up the PRNewswire release and slapped it up on its Web site. Online travel trades Travel Pulse and Agent@Home published the Zagat press release with virtually no changes. At least USA Today’s Cruise Log tossed in a mild caveat, “if you trust Zagat Survey” (sic), but that’s as close to a hard look at the data as it gets with this survey.

Cruisereport.com and cruiseindustrywire.com published another press release on the survey, this one distributed by Windstar Cruises.

See, Windstar saw the PR value in getting a press release out immediately because it was rated highly in a number of categories. And, of course, its release got picked up in the media as “news,” though it was just a repackaging of the Zagat release. Can’t blame a cruise line that is having very public money woes for seeking a little positive publicity, can you?

But when I first looked at the survey results, all kinds of red flags popped up, from the lack of any explanation of the methodology to a skewed sample pool. It seemed to me that the results were a far cry from reliable. So I wrote to Zagat’s public relations execs on Monday, Nov. 16, asking them a few questions. Here’s my e-mail to them:

Dear Ms. Barbalato and Mr. Sampogna:

I was just looking over the results of your first cruise survey. As a longtime cruise industry observer and participant, I found some of the results to be anomalous, and decided to write to you to ask for some additional information about how these ratings were decided.

Since there was little information provided on the methodology either on your Web site or in the press release, I was hoping you could explain how the participants were selected — how do you find 2,300+ people who average nearly 10 cruises each and average almost 90 days at sea each? And why are 39 percent of them in their 60s?

Also, I’d like to see a complete list of the 22 “major” cruise lines included in the survey, since I wouldn’t ordinarily consider Uniworld River Cruises one of the majors. Since 17 of the 22 made it into the survey, I’m curious about who the other five were. MSC? P&O? Azamara?

Also, please explain how it was decided to divvy up the industry into three size categories. The Berlitz Guide to Cruising divides ships into four: boutique (50-200 passengers), small (200-500 pax), mid-size (500-1,200) and large (1,200-4,000). The way Zagat has divided the industry puts Crystal (1,100 pax) and Seabourn (212 pax) in the same category, size-wise. I’d just like to know the thinking behind that.

Finally, I’m curious whether you all consulted with any cruise industry experts when designing the survey, and if so, who they were. If not, why not?

I have already had some back-and-forth with a couple of colleagues in the industry regarding the survey results, and would appreciate any insights you could provide that could put to rest some of the questions folks have about the survey.

Thank you for your time.

Best regards,

Buck

Needless to say, I haven’t heard back from them. Nor do I expect to.

Without some idea of how a survey, any survey, was conducted, it’s hard to determine whether the results are reliable and meaningful. In the case of Zagat’s first (and I hope last) cruise line survey, there are enough questions about the sample pool and methodology to undermine its credibility. Unfortunately, journalists covering the industry were more interested in publishing content than in questioning the accuracy and reliability of the survey.

That’s unfortunate, because it is the role of journalists to be skeptical and not simply accept what a commercial entity alleges in a press release. It is always prudent to question the validity of a survey’s results, unless, that is, it ranks your client near the top. Then all bets are off!
