Costly Mistakes Marketing Researchers Make

Here are some common and often costly mistakes we marketing researchers make.     

  • Not clarifying the objectives of the research, which are often vague in the original RFP and in initial discussions with the client. Conversely, there are sometimes too many objectives for a single project, and some creep in after the cost has been finalized and approved by the client. One predictable result is a long, unfocused questionnaire and a shallow report or presentation.
  • Some clients issue uninformative RFPs, for example, listing only fieldwork details, and then complain that the report lacks insights.
  • Not doing our homework when preparing proposals is a serious mistake. Many clients already hold a considerable amount of data from various sources, some of which would provide valuable context, and it is often possible to merge the client’s data with survey research data; sales figures and tracking survey data, for example, are now frequently integrated and jointly analyzed. Another common oversight is failing to confirm which segmentation schemes the client already uses, and which data underlie them. Some ABCs of RFPs may be of interest to you or your junior colleagues.
  • Lengthy screeners that restrict the survey to a narrow slice of consumers are more the rule than the exception. Besides increasing costs and lengthening fieldwork, they may exclude people we should be listening to. “Target consumers” are often illusory, based more on assumptions than on evidence.
  • Poorly worded questions that mean different things to different people, and questions that have little meaning or relevance for ordinary people, can be found in just about any consumer survey.
  • Some questionnaires ask highly detailed recall questions that no human could reasonably be expected to answer accurately, despite decades of warnings against this practice by survey methodologists and competent marketing researchers.
  • Drawing dramatic conclusions from limited data, or even from a single comment by a focus group respondent, is not rare. Many of us need to bone up on how to connect the dots; see Causal Analysis: The Next Frontier in Analytics? and Research Thinking: Some tips on how to do better marketing research more easily for guidance.

  • Dredging data until we find something the client might like is easier than ever with today’s software and computers; the first sketch following this list illustrates how readily this manufactures “findings”. Stuff Happens reveals some of the dangers inherent in this.
  • Confusing statistically significant with important, and letting t-tests do our thinking for us, is endemic at some research companies and clients. Related to this is a lack of understanding of the assumptions of inferential statistics. Tables Tips gives some advice on what marketing researchers need to know about statistics.
  • Over-interpreting survey data is a bad habit that just won’t die. Survey results are usually best interpreted directionally, not as precise numbers. We are not measuring height and weight! Patterns in the numbers are normally more important and trustworthy than the numbers themselves.
  • Misunderstandings about sampling, e.g., thinking that weighting a convenience sample makes it representative, are still typical in the marketing research community; the second sketch below shows why weighting alone cannot fix selection bias. Likewise, the popular notion that “big data” has made sampling irrelevant is nonsense. In general, we are careless in the way we generalize our findings to a broader (and often ill-defined) population.
  • Marketing researchers now seem to understand less about attitudinal measurement than in the 1980s, when I entered the profession. We also often use the terms semiotics and ethnography to describe garden-variety qualitative research; few marketing researchers are highly skilled in either of these quite specialized areas.
  • Multivariate analysis suffers from the twin curse of being underutilized and misused. Another short post of mine, Taking Quantitative Marketing Research to a Higher Level, elaborates on this point.
  • Reports that look great but say little of substance can be found everywhere. At the other extreme, reports that are boring and confusing, the classic dust collectors, are still alive and well.
  • We often chase fads and toss around jargon we don’t understand, only to pay a hefty price later.
  • Finally, there is too much uncritical acceptance among marketing researchers of practically anything that claims to be new, innovative, or disruptive. At the other extreme, some marketing researchers summarily dismiss anything new and stick with what they know and are comfortable with.
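
To make the points about data dredging and statistical significance concrete, here is a minimal sketch in Python. The data, variable names, sample sizes, and thresholds are entirely hypothetical, invented for illustration. Part (a) shows that enough random subgroup splits of pure noise will reliably turn up “significant” differences; part (b) shows that a trivially small difference becomes statistically significant once the sample is large enough.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# (a) Data dredging: run 100 subgroup comparisons on pure noise.
# At alpha = 0.05 we expect roughly five "significant" results by chance.
scores = rng.normal(loc=50, scale=10, size=1000)   # one meaningless metric
false_positives = 0
for _ in range(100):
    shuffled = rng.permutation(scores)             # a random "segmentation"
    _, p = stats.ttest_ind(shuffled[:500], shuffled[500:])
    false_positives += p < 0.05
print(f"'Significant' subgroup differences found in pure noise: {false_positives}")

# (b) Significant vs. important: with a huge sample, a 0.1-point gap on a
# 0-100 scale sails past the usual 0.05 threshold.
brand_x = rng.normal(loc=50.0, scale=10, size=200_000)
brand_y = rng.normal(loc=50.1, scale=10, size=200_000)
_, p = stats.ttest_ind(brand_x, brand_y)
print(f"p = {p:.4f} for a difference no customer would ever notice")
```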
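
On the sampling point, this second sketch, again with made-up data and a hypothetical opt-in mechanism of my own choosing, shows why weighting a convenience sample does not make it representative: weighting can fix the demographic mix, but not self-selection on the very attitudes we are trying to measure.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Hypothetical population: half younger, half older; younger people rate
# the product higher, so the true mean rating sits halfway between groups.
older = rng.integers(0, 2, size=N).astype(bool)
rating = rng.normal(loc=np.where(older, 5.0, 6.0), scale=1.5)

# Convenience sample: enthusiasts are far more likely to opt in, a
# selection mechanism that no demographic weight can see.
p_optin = 1 / (1 + np.exp(-(rating - 5.5)))        # opt-in rises with rating
sampled = rng.random(N) < 0.10 * p_optin
s_older, s_rating = older[sampled], rating[sampled]

# Weight the sample so its age mix exactly matches the population (50/50).
weights = np.where(s_older, 0.5 / s_older.mean(), 0.5 / (~s_older).mean())

print(f"True population mean rating: {rating.mean():.2f}")
print(f"Raw convenience-sample mean: {s_rating.mean():.2f}")
print(f"Age-weighted sample mean:    {np.average(s_rating, weights=weights):.2f}")
# Weighting repairs the age mix but not the self-selection on enthusiasm,
# so the weighted estimate is still biased upward.
```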

Personally, I am in favor of progress but opposed to B.S. If someone has a new method or technology that truly moves our profession forward, there should be no need for exaggerated claims or straw man arguments. For those of you who may be interested, I offer my thoughts about this topic in more detail in Disrupting B.S., The Onion Is The Reality, and Anyone Can Do Marketing Research. Put simply, I advocate better-trained marketing researchers with better tools.

I have not listed the human foibles found in any profession, because they apply to all of us; The Penguin History of the World by J. M. Roberts and Odd Arne Westad speaks better to that.
