From The Urbach Letter – May 2004


Survey Secrets
(What Do You Want to Know Today?)

Information is Power! Do you believe that? Yes? Really? Uhh, sorry, I don't believe you. If you truly thought it was all that important, you'd be doing a lot more information gathering… and a lot less guessing and assuming.  But don't worry. We can fix that.

Welcome to the "new and improved" Urbach Letter. I know what you're looking for: more creative business-building ideas, more cutting-edge marketing concepts, and more ways to leverage the power of the Internet to gain competitive advantage or advance your professional career.

How do I know what you want? Because I asked. 128 Urbach Letter subscribers accepted my invitation to participate in a reader survey. Their responses confirmed a few things I believed to be true, but much more importantly, provided new insight I might not have gained any other way. I hope you agree with something I wrote in the March issue, "Getting What You Want": if you sincerely want to get ahead in business and life, becoming a great question-asker is perhaps *the* most important skill you can develop. That article focused on one-to-one communications, but now we're going to take a giant step forward and obtain meaningful answers and suggestions from a much larger group of people via a computerized survey.

Warning:
Don't just read the article passively, hoping to gain a bit of general knowledge. You can't absorb this stuff in the abstract. Instead… please think about something YOU would like to know: the *real* buying preferences of your customers, how the marketplace will react to a new service you're planning to offer, which specific things in your organization you'll need to change or improve, etc.

I'm going to take you by the hand and show you how to design, distribute, and interpret your own customized survey. But first, a reality/motivation check. Why, exactly, would *you* want to do a survey in the first place? What's the point? Well, you just may learn something that'll positively *transform* your business or profession. You may find that what *you* think your customers (or clients or patients or donors or employees) want is, in reality, way off the mark. Dead wrong! You may get early warning of customer service problems -- or uncover a new competitive threat -- before they do serious damage to your business. You will definitely learn how to best allocate your limited resources (time, energy, and money – these are *always* limited) to activities that'll provide the greatest return on investment. Perhaps you've been thinking about launching a new product or adding a new service, or even starting up an entirely new business venture. Before you invest your hard-earned money and precious time, do some "guerrilla" market research first. There are dozens of good reasons for conducting a survey -- and now, very few excuses for *not* doing one. Computer technology has taken all the drudgery out, and made the whole process easy, engaging, and almost fun.

But you're a busy person. You don't have time to read thick books or long academic articles about market research or conducting surveys. That's why I've created a video tutorial for you. Even if you haven't tuned into the Urbach Letter Video Magazine in the past, I highly recommend investing the next 18 minutes in this fast-moving, *interactive* tutorial.

Interactive Desktop Video Mini-Seminar
May 2004 Urbach Letter Video Magazine

Urbach Letter Video Magazine Web Guide
Click to Open Video Launch Page

Guided tour of the enhanced video magazine format:
I'll be greeting you from the familiar video window in the upper left corner. Right below are full VCR controls to start, stop, pause, mute, and adjust volume. Below the controls are Chapter Markers: an outstanding new feature that'll let you jump to any topic of interest in the presentation; you're in full control. To the right is the new "live" window, where you'll interact with fully-functional web pages, graphic images, and other cool stuff. Let's get started. Click the play button now.

Hey! Hold on!
Did you skip over the video tutorial just now? Why? If you skipped it, you're missing out on the future of electronic communications and learning. (However, if technical problems are holding you back, that's a completely different story. Click for technical support). Besides, I put a ton of "unpaid" work into this desktop seminar and I hate to see my effort wasted. This video tutorial is a major part of "new and improved," and I'm eager to know how you like it.

If you've completed the tutorial, you *could* stop reading this article at this point and immediately start working on your own survey. Indeed, if you're the kind of person who enjoys figuring things out by yourself, go for it. However, if you'd rather gain from my pain (i.e. learn from the many, many mistakes I've already made), and want to maximize your return on investment for this project, please read on.

Market Research 101
Don't let the term "Market Research" scare you off. I know it may seem like an activity best left to clipboard-toting, spectacle-wearing academic-types, who say things like, "Chi-Square Regression," and "Multivariate Analysis." Don't worry about any of that stuff. Here's what you need to know. There are only four basic reasons for doing any kind of market research:

(1) Market Analysis. Translation: understanding the characteristics of the people who "buy" from you now, or might do so in the future. [Important Note: my articles are written for a general audience. I therefore use the terms "buy" or "sell" in the broadest possible sense. You may not "sell" anything per se, but you undoubtedly influence the thoughts and actions of other people in some way. That's selling. And gaining a deep understanding of who those people are (demographics), and what makes them tick (psychographics) is a fundamental success factor.]

(2) Product/Service Potential. Translation: projecting what product or service (and how much of it) people might want to buy from you. This also involves understanding *why* people might or might not buy from you (i.e. the marketplace's buying criteria).

(3) Advertising and Marketing Effectiveness. Translation: are you getting your money's worth? If you do the kind of direct marketing I advocate, your marketing research is built-in. That is, you'll be able to immediately measure the cost-effectiveness of every marketing initiative. However, if you insist on spending your money on brand/image advertising (like 98% of the business population), you'll have no direct correlation between ad expense and customer/prospect behavior. In that case, you'd be smart to measure ad recall, brand preference, and other metrics via interviews, focus groups, and surveys.

(4) Strategic Business and Marketing Planning. Translation: getting smart about what you're going to do next. It means gaining a high-level view of your business, setting goals, and developing an integrated strategy to reach those goals. "Strategic Planning" might sound like something you'd hire a high-priced consultant to do, but if you ask the right questions, you can turn your survey respondents into unpaid strategic advisors.

Only four reasons to do market research… and they all overlap to some extent. But if there's a unifying element among them, one factor that's crucial across the board, it's asking the right questions. That's where I can help you the most. But before we get to question-asking, I should mention something important. There's more to market research than surveys. Much of the "research" in "market research" is just what it sounds like: researching the demographics and psychographics of your market/customer base, reviewing industry-wide surveys, studying other people's "generic" research, etc.

Pro market researchers divide all information into primary data and secondary data. Primary data is what we're concentrating on now: the data you can generate for your own business. It comes from one-on-one conversations with customers, suppliers, and employees, from a directed group discussion (focus group session), or from your own survey results. Obtaining primary data will always cost you time and/or money. Secondary data, on the other hand, is often free for the taking. Secondary data is information collected by others. Some examples are: industry surveys conducted by a trade association, census data, government statistics, or commercially available databases.

Time to Call in a Pro?
Here's something else you need to consider: when it's time to stop and retain a professional market researcher. This article is your guide to a "quick and dirty" study, geared to relatively small surveys, and used for general feedback, rather than mission-critical purposes. Indeed, if you're going to bet the farm on the results of a survey, better call in a pro. There's *much* more to this than you'll learn from one of my (relatively) short articles. You could start by contacting the American Statistical Association (ASA) www.amstat.org for a referral.

Even if you intend to turn this work over to someone else, I highly recommend doing at least one survey on your own. There is simply no substitute for personal experience. Having some will mean you'll be in a much better position to guide the study and interpret/implement its results. Beyond that, I believe it's a big mistake to delegate strategic planning. If you're running your own business, managing a department or a professional practice, or have significant responsibility in any organization, you should be asking these questions on a regular basis:

  • What are my customers' brand preferences?

  • How do they choose which products/services to buy?

  • Irritants and impediments: What's irritating my customers? What's keeping sales from closing? What's preventing me from hiring top people to work here?

  • Which new product features do people want the most? Which features or service options are being used now? Which ones add the most value?

  • What's changing in my customers' lives? How is the marketplace changing? Is my customer base aging or getting younger? Shifting in ethnicity? Trending upscale or downscale?

If there's one universal truth in marketing, it is this: it's much easier and more profitable to keep an existing customer than to find a new one. That's why the most valuable use of a survey is to gauge the level of your customers' satisfaction. Another good use is to assist with price-setting. Although it's often best to let people "vote with their dollars," you can't always vary your pricing enough to evaluate if what you charge is the right amount. This topic requires a lengthy article all by itself, but for our present discussion, please know that you can get very valuable feedback on your pricing structure from a simple survey. Your "pricing reality check" can be a real eye-opener. For example, many professional service providers vastly undercharge for their services. Others haven't built enough value to be asking their current fees, and wonder why they're always scrounging for their next client…

One more thing: I'm emphasizing the collection of information to enhance an existing business rather than for a new venture or startup. That's because most of my readers are more interested in expanding their current operation than in starting a new one. However, market research is vitally important when you're blazing a new trail. The best entrepreneurs aren't wild-eyed speculators; they're cautious risk-avoiders. They use market research to model their prospective venture "on paper" before investing.

The Electronic Advantage
If you can perform an electronic survey, you're way ahead. Doing any other kind is a royal pain. A mailed survey traditionally requires an up-front bribe (I usually pocket the dollar bill and toss the survey in the trash). A telephone or in-person clipboard survey requires a lot of human capital, and can introduce bias, random factors, and unwanted elements into the study. No matter what kind of survey you undertake, it'll still require an initial investment in design: formulating questions, sequencing and formatting, etc. However, if you go electronic, everything else becomes relatively easy. Paper forms, by contrast, have to be scanned, the answers tabulated, and so forth. You might need to allocate several months, start to finish, for a large conventional study (over a thousand respondents), and costs can easily run into five figures. When you go electronic, the distribution of "forms," collection of data, and tabulation are done quickly and efficiently. Nonetheless, what I'll show you next also applies to "traditional" paper/clipboard surveys.

While I highly recommend doing your survey online, there could be a major stumbling block. Who are you going to invite to take your survey? You're going to need their email addresses first. That could limit your universe considerably. But now I must digress for a moment. Do you have all your customers' email addresses? If not, why not? You should keep in (relatively) frequent contact with your customers, and one of the easiest, most appropriate ways to do that is via email. You should actively solicit email addresses and develop value-enhancing messages to send. Of course, don't restrict yourself to just building a list of people who've bought from you. You need multiple lists (prospects, suppliers, networking contacts, etc.), and you should tailor your communications to each. However, if you haven't already cultivated an email list, and now want to do an electronic survey, that's an issue. But not an insurmountable one. While you can rent a list to email, please be very careful. Many supposed "opt-in" lists are anything but, and the new anti-spam laws can bite you. Another reason for not sending a survey invitation to a rented list is that you have no relationship with those people. That means they'll have little intrinsic motivation to fill it out.

And that's a good lead-in to the next topic: motivating people to complete your survey. More often than not, it's going to require a bribe of some sort. People are busy and will not invest the time to complete your survey unless there's something in it for them. But the bribe doesn't have to be monetary. My "ethical bribe" to Urbach Letter subscribers last month was this upcoming article, and the promise that you would get much more out of it if you completed my survey as part of a mini case study. A second part of the bribe was the promise to share the results of my survey with you. This can be a particularly powerful motivator in some instances. If you're collecting information that's of general interest to an industry or community, the promise to share results may be sufficient motivation. However, if your results are for internal use only, then you'll need to come up with some other bribe. I've had good results from the offer of a random drawing for a PDA or iPod. Another good vertical-industry bribe is the offer of an information product (white paper PDF or CD-ROM document) to everyone who completes the survey. The best items to use as motivators have high perceived value but low actual cost. Info products (special reports, e-books, audio tapes or CDs, etc.) fit these criteria extremely well. Once created, their duplication cost is minimal. Even better, if you make the content available for download, you can have zero duplication/distribution cost.

Recruiting Survey Participants
One of the main reasons for undertaking a survey is to use the answers you get for strategic guidance. However, there is danger in extrapolating from the (always) limited subset of your "universe" that actually completes the survey. To minimize the chance that you'll be led astray (more on this later), you'll want to recruit a representative cross section of your population. This means trying to match the demographics of your target audience as closely as possible. However, it's also important to recruit as wide a variety of people as possible within this demographic slice. If your demographic is single professional females, age 25-44, try to get a good mix of ages within that spread. A final note about recruiting: unfortunately, the people who most readily respond to your request to participate in the survey may not give you the most complete or accurate responses. Even with a big juicy bribe, don't expect a big-company CEO to participate. Sorry, but Bill Gates ain't fillin' out yer stinkin' survey.
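By the way, if your contact list lives in a spreadsheet or database, a few lines of code can handle the demographic matching for you. Here's a minimal sketch in Python (my own illustration, not something from this article or any particular survey tool); the field names, age bands, and target percentages are assumptions you'd replace with your own:

    # A sketch of proportional ("stratified") sampling from a contact list.
    # All names here (age_band, example.com addresses, quotas) are made up.
    import random
    from collections import defaultdict

    def stratified_sample(contacts, key, target_shares, n_invites, seed=42):
        """Pick about n_invites contacts so each demographic group appears
        in roughly the proportion given by target_shares."""
        random.seed(seed)
        by_group = defaultdict(list)
        for person in contacts:
            by_group[person[key]].append(person)
        invites = []
        for group, share in target_shares.items():
            pool = by_group.get(group, [])
            quota = min(len(pool), round(share * n_invites))
            invites.extend(random.sample(pool, quota))
        return invites

    # Example: invite about 200 people, matching an assumed age mix.
    contacts = [{"email": f"person{i}@example.com",
                 "age_band": random.choice(["25-29", "30-34", "35-39", "40-44"])}
                for i in range(5000)]
    target_shares = {"25-29": 0.25, "30-34": 0.25, "35-39": 0.25, "40-44": 0.25}
    invitation_list = stratified_sample(contacts, "age_band", target_shares, 200)
    print(len(invitation_list), "survey invitations selected")

The idea is simple proportional sampling: each demographic slice gets invitations in rough proportion to its share of your target audience, so no one group dominates your results by accident.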

Art and Science
Like many other interesting activities, survey design is a blend of art and science. It goes without saying that you must have clear objectives, or you won't derive maximum value from your effort. Your survey can provide you with two different kinds of data: quantitative and qualitative. Quantitative data can be expressed in numerical measures: dollar figures, percentages, counts, quantities, etc. When you're doing a medium to large scale survey, quant data is important because it can be "boiled down" and summarized in tables and graphs.

Qualitative data is fundamentally different. It's subjective and "fuzzy." The main source of qual data is the "fill in the blank" questions on your questionnaire form. (You'll also get qualitative responses from one-on-one interviews and focus groups.) While not as "scientific" as quantitative data, you'll find reading these fill-in responses to be incredibly enlightening. Even if you know your survey respondents personally, they may write in things they'd never tell you otherwise: "I was treated rudely when I visited your office," "Your web site is ugly," or "Your competitor's product is easier to use." Even, "You've disappointed me for the last time!" Ouch! You must be prepared to hear this kind of harsh criticism, and even more importantly, act on it! If you want honest answers, please make sure you ask straightforward questions. Don't ask, "How are you enjoying our new easy-to-use web site?" or "Do you agree our competitor has been unethical?"

The Unasked Question
This seems obvious, but you only get answers to the questions you ask. The quality of your results will depend on your ability to pose coherent questions. People will not, in general, provide additional "fill-in-the-blank" information unless you direct them to do so. Also, even if you're posing good questions, your results will be "contaminated" with sampling error and bias. Sampling error means the subset of your universe that took the survey doesn't represent the whole. Bias means you've exerted undue influence in some way. Usually this influence is unintentional, but "sponsored" research surveys, such as those funded by the tobacco, nutritional supplement, or dairy industries, are usually rife with biased questions (e.g.: "Do you agree that drinking milk builds strong bodies twelve ways?").

Open versus Closed Format Questions

Open Format (i.e. "Fill in the Blanks")

  Advantages:
  • Wide range of possible responses.
  • Good when you don't know beforehand what the responses might be.
  • Can collect "soft" subjective responses.

  Disadvantages:
  • Cannot easily compile responses.
  • Impossible to apply statistical analysis or chart results.
  • Discriminates against less literate respondents.
  • Relatively fewer people will complete fill-in-the-blanks.

Closed Format (Forced Choice)

  Advantages:
  • Easy to fill in. Quick.
  • Easy to analyze quantitatively.
  • Can vary form throughout the survey to maintain engagement [check boxes, radio buttons, relative rankings, ratings (A B C D F), degree of agreement, etc.].

  Disadvantages:
  • You won't get answers you haven't already thought of.
  • After a certain number of responses are collected, you know "everything," and no breakthroughs will occur.

Guidelines for Framing Questions

1. Ask for only one piece of information per question.
Don't muddy things by asking for multiple answers to a single question. For example, "Do you like the color and texture of the seat covers?" This should be split into two separate questions.

2. Avoid negatives.
It's OK to ask people what they don't like, however, be careful how you phrase the questions. For example, "Do you agree the concert series was not inclusive?" Much better to rephrase this as, "Do you agree the concert series excluded some important music genres?"

3. Keep your questions short and simple.
A rule of thumb: fewer than fifteen words and no more than two commas. If your sentence has three or more clauses, you should rephrase it, regardless of the sophistication of your audience. Even very literate people don't enjoy decoding multi-tiered questions.

4. Ask precise questions.
Be sure the words you choose don't have confusing multiple meanings. For example, the word "run" has 172 different meanings, according to the Random House unabridged dictionary here in the library. For another example, the word "actionable" has a completely different meaning to a marketer than it does to a lawyer.

5. Provide context.
Failing to provide a frame of reference is one of the most common errors people make in survey design. For example, in the question, "How often do you rent movies?" the time reference is missing. Is that per week, month, or year? You'd get much more meaningful information if you asked, "How many DVDs and videotapes have you rented in the past month?"

Socially Acceptable Answers (versus what they really think)
It's human nature. We tend to answer in ways we think are "proper" or socially acceptable, even if these answers don't reflect our personal reality. For example, if you ask the question, "At what hour do you usually wake up?" (circle one):

5 AM    6 AM    7 AM    8 AM    9 AM

the answers you get will be biased. Because people don't want to be regarded as indolent, and because early risers are considered more productive, disciplined, and successful, many people will *say* they get up earlier than they really do. The context in which you're asking a specific question can also bias responses. In the previous example, if the earlier questions in the survey were concerned with time management, work ethic, and income goals, that'll bias responses toward even earlier rising hours. If prior questions had to do with quality-of-life issues, peak productivity hours, sleep deprivation, fatigue, mistakes/accidents, etc., it becomes more socially acceptable to admit rising at a later hour. The problem of bias is exacerbated in one-on-one interviews and focus groups. That's an important advantage of online surveys, particularly if you assure respondents of anonymity.

How long should your questionnaire be?
There's no hard and fast rule. If you don't ask enough questions, you won't get the answers you need, and your effort could be wasted. Ask too many, and people will be disinclined to participate – or may bail before completing the entire survey, corrupting your results. In general, the more niched your audience, the longer you can make your survey. People in a select group are more motivated to share their experiences and opinions with you. As noted before, they're also strongly motivated by the promise to share the survey results with participants. If you're still looking for a rule of thumb, try the Urbach 10/10 rule: no more than 10 questions, or no more than 10 minutes to complete the survey, start to finish.

Ordering and structuring your survey
If you want to maximize the quality and quantity of your responses, you must pay attention to the ordering of your question list. The exact same set of questions, presented in changed order, can yield very different participation levels and answers. Here are some guidelines:

  • Start off easy. The first question should not require deep thought to answer. Example: "Do you like cats?" Yes/No.

  • Begin with questions relevant to the audience. Make certain your early questions are both engaging and relevant to the people participating. If you're surveying interior designers, an example might be, "This year, have more of your clients requested formal entertaining spaces, compared to last year?"

  • Start with closed-format questions. Multiple choice checkbox and yes/no questions are quick and easy to answer. Don't start your survey with an open-format fill-in-the-blank question; that's a sure way to kill response.

  • Transition from the general to the specific. In a series of questions concerning color preferences, you might order questions like this:

  1. Color affects your mood
    [strongly agree - - - - strongly disagree]

  2. In general, do you prefer (choose one):
    [earth tones]   [primary colors]   [pastels]

  3. What color is your car?
    [silver]   [black]   [white]   [blue]   [red]   [green]   [other ____ ]

  4. If the XYZ product were offered in either deep purple or sky blue, which color would you favor?
    [deep purple]   [sky blue]

  5. If XYZ were made available in a custom color of your choosing for an additional charge of $125, would you be interested in this option?
    [Yes]   [No]

Finally, if you gather demographics (age, ethnicity, sex, income, etc.) or ask personal questions (regarding health issues for example), put them at the end. After people have invested time in the survey, they feel more involved in it, and will be more likely to comply.

More tips:
If you can still gain useful results from a partially completed survey, make this clear in your instructions. When you're trying to reach busy executives, for example, you'll boost their response rate by stating that they don't have to complete the entire survey. If you do an online survey, be aware that you're automatically going to bias results towards people who are already comfortable with computers. This is much less of a consideration today than it was just a few years ago, now that plenty of grandmas, ditch diggers, and trailer folk are online.

Confidence Level
The survey has been completed and you've collected the responses. Now, how certain are you that the results you've gotten are valid? Are you confident that the responses of the limited group you surveyed are reflective of the population at large? Sometimes your actual population of interest can be quite small, possibly in the hundreds or low thousands. Nonetheless, for 100% confidence, you have to survey every single person in your "universe." In nearly every case, this simply isn't feasible. So, how confident do you need to be? Most of us are satisfied with a "confidence level" of 0.95 (95%). That means we choose enough people to end up with a statistically valid survey reflecting the responses of the larger group nineteen times out of twenty. You may have a faint glimmer of recognition of the term "confidence level" from that statistics course you took half a lifetime ago. That's where it comes from. You *can* apply some heavy duty statistical analysis to your survey design and results, but for our semi-casual application, very little statistics knowledge is actually required.

You've undoubtedly seen political polls reported like this: "21% of Californians believe George Bush secretly chews tobacco. 34% believe he likes cats. This survey had a margin of error of plus or minus 3%." I guess we can live with a three percent error on that. But how much error on *your* survey can you live with? Fortunately, you can often dial in the accuracy you're willing to accept (i.e. pay for). If your universe is large, your survey's margin of error is determined mainly by how many people you sample. To get more accurate results, survey more people. How many? This chart will tell you:

Survey Size     Margin of Error
  100           +/- 10%
  300           +/-  6%
  500           +/-  5%
1,000           +/-  3%

Stated another way, if your universe of interest is large (e.g. homeowners, Republicans, dentists), and you choose 100 people at random to interview, you can be 95% confident that the answers you collect will represent the entire "population," although those answers could vary up to ten percentage points from the "real" values. For example, if your survey reports that 63% prefer your gizmo in purple instead of yellow, the true figure for the entire population could be anywhere between 53% and 73%. Maybe that 20-point range is too broad for you to base production quotas upon. The simple solution is to interview more people. If you survey 1,000, and get the same 63% preference response, you can project that between 60% and 66% of the entire population prefers purple.
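If you're curious where those chart numbers come from, they follow from the standard formula for the margin of error of a proportion at 95% confidence: 1.96 times the square root of p(1-p)/n, evaluated at the worst case of p = 0.5. Here's a small Python sketch (my own illustration, not taken from any particular survey tool) that approximately reproduces the chart and also works backwards from a desired margin of error:

    # 95%-confidence margin of error for a reported proportion, assuming a
    # large "universe" and the worst-case split p = 0.5.
    import math

    Z_95 = 1.96  # z-score corresponding to a 95% confidence level

    def margin_of_error(sample_size, p=0.5):
        """Half-width of the 95% confidence interval for a proportion."""
        return Z_95 * math.sqrt(p * (1 - p) / sample_size)

    def sample_size_needed(margin, p=0.5):
        """Completed surveys needed to hit a target margin of error."""
        return math.ceil((Z_95 ** 2) * p * (1 - p) / margin ** 2)

    for n in (100, 300, 500, 1000):
        print(f"{n:>5} responses -> +/- {margin_of_error(n) * 100:.1f}%")
    print("For +/- 3%, plan on about", sample_size_needed(0.03), "responses")

Notice the diminishing returns: because the margin of error shrinks with the square root of the sample size, quadrupling your sample only cuts the error in half.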

What to Look For in Your Results
Look for the big picture: trends, similarities, and contradictions. Your results may show a very high or very low response in a particular area. You'll spot this instantly in the charts and graphs. Think about what possible factors contributed to this strongly represented option. You may uncover a previously unknown, very lucrative demo/psychographic niche to serve. Or a big problem that needs attention ASAP!

On the other hand, if your responses are consistently favorable across a wide range of demographic groups, you may have uncovered an aspect of your product or service with universal appeal, and can now apply a niched marketing strategy to a much broader audience.

You might see a puzzling contradiction in your survey results. Perhaps 93% of your participants favor your new forest green model, but the other 7% absolutely hate it. That's strange. Color preference isn't usually this bipolar; responses normally fall along a gradual spectrum. Looking further, you see that 99% of the female respondents favor the green gizmo, while only 89% of males do. Before chalking this up to plain gender preference (which would be obvious if the color were something like puce, rather than forest green), you might suspect color blindness as a factor. Between 8 and 12 percent of males of European origin have some degree of red/green color blindness, but less than half of one percent of females are color blind.
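To see how those pieces line up, here's a quick back-of-the-envelope check. The 50/50 gender split is my own assumption for illustration; the example above doesn't state the actual mix:

    # Assume (hypothetically) the respondents are split 50/50 male/female.
    female_dislike = 0.01   # 99% of women favored the green model
    male_dislike = 0.11     # only 89% of men did
    overall_dislike = 0.5 * female_dislike + 0.5 * male_dislike
    print(f"Expected overall dislike: {overall_dislike:.0%}")  # ~6%, close to the 7% observed
    # And the 11% male dislike rate falls inside the 8-12% range of
    # red/green color blindness among males of European origin.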

Finally, take the time to read as many of the fill-in responses as you can. I guarantee you'll discover some things that'll absolutely shock you.

Congrats. You've reached the end of this article. You now have all the tools and information you need to put together a meaningful survey. Now what are you going to do? Sadly, the majority of the people reading this will take the "do nothing option," and continue to waste money in areas of little interest to their customers or stakeholders. I don't concern myself with those folks. I write these articles for you, the top five percenter, a person who is intensely interested in gaining the crucial information you need to build your business, advance your career, and dramatically increase your income.


(c) Copyright 2002-2010 Victor Urbach
This article may be reprinted with permission and attribution.