Best practices for conducting B2B quantitative surveys
July 6, 2020

The pros and cons of doing surveys ‘in-house.’
The arrival of DIY research tools has changed the industry in many ways. To give one example – traditional research buyers (brands, agencies) are conducting more quantitative research ‘in-house’ rather than using specialist research agencies.
There are lots of benefits to doing a quantitative survey in-house:
- Superior knowledge – agencies are experts in research, and may even be experts in a specific industry. With a proper briefing, they can quickly learn a lot about a client’s business or a particular project. However, they can never understand the project or the business as well as the client does
- Giving a project to an agency means giving up some control. You have less oversight of any issues, and therefore less quality control. Agencies also have their own way of doing things – for example, of writing questionnaires – which might differ from yours. That may not be a significant issue, but it can add to the time you spend adapting materials
- Ultimately, it’s cheaper to do research in-house
There are also some downsides:
- If the survey is poorly designed, it will reflect poorly on the brand conducting the research. That is particularly true in B2B markets, where the target audience is small and relationships are critical. If a major customer receives a survey that is full of errors and ill thought-out, they might question how much they are valued
- Similarly, the research may not meet best-practice guidelines around data privacy. That could not only cause reputational damage, but also lead to unforeseen costs, e.g., GDPR fines
- Lack of independence. Internal stakeholders are more likely to let their opinions influence the design or analysis of the survey. Even if that isn’t the case, their recommendations may not be as trusted internally as an independent third party’s
- Ultimately, low-quality B2B market research can lead to bad decisions. If a survey is poorly designed, or too few interviews are conducted, then a company might be misled by the data
Here’s the Adience view on best practice in conducting B2B surveys yourself. For more context about quantitative research – including when to use it, and how to do it – click here.
If you wish to know more about B2B market research, then see the Adience B2B market research agency page here.
#1. Clearly define your business objective
#2. Clearly define the target audience and sample size
#3. Decide on the survey methodology
#4. Decide how to incentivize the target audience
#5. Create the questionnaire structure
#6. Write the questions
#7. Pilot the survey
#8. Launch and manage the survey
#9. Process and analyze the results
#1. Clearly define your business objective
Ask yourself the following questions:
- What are my goals for the survey?
- How will I use the information that results from the survey?
- What specific information do I need to reach these goals?
Then write a short statement detailing your goals and motivations, as well as listing any specific information you need to reach your objective.
Don’t worry about the order or wording of these information objectives. The goal at this stage is just to be comprehensive.
The short statement and list of information goals will keep you focused throughout the process.
#2. Clearly define the target audience and sample size
B2B decision-making tends to be complicated, as multiple individuals can be involved. Think about which roles to include in the survey.
Typically, you want to speak to the individuals who make or influence the final decision to buy the product category covered by the research.
Similarly, think about the mix of organizations that you would like to respond. For example, you may have decided to focus on US SMBs.
For the research to be useful, the interviews need to be representative of the US SMB population. That would mean ensuring a mix of different sectors and size segments.
Once you’ve defined the target audience, you need to decide how many interviews are required. The more interviews, the more statistically robust the results will be.
But the target audience may be so small that a high number of interviews isn’t possible without budgets being very high. Try to maximize the number of interviews, but be realistic about what is feasible.
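The trade-off between sample size and statistical robustness can be made concrete with the standard margin-of-error formula for a proportion. This is a minimal sketch in Python – the sample sizes are purely illustrative, and the formula assumes a simple random sample:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate margin of error (95% confidence by default) for a
    proportion p observed in a simple random sample of n interviews."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst-case (p = 0.5) margins for some typical B2B sample sizes:
for n in (50, 100, 200, 400):
    print(f"n={n}: ±{margin_of_error(n):.1%}")
# n=50: ±13.9%, n=100: ±9.8%, n=200: ±6.9%, n=400: ±4.9%
```

Note the diminishing returns: doubling the number of interviews only shrinks the margin of error by a factor of roughly 1.4, which is why "maximize, but be realistic" is sound advice.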
#3. Decide on the survey methodology
B2B market research surveys can be completed face-to-face, online (laptop or mobile), or by telephone.
Face-to-face surveys are often unrealistic in B2B research unless you’re conducting interviews at a major conference or event.
Even then, this is only valid if the study is about that event, or if the entire target audience is likely to be at the event. Otherwise, the participants will be biased towards attendees.
Online surveys tend to be a very cost-effective approach to quantitative research. They are generally cheap to set up and execute.
However, they aren’t always possible/realistic. Sometimes surveys need to be conducted using a telephone methodology (also called CATI).
Each approach has dynamics that should be factored into the questionnaire design. For example, online surveys typically need to have shorter questions, but can use different question types (e.g., conjoint exercises are possible).
Several factors influence whether a survey is better suited to a telephone or online approach. Ultimately, it comes down to who you are trying to interview:
- Senior decision-makers tend to be harder to reach, and gatekeepers may protect them. It may be challenging to get them to take part in any type of survey. They certainly won’t take part in an online survey. Therefore, a telephone approach may be the only way to get them to complete the survey. Telephone interviewers can talk their way past gatekeepers, and use a variety of tricks to keep decision-makers engaged in the survey as it progresses
- Specific B2B audiences don’t use computers or the Internet at work (e.g., people on the factory floor). Therefore, it can be easier to interview them by phone
- On the other hand, the telephone approach tends to be more time-consuming, as participants have to listen to each question in detail rather than reading at their own speed. Some time-poor respondents prefer to complete surveys online
- Another consideration relates to whether or not you have access to a list of respondents with accurate email addresses. If so, it is easy to invite them to take part in the research via email and to encourage them to conduct the survey online. If not, options are limited. A small number of roles are available on B2B research panels – e.g., decision-makers at SMBs, mid-management positions at larger companies – but senior decision-makers, especially at larger companies, cannot be accessed via a panel. For these audiences, a telephone survey is the only way to complete interviews reliably at scale
#4. Decide how to incentivize the target audience
B2B decision-makers are a scarce resource, particularly when the study focuses on individuals in senior or niche roles.
The problem is not just that they are scarce. Securing the support of decision-makers is also tricky. Gatekeepers protect them, they have limited time, and their focus is on improving their business, not taking part in research.
However, you can incentivize B2B respondents to take part in the research if you use the right approach.
The most powerful incentives are ‘soft’:
- In most B2B markets, people buy from people, and buyers and sellers tend to have a healthy relationship. Leveraging this relationship is the most powerful incentive of all
- Another powerful soft incentive is appealing to the curiosity of a decision-maker. If the research topic, or technique, sounds interesting, decision-makers are more likely to consider taking part
- B2B respondents are time-poor, so it’s important to emphasize that the time they do spend on the research will benefit them in the long-run. We recommend emphasizing how research participation will help them and their employer through innovations or service improvements
These soft incentives aren’t always possible, and may not be enough by themselves, so ‘hard’/tangible incentives can be required:
- A common approach is to thank respondents for their time with a financial incentive, either a cash payment or a prize draw for a gift card or something like an iPad. That isn’t always appropriate, either legally (due to corporate bribery laws), ethically (due to the optics of giving well-remunerated decision-makers cash for their opinions), or practically (sometimes it isn’t even required)
- A charity donation is a useful alternative – it rewards decision-makers for their time by appealing to their sense of charity, without any of the legal/ethical issues. It also allows clients, or respondents, to direct money to a charity that they support
In our experience, if you’re looking to persuade time-poor decision-makers to participate in research, a mix of soft and hard incentives works best.
#5. Create the questionnaire structure
Questionnaire writing requires two different skills:
- The actual act of writing survey questions and responses involves attention to detail
- But well-written questions aren’t useful if the questionnaire structure is wrong. For example, if the order of questions is confusing, respondents may become frustrated and stop taking part in the survey
These are distinct skills, and many people aren’t able to do both well – certainly not at the same time. Writing questions requires such focus on detail that questionnaire writers lose sight of the big picture of the overall structure.
Treating each skill as a separate step makes it easier for you to perform both actions well. It also allows you to consult with colleagues who may be particularly good at each skill. For example, you could ask a colleague with excellent attention to detail to review the question wording.
Our recommendation is to decide on the questionnaire structure before writing the questions.
Every B2B survey starts with a series of questions to ‘screen and profile’ survey respondents:
- We use screening questions to check that the respondent is a good fit for the survey. For example, checking that they have the correct job title and responsibilities.
- Profiling questions allow us to ensure we get the right mix of respondents. For example, we might ask the respondent how many people are employed by their organization. Doing so helps us to ensure that survey respondents represent a mix of large and small organizations, and a mix of different sectors (e.g., B2B SaaS, logistics, martech, fintech)
We recommend a maximum of 10 screening and profiling questions.
The rest of the survey should be a maximum of 20 questions. Deciding on the order of those questions is the hard part.
We recommend deciding on around 2-3 ‘sections’ for the core survey (in addition to the screening and profiling section). Each section should be on a slightly different topic, and therefore help with a different part of the overall objective.
For example, if you were undertaking a survey to test a new product idea, you might have the following sections:
- Screening and profiling the respondent and their organization
- Understanding the organization’s/individual’s overall mindset, including their strategic priorities and pain points. This section helps us to ensure we know the target audience, and can tailor product features and messaging to resonate with their priorities and pain points
- Exploring the organization’s current or future use of the product category. Doing so helps us to understand expectations for products, so that we can ‘indirectly’ test the product idea. In other words, we want to know if the new product aligns with their expectations or not
- Testing the product itself, including understanding perceived benefits, drawbacks, and the likelihood of future usage. Doing so helps us to refine messaging and features, as well as helping to predict future usage of the product
Once you’ve decided what the sections will be about, you can then start to populate each section with a question order. Use the information objectives, and any background documents, as inspiration.
But also try to think of additional ideas based on the purpose of each section. The goal at this stage isn’t to write questions out in detail, but to think about what the topic of the question should be.
When deciding on the order of sections and questions, the fundamental principle is to start broad and get more specific as the survey progresses. For example, start by asking about someone’s business in general, then ask them about a particular aspect or function of the organization.
#6. Write the questions
Once you know the questionnaire structure, you need to design the actual questions. When doing so, make sure to avoid common mistakes:
- Avoid leading questions. Regularly check your question wording to ensure that you are not deliberately or accidentally leading respondents to a particular answer
- Remove ‘double-barrelled’ questions. Bad surveys often have questions that ask participants to give feedback on multiple things with one response. Questions should ask participants to give their response about one subject, so that it’s clear what their answer relates to
- Avoid overly complicated survey questions. The survey is for B2B decision-makers, but the questions should be intelligible to a typical consumer. That means the language should be simple and free of jargon, to avoid any misinterpretation of the question
- Avoid long questions. B2B decision-makers are time-poor. Seemingly endless questions can be frustrating. The longer the question, the more likely they are to drop out
- Avoid repetitive questions. As with long questions, repetitive questions can be frustrating for the respondent, who may drop out of the survey. Besides, repetitive questions can confuse survey participants and lead to weak data. For example, imagine that two questions in an online survey are almost identical, with only one word changed. In this scenario, the respondent may think the questions are the same, and that there is a mistake in the survey. As a result, they may not consider their response to the second question, and give an identical response
- Include ‘other’ options where relevant. B2B decision-makers tend to do things on their terms. The structured nature of questionnaires can be constraining. Even if you think you have been comprehensive with your answer options, a respondent may think the answer options are insufficient. If there is no ‘other’ option, they may drop out of the survey
- Include ‘don’t know’ options where relevant. We assume that B2B decision-makers know everything about their business. But they don’t always. For example, it’s realistic for a Purchasing Director not to know the exact number of employees in their organization. If there is not a ‘don’t know’ option, they must either select an option that isn’t true, or drop out of the survey
- Don’t ask for too much personal information. B2B decision-makers are understandably wary about sharing confidential information about their business. Even if you can reassure them that the survey is legitimate, there is a limit to what they are happy to share. So if you ask for too much confidential information, they will drop out
- Try to include 1 or 2 open questions where respondents can type anything. Open questions are an excellent way to obtain information that structured questions cannot provide. And it means that you can capture some qualitative insights without needing to conduct a separate qualitative stage of research
- Finally, be clear about how you will use their information, and make sure to ask for permission to use their contact information or responses for anything other than research analysis. Even if you’re only using their information to communicate about a prize draw, you need their approval
Ultimately, the most important thing is to pay attention to detail. Check each question and answer option to make sure it avoids the mistakes above. But also ensure that questions are grammatically correct and don’t give an incorrect impression of the brand conducting the survey.
#7. Pilot the survey
We suggest sending the survey to friendly clients or colleagues. Keep it as simple as possible – you can just send a draft version in whatever format you’re working in (e.g., Word).
Their feedback will help you to avoid mistakes like those mentioned above. It will also give you a good idea of how long the survey will take people to complete in reality.
#8. Launch and manage the survey
In B2B research, there are generally five ways to launch a survey:
- Sending it via email – this approach is free and allows you to control who is participating in the survey. The response rate will depend on how well your brand is known, and the strength of the relationship you have with the email contacts. For example, if you just email a list of contacts that you have purchased online, the response rate will be low
- Calling respondents – this approach is similar to distributing via email. It provides control over who participates, and the response rate depends on the strength of relationships with contacts on the list. However, it is more expensive due to the labor costs of those calling people on the list. On the plus side, you can typically target or guarantee a specific number of responses, as telephone response rates are generally consistent and predictable
- Buying respondents through a panel – this approach sounds great in theory. Interviewees have already signed up to take part in research, so you typically know in advance how many responses you will get. You have to pay to access the respondents, but the cost is often worthwhile. However, the reality can be different. First, not all audiences are available on panels – you won’t get senior decision-makers, and any agency that claims that senior B2B decision-makers are present on panels should be treated with caution. Second, you have no control or visibility over who is taking part. Unfortunately, some research panelists aren’t who they say they are. There are several tricks that we recommend to avoid this sort of panelist
- Embedding the survey in a website or newsletter – this can allow you to gather responses by getting the study in front of a potential respondent when they’re close to the brand or product category covered by the questions. It’s typically a cost-effective way of distributing the survey, unless you have to pay for the website banner or newsletter presence. The downside is that you cannot guarantee how many responses you will achieve. Besides, you cannot control who is taking part in the research
- Finally, distributing the link on social media – the same considerations apply as for ‘embedding the survey in a website’
Typically, we recommend using multiple methods to launch a survey, as it allows you to obtain a mix of benefits. For example, if the target audience is relatively junior, we might recommend launching the survey to database contacts via email as well as buying respondents from a panel.
This hybrid approach would allow a client to guarantee a minimum number of responses (via the panel), while distributing the survey to a broader number of people cost-effectively (via email).
Regardless of the approach, there are a few principles to bear in mind when launching and managing the survey:
- Tell participants the deadline for completing the survey. Some people won’t pay attention unless there is a deadline. Also, you don’t want responses arriving once you’ve started to do analysis
- Don’t just distribute the survey once. Reminders are acceptable and encouraged. Perhaps a potential participant was on holiday when you first invited them. For example, if distributing the survey via email, we recommend sending two targeted reminders. The first reminder should emphasize a different ‘incentive’ to the initial invite. The second reminder should emphasize the fast-approaching deadline. Additionally, you can tailor language according to whether the participant has started the survey (“almost there…”) or not
- Check the responses while interviewing is happening, rather than waiting until the end of the survey. There are a few benefits to doing this: you can identify, and remedy, quality control issues early on; early results allow you to start thinking about what the final story and recommendations might be; you can remove any individuals whose responses are incomplete or incomprehensible
#9. Process and analyze the results
When a survey is complete, the temptation is to immediately start analyzing and building a report.
However, this can lead to low-quality research reports:
- You need to check and format the data before you start analyzing it, otherwise you’ll be referring to incorrect or incomplete data
- You need to analyze the data and understand what the story is before you start to write the report. Doing so helps to ensure the report won’t have a confused narrative, or be too long
Checking the data.
Start by looking at your data. You are looking for the following:
- Individuals who haven’t adequately answered every question
- Individuals who have ‘flat-lined,’ i.e., completed the survey too quickly by just clicking through questions without thinking about their responses
- Individuals whose answers don’t make sense, e.g., saying they haven’t heard of Brand X, but also saying that it is their favorite brand. Note: you probably shouldn’t have allowed that combination in the questionnaire, but mistakes happen!
We suggest removing all of these individuals from the final data.
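The three checks above can be partly automated. This is a hedged sketch in Python using pandas – the column names, brand values, and the ‘flat-liner’ threshold (completing in under 30% of the median time) are all illustrative assumptions, not a fixed rule:

```python
import pandas as pd

# Hypothetical respondent-level export (column names are assumptions):
# one row per respondent, with completion time and two related questions.
df = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "seconds_to_complete": [480, 95, 510, 430],
    "q1_aware_brand_x": ["No", "Yes", "Yes", "Yes"],
    "q2_favorite_brand": ["Brand X", "Brand Y", "Brand X", "Brand Z"],
})

# Flag likely 'flat-liners': much faster than the median completion time.
median_time = df["seconds_to_complete"].median()
too_fast = df["seconds_to_complete"] < 0.3 * median_time

# Flag contradictory answers: favorite brand is one they claim not to know.
contradictory = (df["q1_aware_brand_x"] == "No") & (df["q2_favorite_brand"] == "Brand X")

clean = df[~(too_fast | contradictory)]
print(clean["respondent_id"].tolist())  # respondents kept in the final data
```

The same pattern extends to incomplete responses: add a flag per check, combine them, and keep an audit trail of who was removed and why.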
Formatting the data.
Most survey platforms provide a database output that has one respondent per line and one question per column.
This format is useful, as it allows you to do calculations using Excel formulas and pivot tables.
However, this is often a time-consuming way of doing analysis. Data tables allow you to format the data in a more user-friendly way.
Tables take a bit of time to set up, but are worth it because they automate analysis that you would be doing manually.
To set up tables, you need to think about two things:
- What analysis do you want to do for each question? For example, you might want to order responses according to the frequency of selection. Or you might want to see mean scores on specific questions
- In addition to looking at the data overall, what different ‘cuts’ of the data do you want to see? For example, do you want to compare the responses of UK respondents to US respondents? Once you have defined this, you can create a list of ‘banners’ that will allow you to understand the story by audience easily
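Working from the one-respondent-per-row export described above, a simple banner cut can be produced in a couple of lines with pandas. The column names and answer values here are illustrative assumptions:

```python
import pandas as pd

# Hypothetical respondent-level export: one row per respondent.
df = pd.DataFrame({
    "country":   ["UK", "UK", "US", "US", "US", "UK"],
    "would_buy": ["Yes", "No", "Yes", "Yes", "No", "Yes"],
})

# A simple 'banner': share selecting each answer, cut by country.
# normalize="columns" converts counts to within-country proportions.
table = pd.crosstab(df["would_buy"], df["country"], normalize="columns")
print(table.round(2))
```

A full banner plan would add one column group per audience cut (e.g., company size, sector) rather than a single `country` split, but the mechanics are the same.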
Analyzing the data.
There are multiple techniques for analyzing data. Choosing the right one is the most crucial step in the entire survey.
This article is not designed to share all the analysis techniques, but here are some principles to bear in mind when conducting analysis:
- Look for statistically significant stories. Some differences in data are caused by the ‘margin of error.’ In other words, by the fact that you’ve not interviewed enough people
- Analyze every question, but don’t include a question in the final report just because you asked it. If it doesn’t tell you anything interesting or unique, consider whether you need it
- Remember the project objective when doing the analysis. Doing so will help to ensure that you’re focusing on stories linked to the core project goals, rather than stories that are interesting but not essential
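One common way to separate real stories from margin-of-error noise is a two-proportion z-test, sketched below with hand-rolled math. The 60%-vs-45% figures are invented for illustration; at the 95% level, |z| > 1.96 suggests the difference is unlikely to be sampling noise:

```python
import math

def two_proportion_z(success_a: int, n_a: int, success_b: int, n_b: int) -> float:
    """Z statistic for the difference between two observed proportions,
    using the pooled estimate for the standard error."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# e.g. 60 of 100 UK respondents vs 45 of 100 US respondents answered 'Yes'.
z = two_proportion_z(60, 100, 45, 100)
print(f"z = {z:.2f}")  # |z| > 1.96 => significant at the 95% level
```

With typical B2B sample sizes of 100–200 per cell, differences of under about 10 percentage points often fail this test, which is exactly the ‘margin of error’ trap described above.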
Summary
#1. Clearly define the business objective.
Remind yourself about the project objectives, and write a short statement recapping your goals, as well as listing any specific information needed to reach these goals.
#2. Clearly define the target audience and sample size.
Think about which roles to include in the survey – typically the individuals who make or influence the decision to buy your product.
Similarly, think about the mix of organizations that you would like to respond.
#3. Decide on the survey methodology.
Typically, you will be deciding between an online or a telephone methodology. Online surveys are generally preferred, but may not always be possible (e.g. because senior decision-makers just won’t respond to an email inviting them to take part in an online survey).
#4. Decide how to incentivize the target audience.
Use a mix of ‘soft’ and ‘hard’ incentives to persuade time-poor decision-makers to take part.
#5. Create the questionnaire structure.
Aim for 2-4 sections, each with a specific topic that helps with the overall objective. Within each section, think about the order of the questions you want within that section. Use the information goals from step 1 as inspiration. At this stage, don’t write the questions out in detail – the goal is to make sure the survey will be comprehensive and have a logical flow.
#6. Write the questions.
Ultimately, the most important thing when writing survey questions is to pay attention to detail. It is important to avoid common mistakes such as: leading questions; double-barrelled questions; overly complicated, or long, questions; repetition; not offering an ‘other’ or ‘don’t know’; asking for too much personal information; not being clear about how you will use respondents’ information.
#7. Pilot the survey.
Send the survey, in whatever format you have it, to friendly clients or colleagues, so they can suggest improvements.
#8. Launch and manage the survey.
Decide which of the 5 methods you’re going to use to launch the survey. Typically, we recommend using multiple methods, as it allows you to gain a mix of benefits.
Additionally, when launching the survey, there are a few pieces of best practice to consider: tell participants the survey deadline; don’t just distribute the survey once – reminders are acceptable; check responses while the survey is live rather than waiting until the end of the survey.
#9. Process and analyze the results.
Don’t just jump straight to building a report. There are three important steps to take before you start reporting.
First, checking the data for individuals who should be removed from the data (e.g. because their answers don’t make sense).
Second, formatting the data (e.g. into tables) to make the analysis process quicker.
Third, when analyzing the data, remember to focus on statistically significant differences. Also, remind yourself of the project objectives to ensure you’re building a report that will be as relevant as possible.

Author
Chris Wells
Chris Wells is a B2B marketing researcher and strategist. He was previously on the management team at B2B research specialist Circle Research, winners of the Best Research Agency at the 2016 MRS Awards. Chris has helped to deliver hundreds of research and strategy projects for B2B organizations.