How to Optimize Your Customer Satisfaction Surveys

From the Voices on Project Management Blog
Voices on Project Management offers insights, tips, advice and personal stories from project managers in different regions and industries. The goal is to get you thinking, and spark a discussion. So, if you read something that you agree with--or even disagree with--leave a comment.


Categories: Best Practices

Customer satisfaction surveys are one of the most widely used feedback mechanisms. I have conducted several surveys for internal tools used by engineers at the companies I have worked for, and here I summarize my experience. While I focus on internal surveys, most of what I describe applies to external surveys as well.

Before starting any survey, think through the three questions—why, what and how:

1. Why are we counting? Creating a survey, administering it, analyzing the results, and acting on them all take valuable time, and respondents must spend time as well. Without a clear “why,” it is a waste of time and effort, so always start with the “why.”

2. What are we counting? The next obvious question is the “what.” Determine exactly what you are going to count, and ensure there is no ambiguity in the attributes you plan to measure.

Also determine which metric you are going to use. There are several: Net Promoter Score (NPS), Net Satisfaction Score (NSAT), Customer Satisfaction Score (CSAT), etc. In my experience, NPS is most often used for external surveys, typically as a single question followed by an optional open-ended question for feedback; that may not give you a strong enough signal for internal tools. NSAT and CSAT are the most common metrics measured for internal tools.
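As a rough illustration of how two of these metrics are scored: NPS is the percentage of promoters (9–10 on a 0–10 recommendation scale) minus the percentage of detractors (0–6), and CSAT is commonly the percentage of responses at 4 or 5 on a 1–5 satisfaction scale. NSAT definitions vary by organization, so it is omitted. The sample ratings below are made up for illustration:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)
    on a 0-10 'how likely are you to recommend?' scale."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

def csat(scores):
    """Customer Satisfaction Score: % of responses that are 4 or 5
    on a 1-5 satisfaction scale."""
    satisfied = sum(1 for s in scores if s >= 4)
    return 100 * satisfied / len(scores)

recommend_0_10 = [10, 9, 9, 8, 7, 6, 3]   # 3 promoters, 2 detractors of 7
satisfaction_1_5 = [5, 4, 4, 3, 2]        # 3 of 5 satisfied
print(round(nps(recommend_0_10)))          # prints 14
print(round(csat(satisfaction_1_5)))       # prints 60
```

Note that an identical set of respondents can produce very different-looking numbers under different metrics, which is one more reason to pick the metric up front and stick with it across survey rounds.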

3. How are we counting? To eliminate any biases or fallacies, we need to determine how we are going to count. Here are some sub-questions to think about:

  • How many people are we going to survey? Make sure you have a statistically significant sample size before drawing conclusions, so you do not fall prey to the base rate fallacy.
  • Do we have a representative sample? Make sure the survey covers the different personas that use the internal tool. For example, if it is a reporting tool, executives, engineers, and researchers might all be personas.
  • Are the definitions clear? Ensure that people cannot interpret definitions differently, and if you use any abbreviations or acronyms, spell out in the survey what they mean.
  • Framing the questions will impact the survey responses. Keep the following in mind:
    • Pseudo-opinions - People will give an opinion even if they do not have one. To prevent this, include options like “Don’t know enough to say” or “Don’t know.”
    • Answer sets - Open answer sets let people give their automatic perceptions, while closed answer sets provide options the user might not have thought about; closed sets get higher completion rates but also have the potential for more extreme answers. Ensure your survey mixes closed and open questions.
    • Response scales - Scales will skew the data. For example, if you want to determine how often users use the tool, the answer set could be daily/weekly/monthly, or once/twice/three times a week; each yields different data, so think through which scale makes more sense.
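The sample-size question above can be made concrete with the standard formula for estimating a proportion, n₀ = z²·p(1−p)/e², plus a finite-population correction, which matters for small internal audiences. The defaults below (95% confidence, 5% margin of error, conservative p = 0.5) and the example population of 500 users are illustrative assumptions, not figures from the post:

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Minimum number of responses needed to estimate a proportion
    at ~95% confidence (z=1.96), with a finite-population correction
    applied for small internal audiences. p=0.5 is the most
    conservative (largest-sample) assumption."""
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2
    n = n0 / (1 + (n0 - 1) / population)  # finite-population correction
    return math.ceil(n)

# E.g., a hypothetical internal tool with 500 users:
print(sample_size(500))  # prints 218
```

Notice how quickly the required fraction grows for small populations: a 500-person audience needs well over 40% of users to respond, which is one reason low response rates are a bigger threat to internal surveys than to large external ones.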

Here are some dos and don’ts to keep in mind when you design a survey:

Dos:
  1. For every question you want to include in the survey, think about what you are going to do with the responses.
  2. Keep the number of questions to the absolute minimum.
  3. Anonymous surveys ensure that the respondents are candid; however, the drawback is that if you have any follow-up questions, you will not know who submitted the feedback. My recommendation is to go with non-anonymous surveys for internal tools.
  4. Always follow up on the feedback from a survey and publish the results. Let respondents know how the results have been used; this encourages them to respond the next time.
  5. Be mindful of the number of times you send out a survey and carefully choose the cadence. I have seen quarterly, half-yearly and yearly cadences. Choose the one that gives you enough time to act on the feedback.


Don’ts:

  1. Do not ignore survey fatigue. It is real, particularly for internal surveys.
  2. Do not use a survey if there are other ways to get meaningful feedback.
  3. If you are not going to use the responses to a survey question in any meaningful way, do not include that question in the survey.
Posted by Sree Rao on: December 01, 2021 09:16 PM | Permalink

Comments (3)

Dear Sree,
Very interesting theme that prompts reflection and debate. Thanks for sharing your opinions.

It is very important to validate the questions before carrying out the study. Validating the questions serves multiple objectives, among them determining whether the respondent understood what we are trying to ask.

Thanks for this article. This is a useful checklist when compiling a survey for internal or external use. I am often tempted to lend more weight to some more actively involved respondents and to slightly ignore others. Perhaps this is just bias, and every answer should command the same respect.

Thanks for this article
I've gotten some useful insights on how to go about my surveys

