Applying Continuous Improvement to Survey Design

Photo by Matt Ridley on Unsplash

I love it when companies survey their current and potential users. As a user, I feel reassured that their future direction will likely bring me more value and convenience. As an idea person, I love the opportunity to share my thoughts with people in a position to act on them.

As much as I enjoy channeling my product manager mindset through user surveys, I only complete about half of the surveys I start. Sometimes, I close the browser tab after ten minutes of unpaid clicking and typing through a stream of screens with no end in sight. The words “in detail” are another turn-off. If I put in the time to write a detailed essay, it had better be on this blog where all 30 of my followers can see it 😂 The most common X-out factor, though, is being limited to choosing from multiple partially accurate (or completely inaccurate) boxes. I figure the company isn’t targeting people like me and move on, recognizing that the company’s future directions are likely to lead away from my needs.

Now that I’m working with multiple startups on their product-market fit journeys and experiencing the joy of finding new personas whose needs our products meet, I want to do more than X out. I want to help companies seeking my user feedback understand demographics they might be overlooking. Yet, when a survey doesn’t create space to portray my use case or needs accurately, tracking down other avenues to provide my insights rarely feels like a meaningful use of my time, especially since most product communities I’ve encountered seem focused on bugs rather than ideas.

I recognize that many companies aren’t set up to act on feedback that doesn’t fit predefined perceptions about users. I also recognize that many companies do want feedback that expands their view of their users. It’d be nice to have an easy way to tell these categories apart.

Solutions I’ve thought of:

  • For multiple choice questions:
    - Allow for selecting multiple categories and optionally ask the respondent to clarify how they fit multiple categories.
    - Include an “other” category with a text field for any multiple choice question where the choices don’t cover the entire human race.
  • Include a space for feedback on the survey itself, ideally at the top for people who don’t feel the survey, as written, is worth their time to complete.
  • Make every question optional and allow for submitting partially completed surveys, to highlight frustration points along the survey and show which questions users do and don’t feel served by.
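The ideas above can be sketched as a simple data model. This is a minimal, hypothetical example; the class and field names are my own invention, not from any real survey tool:

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    """A survey question designed not to box respondents in."""
    prompt: str
    choices: list                # predefined categories
    multi_select: bool = True    # allow fitting more than one category
    allow_other: bool = True     # "other" choice with a free-text field
    optional: bool = True        # every question can be skipped

@dataclass
class Response:
    """One respondent's answer to a single question."""
    selected: list = field(default_factory=list)
    other_text: str = ""         # clarification or an uncovered category
    skipped: bool = False        # partial submissions are still submissions

def skip_rate(responses):
    """Share of respondents who skipped a question — a signal of
    which questions users do and don't feel served by."""
    if not responses:
        return 0.0
    return sum(r.skipped for r in responses) / len(responses)
```

Because skipped answers are recorded rather than blocked, a high `skip_rate` on one question flags a frustration point without losing the rest of the respondent's feedback.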

Continuous improvement on products, designs, and ideas is limited by a company’s access to current and potential users’ experience. Though user research aims to mitigate bias, surveys can easily cater to some groups better than others and lead to biased samples or responses. Eliminating bias and understanding the truth is a continuous process, but organizations don’t have to do it alone if they create quick, easy, and flexible ways for users to join the discussion.

Call to action: How can you make it easier for your users, team, or friends and family to check your biases?



Enigmas Next Door

Facts & opinions on human centered design, community, & tech, work & online cultures. Welcome to my controverse. Twitter: @enigmasnextdoor Clubhouse: @qubit