What Netflix’s Cancellation Survey Gets Wrong (And How to Fix It)
I canceled my Netflix subscription last week. And, to no one's surprise, I was immediately met with a survey asking me why.
First, let me say: thank you, Netflix, for letting me cancel and then opt into a survey. I’m sure, like me, many of you have had the unpleasant experience of being forced to take a survey in order to cancel. It’s, in my opinion, rude. So yay, Netflix!
Here’s exactly what Netflix asked:
“We’re always improving our service and your feedback matters. Why did you cancel your membership with us? (Select all that apply)”
And these were the response options:
• I didn’t have a good experience with ads on Netflix
• I don’t have enough time to watch Netflix
• Netflix doesn’t provide enough value for the money
• There aren’t enough good movies on Netflix
• I prefer other video streaming services
• Netflix doesn’t offer a specific movie or TV show I want
• Netflix is too expensive for me
• Netflix only allows me to share my account with people I live with
• There aren’t enough good TV shows on Netflix
• Other (please specify): _______
At first glance, this looks like a pretty thorough list. I’d imagine many people believe this is a good survey question.
I am not one of those people.
The whole point of this survey (I assume) is to help Netflix understand why people are leaving so they can make smarter business decisions. That means the survey should be quick and easy to answer and that every single answer choice should be something Netflix can meaningfully act on. But does this survey actually do that? (Spoiler: No.)
Problem #1: Too Many Choices = Survey Fatigue
When people talk about survey fatigue (when respondents stop engaging with a survey the way you intend because it feels like too much effort), they usually mean surveys that ask too many questions. But that's not the only way to cause fatigue (frustration, annoyance, you get the picture). Too many answer choices can be just as overwhelming.
Netflix’s survey had ten different response options—and they weren’t particularly short or easy to skim through.
When you’re asking someone a question, especially on a cancellation survey (um, hi, you’ve already lost their business), you want to make it as easy as possible for them to respond quickly and truthfully. While you absolutely want to be comprehensive and cover the full range of possible responses, you also have to be strategic.
When you throw a long list of options at people, two things can happen:
They skim and pick whatever stands out first. That means they might select an option that’s close to their reason but not quite right.
They get overwhelmed and skip the survey altogether. (I can't be the only one who has seen a long survey question and quickly closed the tab!)
Overwhelm is not good for data quality.
Imagine that you’re Netflix. What is the goal of the survey? You want as many people as possible to answer this question thoughtfully so you can make informed business decisions that lead to greater customer retention. If the list is too long or too difficult to read through, you end up with messy, unreliable data.
Now, it’s not just the number of options that can cause fatigue—it’s also how they’re presented. A lot of surveys use randomization to mix up answer choices, which can sometimes be helpful. But in cases like this, randomization actually makes the survey harder to answer.
Take these two options: “There aren’t enough good movies on Netflix” and “There aren’t enough good TV shows on Netflix.”
These weren’t even next to each other in my survey. That’s a problem.
When people are scanning for their response, they expect similar options to be grouped together. If Netflix is asking the same thing about movies and TV shows separately, those options should be placed next to each other—not randomly scattered in the list. Otherwise, people might miss one, get frustrated, or just select the first thing that feels “close enough.”
So what’s the fix?
Make sure every response is short and scannable.
Cut down the number of choices. Consider whether some options can be combined, and look for choices that aren't actually actionable (more on this next!).
Be mindful of answer order. Group related answers together so people can find them quickly.
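That grouping advice can be sketched in code. Below is a minimal Python illustration of "block" shuffling: the order of the groups is randomized, but options within a group always stay adjacent. The option wording and groupings are just illustrative, not Netflix's actual list.

```python
import random

# Hypothetical option groups: options inside a group stay adjacent;
# only the order of the groups themselves is shuffled.
option_groups = [
    ["Netflix is too expensive for me"],
    ["There aren't enough good movies on Netflix",
     "There aren't enough good TV shows on Netflix"],  # kept together
    ["I prefer other video streaming services"],
]

def shuffled_options(groups):
    """Shuffle group order, but keep related options next to each other."""
    groups = groups[:]            # copy so we don't mutate the caller's list
    random.shuffle(groups)
    return [opt for group in groups for opt in group]

options = shuffled_options(option_groups)
```

This keeps the fairness benefit of randomization (no single option always appears first) without scattering near-identical choices across the list.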
Problem #2: Some of These Answers Aren’t Actionable
My number one priority with ANY survey is that it gets you data you can actually use. That means that every answer choice has a clear meaning and next step.
Let’s take another look at this survey with that in mind.
❌ “I don’t have enough time to watch Netflix”
➡️ This one is especially weird because it’s not something Netflix can control. Instead, the real issue is likely that someone isn’t using Netflix enough to make it worth paying for.
✅ Better option: “I’m not watching Netflix enough to keep my subscription.”
❌ “Netflix doesn’t provide enough value for the money”
➡️ This sounds reasonable, but what does it actually tell Netflix? Does it mean the price is too high? Does it mean the content isn’t good enough? Does it mean the user experience isn’t great? Does it mean the person just wasn’t using Netflix enough to justify the cost?
If a bunch of people pick this option, Netflix doesn’t know what’s wrong. They might lower prices when the real issue is content. Or they might add more content when the real issue is pricing. Vague responses lead to vague business decisions.
And, here’s the thing—Netflix already asks about price and content separately.
If someone thinks the price is too high, they can already select “Netflix is too expensive for me.”
If someone thinks the content isn’t good enough, they can already select “There aren’t enough good movies/TV shows.”
So what is “not enough value for the money” capturing that these other options don’t? Probably nothing. It’s just an unnecessary catch-all that muddies the data. Instead of keeping this vague response, Netflix should just be more direct.
✅ Better option: “The subscription price is too high.” (This fully addresses cost, while content concerns are already covered by other answer choices.)
❌ “I prefer other video streaming services.”
➡️ Again, this initially sounds reasonable, but it's asking about preference, and that's the problem. The question doesn't tell Netflix whether I'm actually switching to another service or just generally like another one better.
✅ Better option: “I’m switching to another streaming service.” (Bonus: Follow-up and ask which one!)
❌ “There aren’t enough good movies on Netflix” AND “There aren’t enough good TV shows on Netflix”
➡️ The way these options are worded makes it sound like a content quality problem, but the real issue might be a variety problem. Maybe Netflix does have good movies and TV shows, but I’ve already watched everything I wanted. That’s a very different insight.
✅ “I can’t find enough movies or TV shows I want to watch.” (Personally, I'm still torn over whether these need to be asked separately… does Netflix need to think about adding more content generally, or does it really matter whether people want more TV shows versus movies? I'd defer to Netflix on this one!)
This keeps the intent of the original question but captures a wider range of reasons why someone might feel like there’s nothing left for them on Netflix.
The takeaway here is simple: Every response option should lead to a clear, actionable insight. Netflix should be able to look at the data and immediately know what changes they might need to make.
Problem #3: The Ads Question Shouldn’t Have Been There for Me
The first option in the survey was: “I didn’t have a good experience with ads on Netflix.”
But my Netflix plan didn’t have ads.
So… why am I seeing this?
A well-designed survey should use either survey logic or embedded data to make sure people only see relevant questions. In this case, Netflix knows which users were on an ad-supported plan. This option should have only been shown to them.
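Here's a minimal sketch of that kind of logic in Python, assuming the survey platform knows each user's plan. The plan identifier and option wording are assumptions for illustration, not Netflix's actual implementation.

```python
# Base answer choices shown to every canceling user (illustrative wording).
BASE_OPTIONS = [
    "Netflix is too expensive for me",
    "I'm not watching Netflix enough to keep my subscription",
    "Other (please specify)",
]

ADS_OPTION = "I didn't have a good experience with ads on Netflix"

def options_for(user_plan: str) -> list[str]:
    """Return the answer choices a given user should actually see."""
    options = list(BASE_OPTIONS)
    if user_plan == "standard_with_ads":  # hypothetical plan identifier
        options.insert(0, ADS_OPTION)     # only ad-plan users see this
    return options
```

One conditional check is all it takes to keep the ads option out of ad-free users' surveys, and with it, the bad data that option would otherwise generate.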
Otherwise, a few things could happen:
People who hate ads might check this option—even if they never actually experienced them.
People might misclick it, adding bad data.
People could get annoyed seeing a question that doesn’t apply to them, making them less engaged in the survey overall.
But the biggest issue? Netflix could misinterpret the results and make bad business decisions.
If Netflix sees a lot of responses saying “I didn’t have a good experience with ads,” they might assume their ad experience is terrible and make unnecessary changes. But if they’re collecting responses from people who never even had ads, they’re making business decisions based on inaccurate data.
How Netflix Should Have Asked This Question Instead
If I were designing this survey for Netflix, here’s how I would have structured it:
Question: Why did you cancel your subscription? (Select all that apply.)
The subscription price is too high.
I’m not watching Netflix enough to keep my subscription.
I’m switching to another streaming service. (Follow-up: Which one?)
Netflix doesn’t have a specific movie or TV show I wanted to watch.
I can’t find enough movies or TV shows I want to watch.
Netflix only allows me to share my account with people I live with.
(for plans with ads) I didn’t like the ads on my Netflix plan. (Follow-up: Too many ads? Not relevant? Disruptive placement?)
Other (please specify): _______
This version does a few important things:
✔️ It eliminates vague or redundant responses and ensures every answer is actionable. Every response choice in this version gives Netflix clear, specific data that points to an actionable next step—whether that’s adjusting pricing, improving content variety, or evaluating the impact of account-sharing policies.
✔️ It streamlines the response options to reduce fatigue while keeping them comprehensive. The wording is clearer, similar responses are grouped together, and unnecessary choices are removed—making it easier for people to read through and select their answer accurately.
✔️ It uses embedded data so people only see questions relevant to them. Ad-supported users get the ads question, while others don’t. This prevents Netflix from collecting misleading data and ensures they’re only asking the right people the right questions.
✔️ It asks useful follow-up questions. If someone is switching to another service, Netflix should definitely want to know which one. Similarly, if people have a bad ad experience, Netflix’s next steps would depend on whether the issue is with the type or quantity of ads.
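The follow-up idea above can be sketched as a simple mapping from trigger answers to the extra questions they unlock. The wording below is illustrative, not Netflix's actual survey.

```python
# Each trigger answer maps to the follow-up question it should unlock
# (hypothetical wording for illustration).
FOLLOW_UPS = {
    "I'm switching to another streaming service":
        "Which service are you switching to?",
    "I didn't like the ads on my Netflix plan":
        "What bothered you most: the number of ads, their relevance, or their placement?",
}

def follow_up_questions(selected: list[str]) -> list[str]:
    """Return only the follow-ups triggered by the respondent's answers."""
    return [FOLLOW_UPS[ans] for ans in selected if ans in FOLLOW_UPS]
```

Respondents who don't pick a trigger answer never see the extra questions, so the survey stays short for most people while still capturing detail where it matters.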
Final Thoughts
If you’re running a survey—whether you’re Netflix, a small business, a nonprofit, or a solopreneur—ask yourself:
Am I being specific?
Is this data actionable?
Am I making this question as easy as possible for people to answer?
Should everyone see this question?
Because the best survey isn’t just one that gets responses. It’s one that gets responses you can actually use.