Feedback Request – Listen360

This week, I received an automated email with the title “Business Name – Feedback Request” containing a sort-of Likert scale asking how likely I’d be to recommend them to a “friend or business associate.”

[Image: 0-10 rating scale shaped like a keypad]

My feedback on this survey design from Listen360.

The design is just weird

I’ve never seen a rating scale in a keypad orientation. It confused me, and my reaction was to ignore it because it doesn’t make sense. How do I know what the scale represents when it’s not presented as a hierarchy?

Based on the tiny gray words “Very likely” next to the 10, I assume clicking it will provide the highest rating, while “Not likely” next to the 0 is the lowest.

But what do the three rows of other numbers mean, really?

This scale literally goes to 11

Most Likert scales rate 1-5 or 1-7, with the odd number of choices allowing the respondent to pick a neutral option instead of forcing her to agree or disagree, like or dislike. This keypad design goes one further, to 11, with options from 0-10.

Zero is extraneous unless it’s clearly marked “I hate y’all, never coming back. All I will do is complain about you on social media.” Providing 10 levels of “maybe I’ll recommend you” was overwhelming and unnecessary to me. Really, if you condense this down so that 9-7, 6-4, and 3-1 are each single choices, you have a 1-5 scale.
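The condensing I’m describing amounts to a simple bucketing rule. Here’s a minimal sketch of that mapping (a hypothetical illustration of my suggestion, not anything Listen360 actually does):

```python
def condense_to_five(rating: int) -> int:
    """Map an 11-point (0-10) rating onto a 1-5 Likert-style scale,
    treating 10, 9-7, 6-4, 3-1, and 0 as single choices."""
    if not 0 <= rating <= 10:
        raise ValueError("rating must be between 0 and 10")
    if rating == 10:
        return 5
    if rating >= 7:
        return 4
    if rating >= 4:
        return 3
    if rating >= 1:
        return 2
    return 1  # the extraneous zero
```

Five buckets instead of eleven buttons: the respondent still gets a neutral middle option without having to split hairs between, say, a 5 and a 6.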

I guess this scale is standard for Listen360, as the example on their homepage is also 0-10, though at least displayed horizontally. I was not able to find anything on their methodology for using 11 options.

[Image: horizontally aligned Likert scale with ratings 0-10]

I’m helping you, make it easy

This company is requesting that I give them valuable response data. I expect that clicking a number would submit the survey response, done. Instead, it becomes a three-screen questionnaire requesting more and more. After selecting a rating, a web page opens asking for a long-form response.

You’re asked “What could we do to improve?” if you chose 0-7, or “What do you like about our services?” if you chose 8-10.
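That branching is just a threshold check. A minimal sketch of the behavior I observed (my reconstruction, not Listen360’s code):

```python
def follow_up_question(rating: int) -> str:
    """Return the long-form follow-up prompt shown after a rating,
    based on the 0-7 vs. 8-10 split observed in the survey."""
    if rating >= 8:
        return "What do you like about our services?"
    return "What could we do to improve?"
```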

[Image: rating screen asking “What do you like about our services?” with a long-form answer box]

Oh, a submit button! Am I done? No…

[Image: screen reading “Time for a quick favor?” with links to rate on social media]

Are you for real? Do you have “time for a quick favor?” Isn’t that what I did 5 minutes ago when I clicked a number in your email? This is like when companies sneakily check the “subscribe to our newsletter” box when you just want to buy something.

No, I don’t want to rate you… What, you parsed out my name?? I have to click “No Thanks”? Why am I thanking you?

I find this kind of feedback baiting disingenuous. It began by asking if I’d recommend them, not how my experience was, and ends with an overt “LIKE US” ploy.

My feedback is, don’t send me this crap. It decreases my opinion of your company and makes me much less likely to either return to your business or to recommend you to anyone.

PayPal Redesigns Its User Agreement

[Image: PayPal’s redesigned User Agreement announcement]

I received an email from PayPal letting me know it has redesigned its User Agreement and the two women reading it simultaneously on a tablet piqued my interest.

Within are the following suspicious claims, red flags to this skeptic (emphasis mine):

  1. We are making these updates to clarify our terms and make these agreements easier to read and navigate.
  2. We’ve worked to make this new User Agreement a more-user-friendly experience
  3. We’ve redesigned the User Agreement to simplify its format, with new color-coded headings so you can more easily find the information most relevant to your account.
  4. We’ve revised and reorganized the content of the User Agreement to be easier to follow and to include information where you’d intuitively look for it.

Back up your claims

I had a hard time just getting through the email; am I really to believe PayPal’s User Agreement is any better? It’s problematic when companies latch on to terms like “user-friendly” and “intuitively.” How do they know it’s better? Did they perform usability testing? What were the problems with the previous design?

I think organizations should be more transparent about the data they use to make claims about the usability, ease of use, and customer-friendliness of their products and services.

Does an updated User Agreement—which is 73 pages when saved as a PDF—improve my customer experience with PayPal? I’ve never even looked at it before. Not listed among the 16 “easier to follow” sections of the document is subscription payments, which is pretty much all I use PayPal for, and with which I’ve had many frustrations.

(BTW, I don’t see any color-coded headings, so I’m not even sure what that’s supposed to mean. I’m going to guess that PayPal’s users didn’t think they were so great.)

These terms and claims aren’t buzzwords; UX is a serious discipline that takes time, revision, and lots of data. Saying something is “easier” just because someone inside your organization decided that it is means nothing to your customers.