The French Liars: why don’t we just ask our clients what they want?

Do you want to research the needs and experiences of your clients? Ask them. You’ll rely on their willingness to answer your questions. And you’ll have to assume their answers are accurate and sincere. But are they?

Bad questions, dishonest answers

You don’t need much experience as a researcher to recognise that you shouldn’t ask just any question. ‘How long are you prepared to wait?’: the answer to this question is easy to predict. No one wants to wait. Are your questions contrary to the interests of your clients? If they are, you shouldn’t expect honest answers.

Nonetheless, we regularly see surveys that include questions such as: ‘Are you prepared to pay more for the same service?’ About 2% of respondents answer ‘Yes’, which mostly demonstrates that a portion of the respondents did not understand the question. Or that they didn’t even read it. And of course, this doesn’t mean that the other 98% wouldn’t accept it if we were to raise the price.

Questions based on fact … Answers based on anything but fact

Even for highly factual questions, where the respondent has no direct interest in giving one answer or another, the reliability of answers is disappointing. Orange, the third largest supplier of telecommunications in Europe, has experienced this first hand.

In a survey of the roaming habits of their French clients, they established that of all clients who claimed to use roaming services ‘very often’, more than 60% had not actually used them even once in the past year. The respondents were given the nickname ‘The French Liars’, a notion that serves in many research teams as a warning not to assume too quickly that respondents give honest answers. Not even to highly factual questions.

As such, only ask the questions whose answers you genuinely need. If you don’t gather answers and feedback, you may feel as though you’re making decisions blindly. And that may be the case. But you’re sometimes better off being blind and using a cane to find your way than thinking you can see, only to walk off a cliff.

Written by Horst Remes Customer Strategy Expert @ Onestone

Alarm bells are ringing. Are you paying attention?

Guest writer, Marleen Strubbe,
Freelance Interim Manager, Change Manager, Project Manager.

Marleen is our former colleague. Critical is her middle name and numbers get her heart racing.

Completely ignoring warning signals: how is that possible? On the 10th of April 2010, the Polish presidential airplane crashed. The president and his delegation were all killed on impact. It happened less than a kilometre from the airport.

National Geographic devoted an entire episode of Air Crash Investigation to the event. The investigators came across a very intriguing fact. During the descent, a cockpit alarm indicating a loss of height was howling away. Oddly enough, no one really responded to it. Even more disturbing: the crew’s only response was to manually turn off the alarm.

Allow me to explain why they did that. Flying an airplane too low when there is no airport nearby is enough to set the alarm blaring. The Polish presidential airplane was a passenger airplane, but it often landed at military airports. As military airports are not included in the database of passenger airports, the system interpreted the situation as if there were no airport at all. So any time the airplane landed at a military airport, the alarm would sound. The pilots landed at these airports often, so it’s understandable that the howling siren no longer shocked them. It was, of course, still an incredibly bothersome sound. And that explains why they would reset the altimeter alarm. Which is exactly what happened on this flight.

Ignoring alarms. It happens a lot. Especially in the corporate world. An alarm signal on a dashboard is almost always linked to a response such as, ‘Yes, but that’s just temporary, because …’ or ‘This doesn’t take … into account, so …’ What are the consequences?

It’s not just the value of the alarm signal that is nullified; the details of the other measurements or assessments also lose a great deal of their credibility.

We expect our measurements and assessments to reflect reality. But is this really the case? Whenever we assess or measure something, we run the risk of error. And then we need to decide: is our measurement accurate or not? To make this decision, we often set an arbitrary cut-off score. This score determines whether the measurement or assessment counts as positive or negative. Whether the dashboard display is red or green.

This cut-off score, in combination with our error, leaves us with four possible outcomes. And to explain the situation, we are using cancer screenings as an example:

  • Our measurement was correct and we have a positive score (true positive), e.g. we found signs of cancer and the patient does have cancer.
  • Our measurement was correct and we have a negative score (true negative), e.g. we found no sign of cancer and the patient does not have cancer.
  • Our measurement was wrong and we have a positive score (false positive), e.g. we found signs of cancer, but the patient doesn’t actually have cancer.
  • Our measurement was wrong and we have a negative score (false negative), e.g. we found no sign of cancer, but the patient does actually have cancer.
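The four outcomes above can be sketched as a tiny function. This is a purely illustrative example (the function name and inputs are hypothetical, not anything from the screening or aviation systems described here):

```python
# Illustrative sketch: mapping a test result plus the ground truth
# to one of the four possible outcomes. Hypothetical example.

def classify(test_positive: bool, actually_sick: bool) -> str:
    """Return which of the four outcomes a single measurement falls into."""
    if test_positive and actually_sick:
        return "true positive"    # alarm sounds, and something really is wrong
    if not test_positive and not actually_sick:
        return "true negative"    # no alarm, and nothing is wrong
    if test_positive and not actually_sick:
        return "false positive"   # alarm sounds, but nothing is wrong
    return "false negative"       # no alarm, but something is wrong

print(classify(test_positive=True, actually_sick=False))  # false positive
```

Only the last two cases are errors, and as the next paragraph explains, we get to choose which of the two we make more often.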

Where do we set our cut-off scores? We decide this ourselves. The setting of the cut-off score determines the ratio of false negative to false positive outcomes. In the example that we gave, we would clearly prefer a false positive result to a false negative one. It’s better to mistakenly believe that someone has cancer and to have them examined more carefully than to send them home when they are actually sick. To avoid missing real cases, we set a more sensitive cut-off score. But if we set this threshold far too sensitively? We end up with constant false positives, and our measurements become worthless.
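The trade-off can be made concrete with a small sketch. The scores, labels, and cut-off values below are made-up illustrative data, not figures from the article: moving the cut-off in one direction misses real cases, moving it in the other floods us with alarms.

```python
# Hypothetical sketch of the cut-off trade-off: counting false positives
# and false negatives over a set of measurements. All data is invented.

def count_errors(scores, labels, cutoff):
    """Count (false positives, false negatives) at a given cut-off.

    scores: measured values (higher = more suspicious)
    labels: ground truth (True = the condition really is present)
    """
    fp = sum(1 for s, y in zip(scores, labels) if s >= cutoff and not y)
    fn = sum(1 for s, y in zip(scores, labels) if s < cutoff and y)
    return fp, fn

scores = [0.1, 0.3, 0.4, 0.6, 0.7, 0.9]
labels = [False, False, True, False, True, True]

# A strict cut-off misses real cases (false negatives) ...
print(count_errors(scores, labels, cutoff=0.8))  # -> (0, 2)
# ... while a very lax cut-off produces constant alarms (false positives).
print(count_errors(scores, labels, cutoff=0.2))  # -> (2, 0)
```

No setting makes both error counts zero here; choosing the cut-off is choosing which error we would rather live with.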

In the example of the Polish airplane, the number of false positives was too high. The alarm lost its effect.

In these examples, human lives are at stake. Fortunately, that is not usually the case in the corporate world. But under no circumstances should the dashboard be allowed to turn into a pointless colour picture.