
How to Use Data Thoughtfully to Increase Your Sales, with John H. Johnson [Episode 459]

John H. Johnson is President and CEO at Edgeworth Economics, a keynote speaker, and co-author of Everydata: The Misinformation Hidden in the Little Data You Consume Everyday. In this episode, we consider what the constant drip of data means for salespeople, and how to most effectively optimize your selling strategy based on the metrics.

Key Takeaways

  • John is a PhD economist with particular expertise in econometrics. Edgeworth Economics is data-driven and works by processing and explaining very large data sets. One large sector they serve is corporate litigation. John gives some detail.
  • Much of John’s time is spent teaching these issues in courtrooms. His book is designed to bring this knowledge about real-world events to a larger audience, so people can make better decisions with data.
  • The starting point is recognition. By one IBM estimate, 90% of the world’s data was created in the last two years. People fear math. These two factors combine into the perfect storm for people to be misled and to misunderstand data.
  • John suggests you should ask intelligent questions. To understand statistics, think about what went into producing the number.
  • Even disciplined statisticians are prone to confirmation bias. Consider what questions you are trying to answer. Does the data give you complete enough information to answer them? What can it tell you?
  • Large volumes of data may tell you something meaningful about your business and sales drivers. The application of this data doesn’t replace the interpersonal skills that are needed to connect and engage with clients.
  • Making decisions based on inapplicable correlations will not lead to the results you were expecting. Make sure you understand whether the correlation reflects actual causation.
  • John comments on common sales stats, such as the Pareto distribution of sales to salespeople. Look behind the patterns. What could be causing them?
  • Forecasting is only as good as the inputs and our ability to use past performance to predict the future. Home in on the assumptions that underlie the forecasting model. Forecasting is always probabilistic (see the sketch after this list).
  • Aggregate statistics about sales may be true on average, but drawing conclusions about any specific product or industry from those generalities is not reliable.
  • John says managers should frame the question they want to answer and look for data that bears on that question. Be aware of where the data originates and of the assumptions underlying any analysis of it. Consider how it may, or may not, apply.
  • John emphasizes that data is a tool. It is a complement to decision-making. Use all the tools at your disposal. There is no substitute for thinking hard about these types of problems.
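
To make the forecasting point concrete, here is a minimal sketch in Python, using entirely hypothetical quarterly numbers and a simple linear trend. It is not a model from the episode; the point is only that a forecast built from past data should be reported as a range that reflects how noisy the inputs are, not as a single number.

```python
import statistics

# Hypothetical quarterly sales (units) -- made-up numbers for illustration only.
quarters = [1, 2, 3, 4, 5, 6, 7, 8]
sales    = [102, 110, 108, 121, 119, 130, 128, 141]

# Ordinary least-squares fit of sales on quarter index (a simple trend model).
n = len(quarters)
mean_q, mean_s = statistics.mean(quarters), statistics.mean(sales)
slope = sum((q - mean_q) * (s - mean_s) for q, s in zip(quarters, sales)) / \
        sum((q - mean_q) ** 2 for q in quarters)
intercept = mean_s - slope * mean_q

# The spread of the residuals tells us how uncertain the trend's predictions are.
residuals = [s - (intercept + slope * q) for q, s in zip(quarters, sales)]
resid_sd = statistics.stdev(residuals)

# Forecast the next quarter as a range, not a single point.
next_q = 9
point = intercept + slope * next_q
print(f"Point forecast for Q{next_q}: {point:.1f}")
print(f"Rough ~95% range: {point - 2 * resid_sd:.1f} to {point + 2 * resid_sd:.1f}")
```

The exact interval math matters less than the habit: the forecast inherits whatever assumptions sit inside the trend model, and the width of the range is driven entirely by the quality and variability of the inputs.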

Episode Transcript

Andy Paul  0:00  

Hi, this is Andy. Joining me on the show is John H. Johnson. John is President/CEO at Edgeworth Economics. So John, welcome.

 

John H. Johnson  3:12  

Great to be here. Thanks for having me. 

 

Andy Paul  3:15  

Take a minute to fill out that introduction.

 

John H. Johnson  3:20  

Okay. I thought it was nice. So I am a PhD economist with a particular expertise in a field called econometrics. My firm is a data-driven company, and a lot of our work involves processing and explaining very large data sets. We generally do that in a number of contexts, but one of them is litigation, often bet-the-company litigation where there are millions or even billions of dollars at stake. But the common theme of everything we do here is: how can you use analytics and statistical concepts, marry that with economics and what we know about how markets work, and really provide an intuitive and credible understanding of data?

 

Andy Paul  4:12  

So was that a similar motivation, then, for why you wrote your book, Everydata?

 

John H. Johnson  4:17  

Well, yes. I mean, you know, a lot of my time is spent in courtrooms and, you know, in that context, I’m constantly teaching about these types of data issues. And I’d wanted to sort of step back and try to bring that to a broader audience, and really relate it to the real world things that people see every day. Because when you do this kind of work, you know, you start to see in every single news story, every book you read, every time you hear something on the radio, you have a certain sort of data sense. And the goal of the book was to try to help people develop their intuition so they could make better decisions with data.

 

Andy Paul  4:58  

Yeah, so I mean, we could start with a blanket statement. Mark Twain, as I’m sure you’re aware, said there are lies, damned lies, and statistics, as he summarized it. So it seems like a big problem, and you talk about this in the book, is that people reading a newspaper article or any sort of story online, wherever they consume it, or even listening to a show like this, hear a data-related fact and accept it at face value.

 

John H. Johnson  5:34  

Right. I mean, the starting point, I think, is sort of recognition. There is more data today than we’ve had at any other time in humanity. One estimate from IBM says that 90% of the world’s data has been created in the last two years. That’s pretty remarkable. So we are confronted with data, and with so much more computing power. I always joke that the first computer I ever saw was an Atari 400 in the 1980s that my grandfather bought; we all ran to my grandfather’s house to play Pac-Man. My smartphone has hundreds of thousands of times the computing power of what that little computer could do at that point in time. So every day now we’re confronted with so much information. And then you couple that with the fact that people don’t historically have a lot of specialized training in statistics or math, and oftentimes people are afraid of it. That creates the perfect storm for people to be misled, abused, or to just misunderstand data. And people recognize, I think, generally, that numbers are important and that things are being quantified, but they don’t always know the next step, like: how do I think about a number and whether it’s actually applicable to me?

 

Andy Paul  6:59  

Right. I mean, as you talk about in the book, the mere fact of the existence of a statistical relationship between two factors doesn’t imply there’s actually any meaning there.

 

John H. Johnson  7:09  

I mean, the whole goal was to make it something that was readable, where people who maybe didn’t love math or statistics could just sort of key off the intuition. So we spend a lot of time on this key concept of correlation and causation, and what it means when you find some statistical relationship. Our lead example on that is about how to make kids smarter. We talked about the fact that if you want to really make your kids smarter, you could buy them an iPhone, let them stay up late, teach them how to juggle, have them listen to Radiohead. These were all things where we found news stories reporting statistical relationships that purported to show ways to make kids smarter. Now, I say that a little tongue in cheek, but you could pick up almost any newspaper over the course of a week and find some story proclaiming that some new thing will make you smarter. That may be a true statistical relationship, but it doesn’t mean the causality is there. And that’s one of the big flags we constantly talk about in the book: when you see things that talk about making you smarter, healthier, happier, wiser, think about what that relationship actually means. People want to jump to the causality; as humans, we’re kind of wired to try to connect the dots. But that doesn’t mean the pure existence of some statistical relationship is actually enough to draw that conclusion.
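
A tiny simulation makes this visible. The variables and numbers below are hypothetical: a child's age drives both a habit (owning a phone) and test scores, so the habit and the scores come out correlated even though the habit does nothing.

```python
import random

random.seed(0)

# Hypothetical data: age is the confounder behind both variables.
rows = []
for _ in range(1000):
    age = random.uniform(6, 16)
    owns_phone = 1 if (age + random.gauss(0, 2)) > 12 else 0   # older kids more likely to own one
    test_score = 40 + 3 * age + random.gauss(0, 5)             # scores rise with age, not with phones
    rows.append((age, owns_phone, test_score))

def mean(xs):
    return sum(xs) / len(xs)

# Raw correlation: phone owners look much "smarter".
with_phone = [s for _, p, s in rows if p == 1]
without    = [s for _, p, s in rows if p == 0]
print(f"Average score with a phone:    {mean(with_phone):.1f}")
print(f"Average score without a phone: {mean(without):.1f}")

# Compare within a narrow age band and the apparent 'phone effect' shrinks sharply.
band = [(p, s) for a, p, s in rows if 11 <= a <= 13]
print(f"Ages 11-13, with phone:    {mean([s for p, s in band if p == 1]):.1f}")
print(f"Ages 11-13, without phone: {mean([s for p, s in band if p == 0]):.1f}")
```

The raw comparison shows a large gap in favor of phone owners; within a narrow age band the gap mostly vanishes, because the original correlation was coming from the confounder, not from any causal effect.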

 

Andy Paul  8:44  

Well, I think that’s the part that’s logically hard for people to understand, and we see this in sales all the time. We’re entering the era of big data in sales. We’ve got technologies now that provide a lot more transparency into our processes and activities. So you see this all the time: if you use this subject line on an email, customers are 60% more likely to open it.

 

John H. Johnson  10:07  

Oh, absolutely. And I think, you know, I try to be optimistic with people about this sense of how you develop your statistical intuition. I think you have to ask intelligent questions. That doesn’t mean that every time someone uses a number, they’re misleading you. And that doesn’t mean that there is such a thing as, not to sound political, “alternative facts”, a phrase that’s been bounced around a lot recently.

 

Andy Paul  10:31  

And by the time this airs, they’ll probably be replaced by some new outrage.

 

John H. Johnson  10:35  

But I think that, politics aside, the point is, numbers are not about I have my number and you have your number. It’s about what actually went into the number. What is the number potentially telling us, and what is it not telling us? There are no easy answers, and you have to think hard. Data is empowering and can give you a lot of valuable insights, but it is very much still filtered through the human eye. And so, as a result, the same biases, the same views that people bring to any issue, they can bring to the data as well.

 

Andy Paul  11:11  

You know, we look at data and use it so that we aren’t really looking for an answer. We’re looking for data that fits the answer we already have.

 

John H. Johnson  11:35  

Yeah, exactly as it sounds, confirmation bias is literally: I am looking for information to confirm my pre-held beliefs or my pre-held answers. So when I go to the data with a strong sense that I should see a relationship, and I find it, that forces me, or in some way leads me, to look at the data in an incomplete or less-than-impartial way. That’s potentially how you get confirmation bias: people’s preconceived notions. And oftentimes you see this in sales and in industry. Those are complicated issues, and you can have all the data in the world, but if there isn’t some sense of whether it’s measurable, and also whether you’re open-minded about it, it’s not going to really help you get any valuable insights.
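
One concrete way this plays out with data is slicing a dataset until some cut supports the prior. Here is a minimal sketch with purely random, hypothetical deal data: there is no real driver of wins at all, yet scanning enough segments almost always turns up one that looks impressive.

```python
import random

random.seed(1)

# Hypothetical: 200 deals with a random 'win' outcome and no real drivers at all.
deals = [{"segment": random.choice(["SMB", "Mid", "Enterprise", "Gov"]),
          "region": random.choice(["NA", "EMEA", "APAC"]),
          "won": random.random() < 0.30}
         for _ in range(200)]

overall = sum(d["won"] for d in deals) / len(deals)
print(f"Overall win rate: {overall:.0%}")

# Scan every segment x region slice and report only the most flattering one.
best = None
for seg in ["SMB", "Mid", "Enterprise", "Gov"]:
    for reg in ["NA", "EMEA", "APAC"]:
        slice_ = [d for d in deals if d["segment"] == seg and d["region"] == reg]
        if len(slice_) < 5:
            continue
        rate = sum(d["won"] for d in slice_) / len(slice_)
        if best is None or rate > best[2]:
            best = (seg, reg, rate, len(slice_))

seg, reg, rate, n = best
print(f"Best-looking slice: {seg}/{reg} wins {rate:.0%} of {n} deals (pure noise)")
```

If you go looking for the slice that confirms the story, you will usually find one; the discipline is to decide the question, and the slice, before looking at the results.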

 

Andy Paul  15:43  

Yeah, well, and I think also people tend to use data to fit an answer they already have. I mean, I just read a book by a guest I interviewed on the show, and it was a well-done book. A lot of impressive research went into it. But clearly there were times when the data was fitting the preconceived answer.

 

John H. Johnson  16:10  

Right. And I think that real, objective science is about posing a question, hypothesis testing, and trying to figure out what it means. For a lot of us, whether it’s in business, in sales, or in other types of jobs where we’re confronted with data, there’s more than data in the real world. You’ve got personal relationships, you’ve got HR situations, you’ve got so many different things that have to be accounted for when you’re running a business. It can be hard, and in fact counterproductive, to be focused only on the data. I can’t imagine handling all the complexities of my job if the only thing I did was try to simplify my entire business down to numbers. That doesn’t mean there isn’t a lot of value gained from numbers, from looking at statistics, from looking at data on different elements of the business, but it would be foolish to think that a myopic approach based only on data could actually get me all the way there.

 

Andy Paul  17:11  

Well, that’s really what’s interesting as we start turning the conversation here to the sales world. One of the huge trends in sales is that some proponents hold the point of view that the art of selling, defined as the interpersonal interaction, is dead. It’s all about the science of selling, which is the data, right? I’m going to commit to so many activities, I’m going to look at the patterns in the data based on what I achieve with those activities, and that’s going to tell me exactly what I need to do to increase sales.

 

John H. Johnson  17:42  

Well, look, I think there’s micro- and macro-level analysis. I don’t question that you can look at large volumes of data and find different levers that might, at least on their surface, appear to really tell you something meaningful about your business or about what drives sales. I mean, there’s a whole field of demand modeling where I think we can do a good job with that. But again, I think that is more of an input into how one thinks about the structure of their sales force or their business, or where they allocate their resources. It doesn’t mean the fact that sales also involves certain interpersonal skills or other things doesn’t also matter. I don’t think they have to be mutually exclusive, but I think sometimes people treat it as all or nothing. You know, it’s sort of the classic baseball question, Moneyball versus what scouts see with their eyes, right? I think a lot of the sophisticated ball clubs have integrated serious data analysis, but I don’t think they’ve gutted their scouting departments so that the only thing they do is collect numbers and sit at the computer, right? So it’s the same kind of thing. Data can be empowering, can provide real, serious answers, and can provide insights. But let’s take our sales example. How do you actually capture in data what it means to have an interpersonal sales relationship? How do I know this salesperson is particularly good because of X, Y, Z? Well, those are hard, hard things to quantify.

 

Andy Paul  19:36  

To your point, that’s why we need to make that integration of the soft and the hard happen with the data, because it’s not all or nothing.

 

John H. Johnson  19:42  

The danger is saying, okay, I think that if we just do more of X, you know, if we have more online marketing campaigns, we see a spike in sales of 60%, and that’s enough to go on. Then you find out after the fact that the reason the online campaign was successful is that the salespeople were following up individually with their clients, and you’ve missed a key component of the interaction. I’m not suggesting, though, that the data couldn’t give you very valuable insights as to whether this could work; I’m just saying it should be put in context. That’s really trying to get at the causal relationship versus just accepting a blind correlation. Look, I think that’s the key point. I mean, sometimes a rough hammer works, right? Sometimes you might have a particular business where it’s enough to say, look, I look at the data, I’ve made the determination that this tells me my sales increase 70%, and that’s good enough for me; I don’t want to spend the time. I think that could be foolish. But I don’t know everybody’s business, right? Different businesses can operate in different ways.
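
John's online-campaign example can be sketched with hypothetical numbers: a naive comparison credits the campaign with the entire lift, while splitting out the accounts that also got a personal follow-up shows where most of the lift actually came from.

```python
# Hypothetical account-level results -- illustrative numbers, not real data.
# Each tuple: (received_campaign, got_personal_follow_up, bought)
accounts = (
    [(True,  True,  True)]  * 90  +   # campaign + follow-up: high conversion
    [(True,  True,  False)] * 60  +
    [(True,  False, True)]  * 30  +   # campaign only: modest conversion
    [(True,  False, False)] * 120 +
    [(False, False, True)]  * 40  +   # no campaign: baseline
    [(False, False, False)] * 160
)

def rate(rows):
    """Share of accounts in `rows` that bought."""
    return sum(bought for *_, bought in rows) / len(rows)

campaign    = [a for a in accounts if a[0]]
no_campaign = [a for a in accounts if not a[0]]
print(f"Naive view -- campaign: {rate(campaign):.0%} vs no campaign: {rate(no_campaign):.0%}")

# Split the campaign group by whether a rep personally followed up.
followed  = [a for a in campaign if a[1]]
no_follow = [a for a in campaign if not a[1]]
print(f"Campaign + follow-up:   {rate(followed):.0%}")
print(f"Campaign, no follow-up: {rate(no_follow):.0%}")
```

In this made-up table, the campaign without the follow-up converts no better than no campaign at all; the apparent campaign effect was really a follow-up effect, which is exactly the kind of missed component John describes.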

 

Andy Paul  22:40  

Yeah, well, and what you see is that the pattern tends to be self-reinforcing. And so in sales we’re always talking about, geez, how do we make the middle 60% better?

 

John H. Johnson  22:58  

Well, it’s also weird because you try to force people into a box, as opposed to playing off their strengths, trying to make them better at things that maybe they’re never going to be good at. And so how do you approach that? When I think about the type of problem you’ve laid out in this example, the observation of what the distribution is, there’s probably meaningful information there. But again, to me, the meaningful information isn’t just the observation of who’s in which buckets; it’s about how they got into those buckets, and what is under the surface that’s potentially driving that.

 

Andy Paul  23:39  

So another one that seems very interesting to me is forecasting. It’s one of these classic things that we’re never good at in sales, right? And there are all these technology-based systems that enable you to create forecasts more automatically based on inputs into the systems, and then you have managers taking that, downloading it into Excel spreadsheets, manipulating it by hand, and so on.

 

John H. Johnson  28:04  

Well, right. I think that’s right. I mean, to me, I immediately think about the wide variation in the way that different people go to marketplaces. Yes, the advent of the internet, Amazon, different sales channels; of course that can change the way different buyers behave. But I think it’s a gross oversimplification to reduce the diversity of customers and consumers in any marketplace to a point where a number like that could actually be true. Are there pockets of people for whom it could be true? Absolutely. Does recognition of customer differences in the way they come to a marketplace or look for a product matter for salespeople? Of course it does. But as a blanket statement, what is the implication of “two-thirds of the way through”? Does that mean you should throw up your hands because nothing you can do matters except the last third of the experience? It doesn’t really seem like it has a good practical implication. I think you have to go a lot deeper to make sense of that.

 

Andy Paul  29:11  

Yeah. Well, two other ones we’ll talk about real quickly here before the end. One classic one is that 50% of all qualified opportunities in your pipeline end in no decision, right? The customer makes a decision not to buy anything. And another one is that certain research says that 50% of all sales reps don’t make quota. Now, to me, that last one in particular is very hard to make sense of.

 

John H. Johnson  29:40  

Yeah, so when I hear numbers like that, there are a couple of things I think about that I thought were kind of good guidance. What you really care about, if you’re a salesperson or you’re running a sales business, is what are the practical implications of those two numbers you just gave? In some respects, like with all statistics, and we talk about this a lot in our book, when someone uses a number like that, there’s an implication that the statistic has a direct implication for what you should do, how you should behave, how you should act. And like all aggregate statistics, it could well be that on average both of those things are true. I don’t know for sure; I haven’t looked into those numbers specifically. But doesn’t the average mask incredible variation? What really matters to you is not some highly generalized number, but the specifics that apply to your business, your sales base, your customers, so that you know whether it really matters or not. That’s what’s going to matter, and that’s where the numbers actually become actionable and useful. So I’m always very leery of any sort of broad statistic, not because we can’t learn something, but because it’s how you translate it to your own situation that really matters.
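
John's caution about broad statistics can be made concrete with a small, hypothetical breakdown: a "50% of reps made quota" headline can be arithmetically true while describing almost no individual segment.

```python
# Hypothetical quota-attainment rates by segment -- illustrative numbers only.
segments = {
    "Enterprise new business": {"reps": 20, "made_quota": 5},    # 25%
    "Mid-market expansion":    {"reps": 30, "made_quota": 24},   # 80%
    "Inbound SMB":             {"reps": 50, "made_quota": 21},   # 42%
}

total_reps = sum(s["reps"] for s in segments.values())
total_made = sum(s["made_quota"] for s in segments.values())
print(f"Aggregate: {total_made / total_reps:.0%} of reps made quota")

# The same data, broken out by segment, tells three very different stories.
for name, s in segments.items():
    print(f"  {name}: {s['made_quota'] / s['reps']:.0%}")
```

The aggregate is correct, yet it tells an enterprise manager facing 25% attainment and a mid-market manager at 80% almost nothing useful; the actionable number is the one for your own segment.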

 

Andy Paul  31:05  

So you’ve got this new generation of technologies over the last three to five years that are generating incredible amounts of data about sales activities, outcomes, and so on, and creating a lot of correlations, let’s say. And this is just going to become more and more prevalent. So how should managers treat that data? Because, again, the tendency we’ve had is to default to our confirmation bias and use it to confirm something we already believe. What would you say is a working, day-to-day rule of thumb I could use to say, how do I manage this and how do I put it to good use?

 

John H. Johnson  31:52  

Alright, so there are a few things I would say. First, make sure you’re framing questions appropriately. If you want to use data, you should frame it in terms of: what is the question I’m trying to answer? Second, know what the data is and where it comes from, and what assumptions go into any analysis you see. And then third, think very hard about the potential implications of what you’re looking at, and how it actually relates back to that first question: what am I trying to answer? The more rigorous you can be, the more you can look at numbers but still ask the important questions. It really is like any other science; you’re trying to dig deeper to understand what the meaning is for you or your business, and that usually means the superficial answers aren’t good enough. You can avoid a lot of confirmation bias by looking at numbers skeptically, in a way that says, okay, I think this supports my position, but how could it not be right? Ask yourself that sort of question. Again, a lot of what I try to get people to think about when I speak to them, when I talk about my book, is that you don’t have to be a data expert. You don’t have to go get a PhD from MIT to be a good consumer of data. What you do have to do is ask intelligent questions and bring enough structure to the process, when you look at and think about data, that you’re actually trying to have some objectivity in the thinking process. I think data is at its best when it provides discipline to thinking and thought processes. And I think that’s really the hard part we confront in sales, given the way it’s evolving these days. We’re at a bit of an inflection point where some parts of the business are really focused on quantity of activities over quality of activities, right? The thought seems to have disappeared, because we’re just going to execute this process. But I think a lot of businesses face that, not just sales. You’re absolutely right that you’ve got this new tool. Does that mean, because I bought a new hammer, I should hammer every single thing in my office and just keep using the hammer over and over, blindly ignoring anything else? No. It’s the same thing with data. Data is a complement to our decision-making, a complement to our businesses, but it doesn’t mean you have to rely solely on it. And I think there are some inherent dangers in that.

 

Andy Paul  35:20  

Well, good. John, thanks again for being on the show. And friends, thank you for joining us today. Until next time, this is Andy Paul. Good selling, everyone.