Big Data and Leadership and Management

There is an article in the December 2013 issue of the Atlantic titled, “They’re Watching You at Work.” It’s not so much about privacy or spying as it is about big data, and how big data can be applied, not to sales and marketing, but to the work of leadership and management. More accurately, the author investigated “people analytics,” or “the application of predictive analytics to people’s careers,” and he admits (as the title suggests) that the whole thing is a bit…creepy.

The challenge here, of course, is that we need to hire and promote the right people. Anyone who’s been in management has experienced this challenge, and I doubt more than a handful would say they mastered it. It’s hard. We struggle. We’re frustrated when that person we hired doesn’t work out, or when that other person we moved into management doesn’t flourish in the role. Particularly for large organizations, there’s a lot on the line here. It’s very expensive to hire someone or replace a manager, so it’s imperative that we do it right.

So if it’s very important, and few of us are particularly good at it, then why not give big data a shot? Well, some companies are. Shell Oil has an innovation/R&D department that solicits ideas from both inside and outside the company, looking for disruptive ideas they can apply to the business. The department gets a lot of ideas, and the small staff must figure out which ones are going to make it through to actual development. Not an easy task, right? So here’s the twist. There’s a company that makes computer games that are actually sophisticated assessment instruments. They can measure a lot of behaviors and decisions during the game play, and this one computer game turned out to be super accurate in predicting which people’s ideas would actually be successfully developed in the context of Shell’s innovation work. The unit wants to use the software to narrow down the number of ideas they evaluate, looking only at the ones that come from the people that the computer game says are most likely to produce successful ideas. This would keep the staff from wasting time on the “hopeless folks,” as one manager put it. And understand that those “hopeless folks” all have advanced degrees and look great on paper…they probably interview well too.

Does anyone else cringe a little at this idea? You’ve got 100 really smart people giving you ideas, and you’ll just throw 80 of those ideas in the bin because the computer game told you to? That can’t work, can it? What about intuition? These are complex, game-changing business ideas–not something I think could be automated. Don’t we need humans actually making those decisions? Won’t we miss lots of really great ideas doing it this way?

Maybe, but maybe not. I can’t argue one way or another for Shell’s specific situation, but I can say this: our gut-level aversion to the idea of using data and computers to make subtle and complex leadership and management decisions is based on a grand delusion that our current approach is successful. If you’ve seen Moneyball, then you know what I’m talking about, yet we’re still fairly slow to acknowledge the truth when it comes to management, and particularly hiring and promoting, which is where big data is headed first.

We need to face the truth: we just don’t know what we’re doing.

Sorry, I know there are people out there who have proven methods for assessing leadership and hiring the right people, and some may be better than others, but if we’re all so good at this, then tell me how this is possible: tall people get paid more. Seriously. I probably shouldn’t reveal this secret, as it only stands to benefit me personally, but it’s just one example of bias in our human decision making. When orchestras started holding blind auditions, the number of women who earned spots went up fivefold. Hmmm. How were those orchestra directors missing those talented women before? No one’s going to argue that tall men deserve more promotions and pay than short men, or that men are inherently better than women at playing the French horn, but this is precisely what happens under our current approach to decision making: an approach where we don’t acknowledge the biases we have, or we assume that, as smart people, we can overcome them (even though the data clearly indicate we can’t).

Step one is acknowledging that we don’t know, that our current systems are NOT perfect, and then I think we should give these new data-based approaches a shot. They may not work either, mind you. We mustn’t succumb to the hype of big data. Remember, the computer game’s predictive algorithm in Shell’s case was tied back to what the humans said were the characteristics of people who were good at innovation, and the game only correctly predicted which of the idea generators actually were successful in the Shell system. Since humans are designing the data parameters, we may end up getting results that simply correlate with what we already believe, or verify what works in our already-messed-up system, rather than letting the data show us objective pathways to better results. It’s important we stay disciplined and keep the whole system in mind as we explore data-based approaches to people decisions.
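That circularity is easy to see in miniature. Here’s a toy sketch in Python (the data, features, and rules are entirely invented for illustration) of how a model “validated” against past human judgments can only learn to reproduce those judgments, biases included:

```python
# Toy sketch with invented data: the past hiring decisions were made by
# biased humans, so the "best" rule the data supports is the bias itself.

# Each record: (tall?, skilled?, hired?) -- note the humans hired the
# tall candidates regardless of skill.
past = [
    (True,  False, True),
    (True,  True,  True),
    (False, True,  False),
    (False, False, False),
]

def fit(data):
    """Pick the single-feature rule that best matches the historical labels."""
    rules = {
        "hire if tall":    lambda tall, skilled: tall,
        "hire if skilled": lambda tall, skilled: skilled,
    }
    def accuracy(rule):
        return sum(rule(tall, skilled) == hired for tall, skilled, hired in data)
    return max(rules, key=lambda name: accuracy(rules[name]))

print(fit(past))  # prints "hire if tall" -- the model dutifully learns the bias
```

Measured against its own training labels, the height rule looks perfectly accurate, which is exactly the trap: the data can verify what works in the already-messed-up system, not what ought to work.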

For example, I remember hearing about one manager who used “big data” to determine that a single personality characteristic was the greatest predictor of turnover among his call-center staff: curiosity. The curious people didn’t stay in the job as long. His conclusion? Don’t hire curious people, and your turnover numbers go down. Hooray! You’re now stuck with an army of people lacking a key leadership skill who will never leave! Maybe that’s fine, but maybe a lack of curiosity has an impact on performance that he hasn’t seen yet. Maybe the dip in customer service scores won’t come right away, and maybe the customers won’t be able to articulate that their frustration was with the call-center employees’ lack of curiosity, but it could be there. And maybe it’s only the non-curious people who put up with YOUR call center; if you ran it differently, curiosity wouldn’t end up pushing people to leave. I am skeptical, for instance, that the call-center people at Zappos lack curiosity. If we don’t push hard on these big data solutions, we’ll end up as misguided as we are with our current, biased approaches.
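The turnover example illustrates a second trap: optimizing the one thing you measured while silently trading away something you didn’t. A tiny sketch with invented numbers makes the point:

```python
# Toy illustration (all numbers invented) of the call-center anecdote:
# curious hires leave sooner, so a turnover-only model screens them out,
# but an unmeasured quality metric quietly goes with them.

# Each hire: (curiosity score 1-10, months before quitting, service quality 1-10)
hires = [
    (9, 6, 9), (8, 8, 8), (7, 9, 9), (8, 7, 8),     # curious: leave fast, serve well
    (2, 24, 5), (3, 30, 6), (1, 28, 4), (2, 26, 5), # incurious: stay, serve worse
]

def avg(xs):
    return sum(xs) / len(xs)

curious = [h for h in hires if h[0] >= 5]
incurious = [h for h in hires if h[0] < 5]

print("avg tenure, curious:   %.1f months" % avg([h[1] for h in curious]))
print("avg tenure, incurious: %.1f months" % avg([h[1] for h in incurious]))
print("avg service, curious:   %.1f" % avg([h[2] for h in curious]))
print("avg service, incurious: %.1f" % avg([h[2] for h in incurious]))
# The "don't hire curious people" rule wins on the measured metric
# (turnover) and loses on the metric nobody put in the model.
```

In this made-up data the rule genuinely does cut turnover, which is why it looks like a success; the damage only shows up in the column the manager never collected.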

All that being said, I applaud these new, predictive analytics approaches to the way we lead and manage. Management is so desperately in need of innovation that we MUST support this kind of experimenting. I’m okay with a healthy dose of skepticism, and let’s be disciplined about it, but by all means let’s try some things our guts might find objectionable at first. Let’s do things that we think don’t make sense. Remember, most of our management practices were invented in the early 1900s. This was one of the most startling revelations I encountered when writing Humanize a few years ago. The fact that we’re only just now innovating our management practices is scarier to me than the idea of using computers to tell me whom I should hire.


  1. 02.01.2014 at 10:12 am

    Awesome post, Jamie. Although I’ll respectfully disagree. I don’t think it is good that companies are trying this. Or at least, I don’t think it is a step in the right direction, and it will likely cause more harm than good. I agree that we don’t know what we are doing, but I don’t think the solution is to double down on our bet on the “mechanical, linear, conventional, positivist, quantification, measurement-obsessed” approach. I think we need to realize that the solution requires a new type of thinking (other than the one that got us into this mess). A type of thinking that not only recognizes that we don’t know, but that we CAN’T know. Here I’m thinking about the problem of induction (or the Black Swan) and how our models are always going to be hilariously inaccurate. We keep making predictions about hiring and performance (and almost everything in politics, finance, etc.) and we keep being wrong. I don’t think the solution is to keep coming up with new ways of predicting, but to take a sober look at our deep (and faulty) epistemological assumptions and move forward from there. We can make the situation better, but not by furthering the illusion that we can predict this stuff. This in turn actually makes us more vulnerable. In our lust for efficiency we seek out and destroy the types of redundancy and chaos that all organic systems need to survive and grow. The Moneyball approach might help with some small issues, but baseball still finds itself in the same situation it did before. When we recognize that we can’t predict, we actually liberate ourselves. We have more energy and focus on the things that we can control. I’m writing a book now on this type of model, so if you’d like to talk more about it let me know. This conversation is only heating up and I think we will all be thinking a lot about it.

    • 03.01.2014 at 7:30 am

      Fabulous comment, Chris! I suspect we agree more than disagree. I am with you on moving away from predicting, and I certainly was aware of the irony of suggesting we explore this mechanical approach when I wrote a book titled Humanize! I definitely don’t think big data is “the answer.” But I do think it can help us learn things. I don’t think baseball is really in the same situation, entirely. The Moneyball stats don’t guarantee anything (the A’s haven’t won the World Series using that approach), but they did win a bunch of games without spending as much money, so I think there has been some learning there. I’d love to set up a time to talk to you more about your book (and the next one I’m thinking about).

      • 03.01.2014 at 7:45 am

        Exactly. The Humanize approach is a great example of a more elegant map of the territory. I’m currently using Mobile as a map in a similar way, and while we all must use maps (because we are always approximating reality and never getting to it), the industry’s general failure to acknowledge this fundamental “approximation-ness” is going to get us into trouble. It’s the whole rearranging the deck chairs on the Titanic thing. We are obsessing over the small things precisely because we can measure them. We are better at creating problems that match our measurement tools than inventing real solutions that match our existing problems.