Curious, Spurious, and Cyclical: Or why data-driven decision making is garbage...
May 10, 2017
Data-driven decision making is garbage. Pure bunk. One foot in the grave and the other on a banana peel. “Wait,” you say, “don’t you live and breathe science, and data, and problems and answers?” Yep, sure do. Let us elaborate.
Oh, and just to confirm, there are no bears in this post.
It is no secret that companies are attempting to leverage data to make better-informed decisions. Proponents of data-informed strategy point to the sheer volume and diversity of data accumulated every day from innumerable sources. Managers and executives, it is claimed, can leverage this influx of information to better analyze their firm’s internal and external environments and design practices accordingly. Critics of data-driven analytics, admittedly a minority, note that access to diverse data is not in and of itself a new phenomenon. They would add that data-driven decision making can be crazy harmful. The 1960s, for example, saw a landslide of survey and demographic information coming down the mountain. It was believed, no joke, that a literal social utopia could be formed by just combing through the data. An entire center for survey analysis was established at the University of Michigan with this very purpose: to create a purely analytically driven society where rationality, and actuality, would rule the day.
Alas, as one may observe, this utopia did not emerge. The researchers at U of M found themselves up data creek with half a paddle. Projects were begun and abandoned. Spurious correlations between x and y variables drove actual political and social policy initiatives. In short, the researchers focused too heavily on the data and not enough on the process. They began their work with assumptions about what the problem or question was, and those untested assumptions drove them down the (wrong) proverbial rabbit hole.
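To get a feel for how that happens, here is a minimal sketch in Python (a toy illustration of our own, not the Michigan data): generate pairs of variables that are pure random noise, then “test” every pair for a correlation. The sample size, the number of pairs, and the p < 0.05 cutoff are arbitrary choices for the demo.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_obs, n_pairs = 50, 1000          # 50 observations per variable, 1,000 unrelated pairs

    false_hits = 0
    for _ in range(n_pairs):
        x = rng.normal(size=n_obs)     # two variables with no real relationship at all
        y = rng.normal(size=n_obs)
        r, p = stats.pearsonr(x, y)    # correlation and its p-value
        if p < 0.05:                   # the usual "statistically significant" cutoff
            false_hits += 1

    print(f"{false_hits} of {n_pairs} unrelated pairs look 'significant' at p < 0.05")

Run it and roughly fifty of the thousand pairs will clear the usual significance bar by chance alone. Trawl a big enough survey archive for relationships and “findings” like these are guaranteed, which is exactly how spurious correlations end up steering policy.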
At interstitio we pride ourselves on the process of data-driven decision making. The process is a creative, even artistic, curiosity-driven exercise grounded in rigorous science. The scientific method requires that initial conditions, also known as assumptions, be tested before one can even begin to think about the manifest problems or questions. This is the start of the process. Too often, organizations gather data based on chains of implicit assumptions about what the problem is. From this, they get answers to problems they themselves engineered. It’s curious, and spurious, and cyclical.
Think about a problem facing your organization, or perhaps even a supposedly troubled employee. Now, force yourself to answer this question: do you know, with the certainty of having just met the partner of your dreams, put-your-life-on-the-line certainty, that your assumption about the problem is as close to accurate as it can be? Now, answer one more question: if your prior answer was “no,” how do you go about testing your assumptions?
A good deal of the work we do begins with those two questions. It’s amazing how often we, as people, cruise by on schemas and assumptions. It’s even more amazing that teams of highly trained professionals guiding massive firms do the same.
Don’t be those folks. Question your assumptions, and then question those questions. Find ways to gather data on your assumptions before gathering more data. Do better than U of M did as you work toward your own organizational utopia. We’d love to help, so don’t hesitate to drop us a line.