Seventy-nine percent of salespeople routinely input the same data into multiple systems, according to Bluewolf’s fifth annual State of Salesforce report. In fact, data was the number-one challenge cited by the 1,700 companies surveyed.
Siloed business systems and incompatible data are the major culprits. They leave companies facing more than lost productivity: complex and costly integration, little real-time visibility, increased customer frustration and, the largest issue of all, roadblocks to innovation.
As companies embrace more and more automation, ranging from augmented reality and intelligent applications to artificial intelligence, data accuracy, compatibility and completeness become major stumbling blocks.
The old adage of “garbage in, garbage out” should be redefined as “garbage in, garbage out with higher levels of risk.”
I saw this first-hand with a client who wanted to run a cohort-based customer lifetime value (LTV) analysis that required data from multiple systems. What should have been a day’s worth of effort by sales, marketing operations and customer success ended up pulling in the entire IT team, who tried to work out why the data was so incompatible and so incomplete. The analysis, critical to driving sales strategies for the rest of the year, remained unresolved after two weeks; the sales and marketing teams fell back on a quick-and-dirty workaround. What data they were able to use was in different formats, some of it was outdated, and all of it was largely untrusted.
Data has been at the crux of corporate productivity, transformation and effectiveness challenges for decades. Yet it is largely shuffled aside because solving the root data problem is hard and requires significant human effort; there is no quick technology fix.
Christine Crandell: How does data impact the quality of customer experiences?
Greg Layok: The more you know about your customers, the better you’re able to serve them. Bad or incomplete data can cause businesses to serve their clients irrelevant offers. This is akin to being tone deaf to what your customer base really wants. Younger generations expect companies to know them, and know them well, and as these generations become leaders they will bring those expectations to B2B interactions. The technologies of tomorrow, such as machine learning and artificial intelligence, require a foundation of high-quality, complete data to work.
Crandell: Increasingly, users accept the results of analytics as the ‘right answer.’ Why aren’t they questioning the results?
Layok: This is a skills issue. Every MBA program needs to include a class on how to understand the information that’s put in front of us. Because the data isn’t lying; we just have to understand how to read it. If you realize you’re making the wrong decision, go back to your analysis. What were you missing? Data can help you make decisions, but you still need to understand which levers and KPIs affect your business. Data should also be tested. Don’t blindly take a big jump off a cliff because you are confident in your “one source of truth.”
Crandell: What do you tell companies who want a “one source of truth”?
Layok: We warn clients not to confuse “system of record” with “single source of truth.” The idea of a single source of truth is that you can analyze a set of data in a way that gives the full context behind that data. Where the system of record creates a record, such as a new customer, the source of truth is what can tell you more about that customer, such as where they are in the sales cycle. Making sure everyone calculates metrics the same way is essential to a single source of truth. In our business (consulting), if my practice includes PTO when calculating utilization but another practice does not, we would not have a single source of truth. This is why analytics departments, with representation in the C-suite, are cropping up separate from finance, IT, and operations.
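The utilization example above can be made concrete with a short sketch. This is a hypothetical illustration, not Layok’s actual formula: the figures and function names are assumptions, chosen only to show how two practices computing “utilization” differently get incomparable numbers from the same inputs.

```python
# Hypothetical illustration: two practices define "utilization"
# differently, so the same consultant's month yields two numbers.

def utilization_excl_pto(billable_hours, total_hours, pto_hours):
    # Practice A removes PTO from available capacity.
    return billable_hours / (total_hours - pto_hours)

def utilization_incl_pto(billable_hours, total_hours):
    # Practice B counts PTO as available capacity.
    return billable_hours / total_hours

# Assumed figures for one consultant in one month:
billable, total, pto = 120, 160, 16

a = utilization_excl_pto(billable, total, pto)  # 120 / 144
b = utilization_incl_pto(billable, total)       # 120 / 160
```

Both results are “correct” under their own definition, which is exactly why a shared, governed metric definition has to come before any cross-practice comparison.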
Crandell: Considering that the same data set could be used to produce a desired outcome, how can businesses make sure that the ‘truth’ is reported?
Layok: One of the biggest flaws in organizational use of data is confusing correlation with causation. As more companies embark on “big data” journeys, employees who are not necessarily trained in statistics or data science are being asked to analyze data. And when untrained people spot correlating factors, they often confuse the correlating variables with cause and effect. Compounding this issue is that access to dashboards and models with the intent of driving data-based decisions is widely granted. But easy access to data does not mean those with access have the proper background to be reading the data correctly. Organizations must employ workers who are trained in statistics, actuarial science, or data science – or provide the proper education to those who are not – to make sure the truth is reported.
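The correlation-versus-causation trap Layok describes can be demonstrated with a toy simulation. This is an assumed example, not from the interview: a hidden confounder (overall market demand) drives both “ad spend” and “revenue,” so the two correlate strongly even though neither causes the other in this model.

```python
import random

random.seed(42)

# Hidden confounder: overall market demand (assumed synthetic data).
market_demand = [random.gauss(100, 15) for _ in range(1000)]

# Both variables are driven by demand, plus independent noise;
# neither one causes the other in this model.
ad_spend = [d * 0.5 + random.gauss(0, 2) for d in market_demand]
revenue = [d * 3.0 + random.gauss(0, 10) for d in market_demand]

def pearson(xs, ys):
    # Pearson correlation coefficient, computed from scratch.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(ad_spend, revenue)  # strong correlation, zero causal link
```

An untrained analyst looking at a dashboard plotting these two series would likely conclude that raising ad spend raises revenue; the trained one asks what else could be moving both.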
Crandell: What are three best practices that companies should follow to get the most out of their data?
1. Define ownership of analytics. Establish your governance processes to get the single version of truth.
2. Deliver broad executive education to help leaders understand how to interpret the data they’re receiving.
3. Foster a data-driven culture. If only a few people care about the data, it won’t go anywhere. Data must be a priority for all employees.
Crandell: What can companies do to not lose trust in their data?
Layok: In this age where everyone wants to be in data science, the analyzers must truly understand the underlying models. When data is analyzed with the right scientific rigor, it will produce the right response. Data itself did not lead us down the wrong path when most models predicted Hillary Clinton would win the presidency in 2016; the data was fundamentally flawed. It did not have the right underlying assumptions, it was incomplete, and there were significant data-collection issues, leaving predictions open to wide margins of error. It was a case of bad data and unchecked models.