Big data has made a big impact on the business landscape, but not as big an impact as it could. Despite the astonishing amount of data being generated and gathered for analysis, fewer than 60 percent of US companies routinely generate value or revenue from it.
What accounts for this? For data to be acted upon, it isn't enough that it be available to the people making the decisions. The term "big data" was coined to describe data sets so large that existing analysis and storage methods could not handle them effectively. Faced with that sort of overload, decision makers can't be expected to find actionable information in the noise.
User-Unfriendliness
The available analytics tools often fall short as well, because mastering them is outside the purview of most decision makers. If big data is simply too large for marketers, project managers, C-level executives, and others to handle directly, then the tools meant to tame it are frequently too user-unfriendly. They may require nontechnical employees to learn new vernacular, unfamiliar interfaces, or even elements of database management, formal logic, or query design just to get the information they need. This accidental barrier can reduce the amount of usable data that reaches decision makers, or end their engagement with data entirely.
Natural Language Processing
Fortunately, there are nontechnical entry points for users who need to interpret and investigate massive data sets. Consider the case of Google.
In 2014, big data and prediction researcher Mikael Huss estimated that Google processed 100 petabytes (100 million gigabytes) of data per day, over three times as much as the NSA, and was storing some 15,000 petabytes of information. But the average Google user isn't writing queries to mine that data: they're more likely to type in "how do I maximize ROI" or "cats jumping in boxes". These searches produce usable results because Google has devoted extensive time and effort to natural language processing research.
Natural language processing allows Google to abstract away its extremely complicated search algorithms and hide them from the end user. To be sure, this abstraction protects proprietary technology, but it also lowers the barrier to user engagement. And in the majority of cases, the end user isn't interested in the process; they're interested in an actionable result.
In short, Google has designed its computer systems to understand the language of humans, rather than training humans to speak the language of computers.
If big data is to see mass adoption, or to reach its full potential in the businesses that have already adopted it, those businesses should take a page from Google's book and make querying their data sets as painless as possible.
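What might "querying as painless as possible" look like in practice? Below is a deliberately tiny, hypothetical sketch of the idea in Python: a couple of English question shapes are mapped onto SQL so the person asking never has to write a query. The table and column names (orders, revenue, region) are invented for illustration, and a real natural-language interface would rely on trained language models rather than hand-written patterns, but the end-user experience is the point.

```python
import re

# Toy illustration of natural-language querying: a plain-English question
# is translated into a structured query so the user never writes SQL.
# The schema (an "orders" table with "revenue", "region", etc.) is
# hypothetical; a production system would use a trained language model
# rather than these hand-written patterns.

PATTERNS = [
    # "total revenue by region" -> an aggregate grouped by a dimension
    (re.compile(r"total (\w+) by (\w+)", re.I),
     lambda m: f"SELECT {m.group(2)}, SUM({m.group(1)}) "
               f"FROM orders GROUP BY {m.group(2)};"),
    # "top 5 products by revenue" -> a ranked aggregate with a limit
    (re.compile(r"top (\d+) (\w+) by (\w+)", re.I),
     lambda m: f"SELECT {m.group(2)}, SUM({m.group(3)}) AS total FROM orders "
               f"GROUP BY {m.group(2)} ORDER BY total DESC LIMIT {m.group(1)};"),
]

def question_to_sql(question: str) -> str:
    """Translate a narrow class of English questions into SQL."""
    for pattern, build_query in PATTERNS:
        match = pattern.search(question)
        if match:
            return build_query(match)
    raise ValueError(f"Sorry, I don't understand: {question!r}")

if __name__ == "__main__":
    print(question_to_sql("What is the total revenue by region?"))
    print(question_to_sql("Show me the top 5 products by revenue"))
```

The regular expressions aren't the point; the point is which side of the interface the complexity lives on. The system absorbs the query language, the schema, and the logic, and the decision maker is left with nothing to learn but how to ask a question.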