In the age of big data, the amount of information flowing round and through our systems at any one time is truly remarkable, and the intelligent machines being created to deal with it are becoming even more so.
IBM, for instance, recently unveiled a supercomputer called Watson, which can ingest data at a rate of 67 million pages a second. It obtains this ever-increasing flow of data from the millions of connected devices in the Internet of Things.
The purpose of this latest IBM innovation is to establish a question-and-answer dialogue between machine and human, whereby Watson will process vast amounts of data and answer in plain speech, in real time.
Perhaps we should not be surprised. It is the direction of travel for analytics and it has been made possible by the seemingly infinite capacity of the cloud with its interconnected stacks of thousands upon thousands of servers.
Google, as always, has been a forerunner of this type of Q&A interaction. Ask it a question now and it will often display the answer directly, before you ever click through to the individual sites in the search results.
The question for businesses, however – large or small, local or global – is how business intelligence systems can add in that functionality and tailor it specifically to the needs and aspirations of a particular company.
Companies are like individuals. Each one is different and each has its own business DNA. The challenge for suppliers is to understand the KPIs required and do the difficult bit of taking the data and turning it into a mechanism for running the business more efficiently.
Presenting the data in a digestible form is key. NetSuite, the cloud-based ERP system, has a dashboard function in which large volumes of data can be presented in a way that supports directors and managers in running and growing the business. As soon as the data changes, the dashboard changes.
And new products will emerge to expedite this process. Cloud Data Exchange, launching from Eureka Solutions this spring, will allow data to be pulled from multiple systems into one integrated dataset which can then be reported on.
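The internal workings of Cloud Data Exchange have not been published, so as a rough illustration only, here is the general pull-and-integrate pattern the article describes: records exported from two hypothetical systems (a CRM and a finance package, both invented for this sketch) are joined on a shared customer ID into one dataset that can then be reported on.

```python
import csv
import io

# Hypothetical exports from two separate systems, keyed on customer ID.
# These names and values are illustrative, not from any real product.
crm_export = """customer_id,name
C001,Acme Ltd
C002,Birch & Co
"""
finance_export = """customer_id,outstanding_balance
C001,1250.00
C002,0.00
"""

def load(text):
    """Parse a CSV export into a dict keyed by customer_id."""
    return {row["customer_id"]: row for row in csv.DictReader(io.StringIO(text))}

# Merge the two sources into one integrated dataset, ready for reporting.
crm, finance = load(crm_export), load(finance_export)
integrated = [{**crm[cid], **finance.get(cid, {})} for cid in crm]

for record in integrated:
    print(record)
```

In practice the exports would come from live system APIs rather than inline strings, but the shape of the job is the same: normalise each source to a common key, then combine.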
Some have suggested that 2017 will be the year of data analytics, but a more human element of that scenario is that it will also be the year of data engineers, who may find themselves in a starring role.
Data engineers are the ones who have to understand the underlying database, taking samples of the data and scaling them up into bigger numbers – all the while checking accuracy against the totals of the entire dataset.
This, of necessity, is an exercise in extrapolation which – to have worth – must continually be reapplied to the ongoing stream of data.
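The sample-and-extrapolate workflow above can be sketched in a few lines. All the figures here are invented for illustration: a random sample's mean is scaled up to a population total, then validated by "running back" to the true total of the full dataset, just as the article describes.

```python
import random

# Illustrative only: 100,000 fake order values standing in for a real table.
random.seed(42)
full_data = [random.uniform(10, 500) for _ in range(100_000)]

# Take a 1% random sample and extrapolate its mean up to the full population.
sample = random.sample(full_data, 1_000)
estimated_total = (sum(sample) / len(sample)) * len(full_data)

# The engineer's sanity check: compare against the real total and measure error.
actual_total = sum(full_data)
relative_error = abs(estimated_total - actual_total) / actual_total

print(f"estimated: {estimated_total:,.0f}")
print(f"actual:    {actual_total:,.0f}")
print(f"error:     {relative_error:.2%}")
```

The final check is the point: an extrapolation is only trustworthy while its error against the full totals stays small, which is why the exercise has to be repeated as new data streams in.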
IBM’s Watson, in contrast, is heading down the road of taking the human element out of the equation, allowing the machines themselves to direct travel by analysing the entire dataset rather than simply samples.
Recent key speeches on artificial intelligence have looked at how AI will take data, parse it and present useful, usable information. Larger businesses are already actively looking at this scenario.
But, viewed from 2017 and the current perspectives of suppliers and clients, that is probably quite a few years away. Our priority will remain understanding business requirements and accessing data for customers in the interests of efficiency and future planning.