Let the Machines Do the Work

Disruptive IT Trends · 3 minute read · Jan 14th, 2014

In the world of big data today, much emphasis is placed on handling data according to the four V's (volume, variety, velocity and variability) and their effect on data management. However, established enterprises are less enamored with the data management side of things; many have had well-established business intelligence and data management practices in place for years. What is driving enterprises toward big data analytics is the promise of better business insights and the need to become more data-driven in their decision making. Enter the resurgence of data mining and machine learning.

Data mining, the ability to identify and extract patterns from large volumes of data, is nothing new. What the latest big data trends have changed are:

  • The underlying platforms to perform this task at scale and at lower cost;
  • The shift from storing only data you have deemed useful to a default position that ALL data is useful;
  • And the recognition that all this new computing power doesn't do much good without skilled data scientists to extract business insights from it.

What is most attractive to enterprises looking to use big data to their advantage are the new products, markets, and customers that can be identified and targeted by a thorough application of data science. What is least known is that before any “science” can be performed, the basics of data quality must be addressed. Raw, unprocessed data is rarely useful in itself. Much more refinement must be performed before useful insights may be extracted.

At Unisys, we have been applying analytics to large volumes of data for years. Often our reporting requirements for various end-user support services have required a manual effort to classify and code records for further analysis upstream. This data quality effort was large and entirely manual. It allowed for more advanced analytics downstream, but the process was not scalable and tied up key resources with domain knowledge for weeks at a time. Our data science team worked together with our domain experts to remedy this issue, first by understanding the domain problem and then by understanding the various data sources being combined. Data scientists were then able to automate the once manual coding and classification process by applying machine learning techniques. This let the machines do the work and dramatically reduced the manual effort, freeing up resources with valuable domain expertise.
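To make the idea concrete, here is a highly simplified sketch of how a manual coding-and-classification step can be automated with a learned text classifier. The categories, training examples, tokenizer, and the choice of a plain Naive Bayes model are all illustrative assumptions for this post, not a description of the actual Unisys system:

```python
import math
from collections import Counter, defaultdict

# Hypothetical training data: free-text support-ticket descriptions paired
# with the category codes a domain expert previously assigned by hand.
TRAINING = [
    ("password reset request for email account", "access"),
    ("user locked out of account after failed logins", "access"),
    ("laptop will not boot blue screen on startup", "hardware"),
    ("printer offline and not responding", "hardware"),
    ("outlook crashes when opening attachments", "software"),
    ("spreadsheet macro throws runtime error", "software"),
]

def tokenize(text):
    """Crude tokenizer: lowercase and split on whitespace."""
    return text.lower().split()

def train(examples):
    """Count word frequencies per category (multinomial Naive Bayes)."""
    word_counts = defaultdict(Counter)
    category_counts = Counter()
    vocab = set()
    for text, category in examples:
        category_counts[category] += 1
        for word in tokenize(text):
            word_counts[category][word] += 1
            vocab.add(word)
    return word_counts, category_counts, vocab

def classify(text, word_counts, category_counts, vocab):
    """Return the category with the highest log-probability,
    using add-one smoothing for words unseen in a category."""
    total = sum(category_counts.values())
    best, best_score = None, float("-inf")
    for category, count in category_counts.items():
        score = math.log(count / total)
        denom = sum(word_counts[category].values()) + len(vocab)
        for word in tokenize(text):
            score += math.log((word_counts[category][word] + 1) / denom)
        if score > best_score:
            best, best_score = category, score
    return best

model = train(TRAINING)
print(classify("user cannot log in to account", *model))  # → access
print(classify("printer will not start", *model))         # → hardware
```

In practice a production pipeline would use far richer features, more training data, and a validation loop with the domain experts, but even this toy version shows the shift: the experts label a sample once, and the machine applies those labels to the rest.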

This is an example of how we at Unisys are letting the machines do the work, enabling both downstream analytics and increased customer responsiveness. To support the needs of our customers in innovating the way they work with data, we have developed our Big Data Analytics as a Service (BDAaaS) offering. BDAaaS brings together the platforms to handle complex analytics over large data volumes and the data scientists to extract business insights from them, all in a consumption-based model. These services allow our clients to get quick access to the advanced analytics they desire, now, without first having to figure out the shifting landscape of the big data industry, hire the right people, and build a scalable analytics environment.

Tags: Analytics, Big data, Smart computing