What’s the Big Hadoop-la?

On Point · 3 minute read · Apr 2nd, 2013

The big data space is as complex as it is large. The challenges of big data – volume, variety, and velocity – and the sheer number of vendors create a confusing dilemma for government agencies trying to select the platform that best meets mission needs. As in the cloud space, vendors use “big data” as a single term covering many different problems agencies may be experiencing.

Agencies seek insight: their analysts advance the mission by converting raw data into actionable information and knowledge. Huge and growing volumes of raw data must be integrated, then sliced and diced, to support data-driven decision making.

Fortunately, a new, simplified method is helping agencies zero in on the right choice in the crowded big data space, using a vendor-neutral approach to pick the most flexible technology for their needs. With a framework such as the following, agencies can pinpoint what value should be extracted from existing data, and which data will be most useful, by focusing on:

  1. Efficient Data Processing: Reducing the cost of storing, integrating and accessing data across an agency.
  2. Effective Information Management: Improving timeliness and quality of decision making with access to more comprehensive data.
  3. Expressive Analytics: Presenting data in a format that is easily digested and incorporated into decision making, whether by identifying new opportunities (predictive) or by surfacing challenges through pattern discovery (forensic) – see the sketch after this list.
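
To make the third point concrete, here is a minimal sketch of what forensic pattern discovery can look like in practice: a short Python routine that flags readings deviating sharply from the recent trend. The indicator values, window size, and threshold are illustrative assumptions, not drawn from any real agency data.

```python
# Minimal sketch of forensic pattern discovery: flag readings that
# deviate sharply from the recent trend. All data and thresholds here
# are illustrative assumptions, not values from any real agency.
from statistics import mean, stdev

def flag_anomalies(readings, window=6, threshold=2.0):
    """Return (index, value) pairs sitting more than `threshold`
    standard deviations away from the mean of the preceding window."""
    anomalies = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) > threshold * sigma:
            anomalies.append((i, readings[i]))
    return anomalies

# Hypothetical monthly indicator values with one obvious outlier.
monthly_index = [101.2, 101.5, 101.9, 102.1, 102.4, 102.6, 102.9, 118.3, 103.4]
print(flag_anomalies(monthly_index))  # -> [(7, 118.3)]
```

The design point is that the analytics logic, not the storage engine, decides what counts as unusual; the same test could run over data sliced from any platform an agency selects.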

A major financial agency, as one example, is experiencing an influx of data: in the wake of the economic crisis, its mission now requires processing an expanding mass of petabytes drawn from a host of new sources, including hedge funds and mortgage institutions. To analyze economic indicators and forecast sensitive trends, the agency is searching for new data management technology that can sift through it all. Unfortunately, the agency’s current architecture does not support the slicing and dicing of data its analysts need now. A mission-driven approach that focuses on the analytics and the data will carry the agency past the analysis paralysis this complex, rapidly changing landscape can cause, enabling mission agility and avoiding costly missteps.

An agency’s specific mission and its data challenges should be the main drivers in selecting the most efficient and effective data analytics platform; technology should not be the driver. So, what is the big Hadoop-la? Hadoop – as powerful and useful a data processing platform as it may be – does not guarantee the analytics solution agencies need.

Software like Hadoop – with multiple distributions such as Cloudera, MapR, Microsoft HDInsight, Hortonworks, and a host of others – constantly changes the landscape of technologies from which to select. Many solutions exist, but no “one size” fits all.
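
For readers unfamiliar with what Hadoop actually does under the hood, the following is a minimal, self-contained Python sketch of the map/shuffle/reduce pattern at its core. A real Hadoop job distributes these phases across a cluster; here they run in-process purely for illustration, and the sample records are invented.

```python
# Minimal, self-contained sketch of the map/shuffle/reduce pattern at
# the heart of Hadoop. A real cluster distributes these phases across
# many machines; here they run in one process purely for illustration.
# The sample records are invented, not taken from any agency system.
from collections import defaultdict

records = [
    "hedge_fund,filing", "mortgage,origination", "hedge_fund,filing",
    "mortgage,default", "hedge_fund,redemption", "mortgage,origination",
]

# Map: emit a (key, 1) pair for each record's source category.
mapped = [(rec.split(",")[0], 1) for rec in records]

# Shuffle: group values by key (Hadoop does this between map and reduce).
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: aggregate each group, yielding record counts per source.
counts = {key: sum(values) for key, values in groups.items()}
print(counts)  # -> {'hedge_fund': 3, 'mortgage': 3}
```

The takeaway for agencies: Hadoop supplies the engine for this kind of large-scale processing, but the mission value still comes from the analytics questions asked of it.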

Implementing a data analytics reference architecture within an agency’s existing infrastructure, together with a structured data analytics methodology, will yield a comprehensive analytics platform that improves mission agility, increases efficiency and decreases costs.

Tags: Big data