Wrapping your Head Around Hadoop Jun 12, 2015
Just about everybody agrees that Hadoop is one of those transformational technologies that comes along about twice a decade to change the way everyone thinks about enterprise IT. The challenge facing solution providers and their customers now is figuring out exactly what to do with it.
Gartner, for example, recently published a report that goes so far as to suggest that Hadoop adoption is slowing as IT organizations grapple with how to apply Hadoop in production environments. A big reason for that, suggests a separate survey published this week by Software AG, is the simple fact that most organizations don’t have business processes in place that can absorb massive amounts of data in real time.
At the core of the debate is how Hadoop will be applied today vs. tomorrow. As an inexpensive mechanism for storing massive amounts of data, Hadoop has no equal. The question then becomes whether Hadoop is merely an extension of existing data warehouse investments used to access a “Big Data lake” of information, or whether Hadoop will become the actual data warehouse itself. Most IT organizations spend a fortune on data warehouses and would surely love to eliminate that expense. But the odds of actually doing that at scale anytime soon are, of course, near zero given the relative immaturity of the Hadoop platform.
As is often the case, there is also a third perspective that solution providers need to consider. MapR Technologies, a provider of a distribution of Hadoop, made the case this week at the Hadoop Summit 2015 conference that it will be only a matter of time before transaction processing and analytics converge on top of Hadoop. By taking advantage of in-memory computing, the whole notion of how operational analytics are delivered inside the enterprise will be turned on its head, said Jack Norris, chief marketing officer at MapR. To back up that effort, MapR also announced that its technology partner alliance program now includes Centrify, Dataguise, Datameer, HP Security Voltage, Informatica, Protegrity, Syncsort, Talend, Teradata, Waterline Data and Zaloni.
As is often the case with any emerging technology, all three of these visions of how Hadoop will be used inside the enterprise will no doubt come to pass at different points in the future. The challenge facing solution providers that want to play in the big data space is mapping out a big data journey that is plausible enough today for organizations to want to make investments that will result in better business decisions tomorrow.