Research Firm Advises Analytics Stakeholders and Security Professionals to Build Plans for Securing Hadoop-Based Assets Jan 9, 2017
Dataguise, a technology leader in secure business execution, announced inclusion in a report by Gartner titled, “Rethink and Extend Data Security Policies to Include Hadoop.” The report provides best practices for addressing data security concerns related to Apache Hadoop deployments and highlights several leading vendors in the category to support these endeavors.
According to the report, “The continuing growth of Hadoop as a platform for data analysis and, increasingly, for more operational data processing uses has created data security issues that are not being addressed. Unlike DBMSs, Hadoop software stacks have not had built-in security capabilities and, because they increase utilization of file system-based data that is not otherwise protected, new vulnerabilities can emerge that compromise carefully crafted data security regimes.”
“As a pioneer in the detection, protection, monitoring, and auditing of sensitive data in Hadoop environments, we are well aware of the issues organizations face by not addressing the exposure of Big Data,” said JT Sison, Vice President of Marketing and Business Development. “Dataguise DgSecure is unique in its breadth and covers a considerable range of concerns around the security and compliance of Apache Hadoop and other highly scalable data frameworks. We appreciate Gartner for mentioning us in the report. Gartner has again underscored the importance of broadening one’s perspective when it comes to Hadoop. As mentioned in the report, there are many threats to consider regardless of the data framework selected, so it will be necessary for organizations to orchestrate a Hadoop security stack. Considered by many a foundational building block in this venture, DgSecure is well situated in countless Hadoop installations. We invite CIOs, CDOs, security professionals, and others to visit with our team regarding new projects and those under consideration.”
Protecting dark data is an often-overlooked challenge because many organizations managing Hadoop infrastructure have not implemented measures or technologies to track what is being stored. By deploying a proper data retention architecture, including sensitive data detection, monitoring, and auditing solutions, enterprises can better understand the information they store and apply appropriate policies to ensure compliance. These policies should be designed to help the organization increase efficiency and significantly reduce security risk.
“Organizations adopting Hadoop as a data lake platform seek to put original format data directly into a repository where it can be used for ‘agility’ and ‘self-service’ with no understanding of what is actually stored there,” wrote Merv Adrian, Research Analyst for Gartner. “Unlike DBMSs, which are typically used to store known data that conforms to predetermined policies about quality, ownership and standards, Hadoop creates the possibility of presenting users with ‘dark data.’”